> But by the way, if you read the issues from a while back
> you'll see that AlexeyAB's fork basically scooped them,
> hence the version bump.
Yeah that sucks, but it does mean they should have done some proper comparison with YOLOv4.
> This took a while, probably because there is actually very
> little documentation for Yolov3 and there was confusion
> over what the loss function actually ought to be. The
> darknet repo is totally uncommented C with lots of single
> letter variable names. AlexeyAB is a Saint.
Maybe I'm alone, but I found it quite readable. You can reasonably work through the source in a day.
> The v4 release was also quite contentious.
Kind of; I'm personally still evaluating this network.
> I disagree on your second point though. Demanding a paper
> when the author says "we will later" is hardly a blow off.
Check out the translation of "you can you up, no can no bb" (roughly: "if you can do it, do it yourself; if you can't, don't criticize"; see other comments).
> And before we knock Glenn for this, as far as I know, he's
> running a business, not a research group.
I understand, but it seems very unethical to take the name of an open-source framework and network that publishes its improvements in some form, bump the version number, and then claim it's faster without actually doing an apples-to-apples test. It would have seemed appropriate to contact the person who carried the torch after pjreddie stepped down from the project.
On the whole I agree about darknet being readable, it seemed well written and I've found it useful to grok how training libraries are written. I think they've moved to other backends now for the main computation though.
But... it was still very much undocumented (and there were details missing from the paper). I think this almost certainly slowed down porting to other frameworks. And the fact that it's written in C has probably limited how many people are willing to contribute to the project.
> Checkout the translation of "you can you up,no can no bb" (see other comments).
That's from an 11 day old github account with no history, not Ultralytics as far as I know.
> Kind of, I am personally still evaluating this network fully.
The contention refers to the community response rather than the performance of the model itself.