
Comments (7)

alvestrand commented on May 5, 2024

I kind of expected this, because it always arises when discussing codecs. The settings were taken from previous suggestions by x264 proponents.

Please add the contacts you wish to be consulted.

from compare-codecs.

PaulWilkins commented on May 5, 2024

I agree, kierank: all settings need to be validated and subject to adjustment and re-run. There are also problems with the VP9 settings, though.

For VC / real time, any use of frame-level parallelism should not be allowed. Threading is fine provided it is within a frame and not across multiple frames. Also, in all the tests for a given condition, the command-line parameters need to be consistent for all the clips and all points on a graph. This is not the case at the moment.

I notice some cases where two points in the same graph have different command lines and this needs to be addressed.

As things stand, I think this can only be seen as a placeholder for future work and comment, and I don't think the results (as yet) are that meaningful.
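For concreteness, a real-time VP9 command line consistent with those constraints might look like the sketch below. The flag names are from libvpx's vpxenc, but the specific values are placeholders, not proposed settings:

```shell
# Hypothetical real-time VP9 settings: threading stays within a frame
# (tile columns), frame-parallel mode is off, there is no lookahead lag,
# and CBR is used to respect buffering constraints. Values illustrative.
vpxenc --codec=vp9 --rt --cpu-used=5 \
  --threads=4 --tile-columns=2 --frame-parallel=0 \
  --lag-in-frames=0 --end-usage=cbr --target-bitrate=500 \
  -o out.webm input.y4m
```

The point is less the particular values than that one such line, agreed in advance, would be held fixed across all clips and all points on a graph.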

alvestrand commented on May 5, 2024

The current numbers and graphs are aimed at "what are the best numbers that can be achieved by this codec under these test conditions".
It makes sense to produce other graphs that ask "what numbers result from holding a certain configuration largely constant across multiple tests", but that's not what the current graphs are looking for.

PaulWilkins commented on May 5, 2024

I certainly think it makes sense to say if a codec could not achieve the test criteria (i.e. for a given point it was too slow, or it did not comply with the required buffering constraints), but otherwise I think point-to-point tweaks of the basic parameters are problematic.

For example, changes to the real-time cpu-speed setting in VP9 will only be valid for the particular machine you ran on, on a particular day. A few days later, with a new optimization patch, or run on a slightly faster or slower machine, and the results will change RADICALLY. Also, I noticed a couple of cases where more basic differences were evident, such as one point being VBR and the other CBR.

In the x264 case, for example, I noticed some real-time points using lookahead 30 and others in the same graph using lookahead 60. I am not sure of the exact impact of this, other than that lookahead is not sensible for live / VC (though fine for a near-live stream).

I would prefer to see an agreed command line per test case for each codec, and then the best-case results (with failure points highlighted if need be for real time, but in that case the detailed hardware specification of the machine used to test needs to be given).

In any real-world case (like YouTube) you have to come up with a set of settings to use. You can't change them on a point-by-point and clip-by-clip basis.

If we have a command line for a particular use case, interested parties can comment or suggest improvements, and then hopefully we reach consensus. If every point on every clip is hand-tuned, I think this will be very hard to achieve.
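The "agreed command line" idea lends itself to a mechanical check. A minimal sketch (the flag names, the simple `--flag value` / `--flag=value` parse rules, and the choice of which flag is exempt are all illustrative, not the project's actual conventions): take the command line recorded for each point on a graph and report any flag, other than the rate target, whose value varies between points.

```python
def inconsistent_flags(cmdlines, allowed=("--target-bitrate",)):
    """Return the set of flags whose values differ across points,
    ignoring flags in `allowed` (e.g. the rate target itself)."""
    def parse(cmd):
        toks = cmd.split()
        flags = {}
        i = 1  # skip the encoder binary name
        while i < len(toks):
            if toks[i].startswith("--"):
                if "=" in toks[i]:                    # --flag=value form
                    key, val = toks[i].split("=", 1)
                    flags[key] = val
                    i += 1
                else:                                  # --flag value form
                    flags[toks[i]] = toks[i + 1] if i + 1 < len(toks) else ""
                    i += 2
            else:
                i += 1
        return flags

    parsed = [parse(c) for c in cmdlines]
    keys = set().union(*(p.keys() for p in parsed))
    return {k for k in keys
            if k not in allowed
            and len({p.get(k) for p in parsed}) > 1}

# Two hypothetical points from the same graph: only the rate target
# should differ, so the lookahead mismatch is flagged.
points = [
    "x264 --rc-lookahead 30 --target-bitrate 500 --threads 4",
    "x264 --rc-lookahead 60 --target-bitrate 1000 --threads 4",
]
print(inconsistent_flags(points))
```

Running such a check over every graph before publishing would catch exactly the lookahead and VBR/CBR discrepancies noted above.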

I also think we should extend the test set, and perhaps have a different test set for the real-time VC case. Just my $0.02 worth.

alvestrand commented on May 5, 2024

I have now implemented (not yet committed) an "information" sub-page that will give you one graph for each configuration used in the overall comparison, plus some visualization help (the option to graph by score or CPU time used rather than by PSNR) that allows the viewer to understand why a particular configuration was considered "best" at a given rate target. This might help.

alvestrand commented on May 5, 2024

I have now added a single-configuration graph set as the first result in the lists - PR #26.
Will close this bug once that PR lands.

alvestrand commented on May 5, 2024

PR landed. Closing bug.
