VideoHelp Forum
  1. Member (Budapest, joined Jan 2012)
    Feel free to write comments!


    [Attached image: unnamed.png]
    Last edited by Stears555; 18th Nov 2015 at 07:54.
  2. Member (Budapest, joined Jan 2012)
    As I said, HEVC was not a breakthrough.
  3. I read through the pdf until my head started to hurt, but I think the gist of it was they tested a bunch of encoders using two different PCs (desktop and server) and they ran three different tests on each, supposedly comparing the bitrate each encoder required to achieve a fixed quality.
    One test had no speed restrictions, the second required a minimum encoding speed (although different for desktop and server) and the third required an even higher minimum speed, so in the case of x265 (and x264) the encoder's speed preset needed to be changed accordingly.
    It appears x265 was considered the best (with respect to bitrate) when there was no minimum speed requirement, still the best when the minimum speed requirement was fairly low, but dropped to 3rd or 4th place when the minimum speed requirement was higher. It was interesting to see x264 used as a reference point; it wasn't doing too badly, not all that far behind the better x265 encoders and better than some.

    One thing I'm still not clear on is whether 2-pass encoding was used for every codec, and when it was (as was the case for x264 and x265), whether the minimum speed requirement included the first pass or applied only to the second pass.

    I'm not familiar with many of the encoders in the test, but I'm assuming this was a comparison of CPU based encoders only?
    Last edited by hello_hello; 18th Nov 2015 at 08:24. Reason: spelling
  4. Member (Budapest, joined Jan 2012)
    Originally Posted by hello_hello View Post
    I read through the pdf until my head started to hurt, but I think the gist of it was they tested a bunch of encoders using two different PCs (desktop and server) and they ran three different tests on each, supposedly comparing the bitrate each encoder required to achieve a fixed quality.
    One test had no speed restrictions, the second required a minimum encoding speed (although different for desktop and server) and the third required an even higher minimum speed, so in the case of x265 (and x264) the encoder's speed preset needed to be changed accordingly.
    It appears x265 was considered the best (with respect to bitrate) when there was no minimum speed requirement, still the best when the minimum speed requirement was fairly low, but dropped to 3rd or 4th place when the minimum speed requirement was higher. It was interesting to see x264 used as a reference point; it wasn't doing too badly, not all that far behind the better x265 encoders and better than some.

    One thing I'm still not clear on is whether 2-pass encoding was used for every codec, and when it was (as was the case for x264 and x265), whether the minimum speed requirement included the first pass or covered both passes combined.

    And I'm not familiar with many of the encoders in the test, but I'm assuming this was a comparison of CPU based encoders only?
    Yes, these were CPU codec tests. Imagine: the GPU and other HW encoders have even worse quality...
  5. I'm still trying to get my head around how they tested. My brain expects a comparison of encoder quality at a fixed bitrate, but the test compares the amount of compression achieved for a fixed quality. I understand it can be done either way, but I'm not sure how they determined the bitrate to use for a particular encoder (it seems to be a case of running encodes at increasing bitrates until the target quality was reached), and I'm not sure why it'd necessitate 2-pass encoding, at least for x264 and x265, which is why I wondered whether the minimum encoding speed requirements only applied to the second pass. It doesn't seem as though HEVC is setting the world on fire yet, though.

    According to the pdf, the "desktop fast encoding" test had a minimum speed requirement of 30fps, while for the "server fast encoding" test it was 60fps. Each time the medium preset was used for the x264 encoder, while for x265 the ultrafast preset was used for the desktop test and the superfast preset was used for the server test. Logically I'd have thought the x265 speed presets would be the other way around (superfast for the desktop test and ultrafast for the server test), and using the same preset each time for x264 doesn't make complete sense to me either. The desktop and server PCs had completely different CPUs, so maybe that explains it, but it doesn't seem logical on the face of it, especially as the "desktop fast encoding" test was the only one where x264 beat x265.
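    For anyone trying to picture the setup, here's a minimal sketch of what 2-pass command lines with those presets might look like for the standalone CLI encoders. The bitrate and file names are just placeholders; the pdf doesn't list the exact values MSU converged on.
    Code:
    # x264, medium preset, 2-pass (bitrate is a placeholder value)
    x264 --preset medium --pass 1 --bitrate 4000 -o pass1.264 input.y4m
    x264 --preset medium --pass 2 --bitrate 4000 -o out_x264.264 input.y4m

    # x265, ultrafast preset (the "desktop fast encoding" setting), 2-pass
    x265 --input input.y4m --preset ultrafast --pass 1 --bitrate 4000 --output pass1.hevc
    x265 --input input.y4m --preset ultrafast --pass 2 --bitrate 4000 --output out_x265.hevc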
  6. Originally Posted by Stears555 View Post

    Yes, these were CPU codec tests. Imagine: the GPU and other HW encoders have even worse quality...
    Well... an Intel GPGPU encoder was also tested (although this isn't clear; I'm not sure if it's a real GPGPU encoder or a QSV encoder).
    From my perspective I saw some inconsistencies, same as in previous years: using PSNR or SSIM tuning when other encoders have no such function, or don't have it enabled, doesn't seem like a fortunate idea... I'd be glad to know the reasoning behind that approach.
  7. Member (Budapest, joined Jan 2012)
    Originally Posted by pandy View Post
    Originally Posted by Stears555 View Post

    Yes, these were CPU codec tests. Imagine: the GPU and other HW encoders have even worse quality...
    Well... an Intel GPGPU encoder was also tested (although this isn't clear; I'm not sure if it's a real GPGPU encoder or a QSV encoder).
    From my perspective I saw some inconsistencies, same as in previous years: using PSNR or SSIM tuning when other encoders have no such function, or don't have it enabled, doesn't seem like a fortunate idea... I'd be glad to know the reasoning behind that approach.
    MSU has its own SSIM and PSNR software, so it doesn't matter which codecs/encoders have their own built-in measurement tools.
  8. Originally Posted by Stears555 View Post

    MSU has its own SSIM and PSNR software, so it doesn't matter which codecs/encoders have their own built-in measurement tools.
    And?

    My previous message refers to the psychovisual tuning (or rather the lack of it) used when producing PSNR/SSIM values... it's not about the internal numbers reported by the encoder but about the way the video is encoded.
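    As an aside, the metrics don't have to come from the encoder at all; the decoded encode can be compared against the source with an external tool. A minimal sketch using ffmpeg's ssim and psnr filters (file names are placeholders; MSU uses its own measurement software rather than ffmpeg):
    Code:
    # Average and per-frame SSIM of the encode against the reference source
    ffmpeg -i encoded.mkv -i source.y4m -lavfi ssim -f null -
    # Same idea for PSNR
    ffmpeg -i encoded.mkv -i source.y4m -lavfi psnr -f null -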
  9. KarMa, Dinosaur Supervisor (US, joined Jul 2015)
    I should point out that they used x265 1.5, and we're already on 1.8, which brought serious speed improvements.

    Originally Posted by Stears555 View Post
    As I said, HEVC was not a breakthrough.
    Unlike the phantom V-NOVA/PERSEUS that you keep cheerleading?

    Originally Posted by Stears555 View Post
    Yes, these were CPU codec tests. Imagine: the GPU and other HW encoders have even worse quality...
    Unlike H.264?
  10. Originally Posted by pandy View Post
    same as in previous years: using PSNR or SSIM tuning when other encoders have no such function, or don't have it enabled, doesn't seem like a fortunate idea... I'd be glad to know the reasoning behind that approach.
    I hadn't noticed they used the SSIM tuning for x264 but not x265. Maybe the latter doesn't have such a tuning but it does make me wonder how the two would have compared with neither of them using it.
  11. KarMa, Dinosaur Supervisor (US, joined Jul 2015)
    Originally Posted by hello_hello View Post
    I hadn't noticed they used the SSIM tuning for x264 but not x265. Maybe the latter doesn't have such a tuning but it does make me wonder how the two would have compared with neither of them using it.
    x265 does have SSIM and PSNR tuning.
  12. Member ricardouk (Portugal, joined Mar 2005)
    Originally Posted by Ittiam blog
    ....Also, in the universal and fast transcode categories, the encoder was configured to run single pass CBR (as compared to 2-pass VBR being used for x264).
    http://www.ittiam.com/blog/reflections-on-the-first-msu-hevc-encoder-comparison-study-report/
    I love it when a plan comes together!
  13. Member (Kazakhstan, joined Jan 2014)
    The quality of 4K will be much higher in x265.
  14. Hi Stears555,

    Can you please tell me how I would get H.264 comparable to H.265 in terms of quality/size? I am using Handbrake. Usually I pick Q23 for H.265 and Q20 for H.264, but H.265 always comes out much smaller in file size than the above graph suggests, so am I overcompensating? How should I change my settings?

    Any help would be much appreciated!
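    For reference, here's roughly what the settings described above look like on the command line with HandBrakeCLI (just a sketch; the input file name is a placeholder and -q is the constant-quality RF value):
    Code:
    # H.264 at RF 20
    HandBrakeCLI -i input.mkv -o out_h264.mp4 -e x264 -q 20
    # H.265 at RF 23
    HandBrakeCLI -i input.mkv -o out_h265.mp4 -e x265 -q 23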
  15. Originally Posted by Gravitator View Post
    The quality of 4K will be much higher in x265.
    Why?
  16. Member x265 (Sunnyvale, CA, joined Aug 2013)
    Originally Posted by KarMa View Post
    Originally Posted by hello_hello View Post
    I hadn't noticed they used the SSIM tuning for x264 but not x265. Maybe the latter doesn't have such a tuning but it does make me wonder how the two would have compared with neither of them using it.
    x265 does have SSIM and PSNR tuning.
    Yeah, this was a mistake. We didn't realize that quality would be judged solely on Y-SSIM. We would have compared better if we had asked MSU to use --tune ssim.
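    For anyone curious, selecting that tuning is a single switch on either encoder. A minimal sketch (bitrate and file names are placeholders); as I understand it, --tune ssim also turns off the psychovisual optimisations pandy mentioned above:
    Code:
    # x264 with SSIM tuning
    x264 --preset medium --tune ssim --bitrate 4000 -o out_x264.264 input.y4m
    # x265 with SSIM tuning
    x265 --input input.y4m --preset medium --tune ssim --bitrate 4000 --output out_x265.hevc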
  17. Originally Posted by hello_hello View Post
    Originally Posted by Gravitator View Post
    The quality of 4K will be much higher in x265.
    Why?
    Because H.264 is limited in most HW decoders to HP@L4.2...
  18. Originally Posted by hello_hello View Post
    Originally Posted by Gravitator View Post
    The quality of 4K will be much higher in x265.
    Why?
    Mostly because of the larger block size (CTU). Even in early HEVC implementation tests, AVC didn't come close in either objective metrics or subjective analysis at UHD resolutions.
  19. But L4.2 doesn't include UHD, does it?
    Hardware decoder limitations aside, are we referring to higher quality at a given bitrate, or to HEVC being capable of better quality encoding?
  20. Originally Posted by hello_hello View Post
    But L4.2 doesn't include UHD, does it?
    Hardware decoder limitations aside, are we referring to higher quality at a given bitrate, or to HEVC being capable of better quality encoding?
    Indeed, that's why everyone uses H.265 for UHD. Of course, my answer to your question is different from poisondeathray's: H.265 refined many elements known from H.264 that provide additional gain for UHD. CTU size is one of the biggest, but in general it's easier to find similarities at UHD and reuse them, and the end result is a reduced bitrate...
  21. So many things wrong with this test:

    A) They used Haswell-based CPUs and then tested the Intel MSS HEVC encoder in software and hybrid SW/HW mode. This gives a warped view of what the Intel HEVC encoder is really capable of; they should have tested with a Skylake, as it's the first Intel CPU to support full HW HEVC encoding, and the speed and quality tests would have had different results.

    B) They used incorrect parameters for VP9. As noted by the VP9 developers, MSU used the parameter --cpu-used=1, when a value of 0 would have improved compression by about 10% (see the vpxenc sketch at the end of this post). Furthermore, VP9 has some advanced features, such as Golden I Frames, which allow the encoder to encode higher quality I frames that are then used as references for other frames; likewise VP9 has a feature called spatial resampling, which allows the encoder to encode a lower resolution version of a frame and then upscale it during playback, often resulting in higher overall quality. I don't believe this option was enabled for VP9 in this test.

    C) The test sequences are way too short; some are less than 400 frames long. x264 defaults to a 250-frame GOP, and the developers specify a GOP length of 500 frames for the MSU test. That leaves, in some cases, only 1 or 2 I frames for the entire sequence, which can and does unfairly impact quality measurements.

    D) The "Ripping" test features no minimum speed requirement, meaning that they were free to use "placebo" for both x264 and x265, settings that realistically are not an option for normal users, especially in the case of x265 encoding as it would take too long.

    E) Lastly, it unfairly biases the test when you normalize to one codec, in this case x264. The test results should not be relative to one of the competitors, i.e. one competitor at 100% and everyone else as a percentage of it. A better test would be to measure global and average SSIM and global and average PSNR for all encoders at the same bitrates, e.g. 2000 kbps, 4000 kbps and so on.

    I consider this test an overall FAIL.
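    Regarding point B, here's a rough sketch of the kind of vpxenc invocation in question (the source file and bitrate are placeholders; the only change from what MSU reportedly used is --cpu-used):
    Code:
    # 2-pass VP9 encode using the slower, higher quality speed setting
    vpxenc --codec=vp9 --good --cpu-used=0 --passes=2 --target-bitrate=4000 -o out_vp9.webm input.y4m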
  22. lordsmurf, Video Restorer (dFAQ.us/lordsmurf, joined Jun 2003)
    I've never been impressed by MSU's testing.
    They have too many testing flaws, or too much bias, or both.

    In fact, lots of their stuff is unimpressive. Lots of hoopla that fizzles out quickly.
    Want my help? Ask here! (not via PM!)
    FAQs: Best Blank Discs · Best TBCs · Best VCRs for capture · Restore VHS


