VideoHelp Forum




Page 2 of 3
Results 31 to 60 of 78
  1. Originally Posted by sophisticles View Post

    It does speak volumes, it says that x264 and the clown show behind their development don't amount to a pimple on the ass of the major commercial offerings.

    DivX bought MainConcept for 22 million bucks, plus another 6 million for reaching certain goals, about 13 years ago. Rovi then sold both DivX and MC for 75 million. I can't find how much Silicon Philosophies, the people that own CCE, are worth as a company, but I think they have to be generating a pretty decent buck considering how many major broadcasters use their encoder. Ateme has a market cap of just under 130 million dollars. These guys have no desire to compare their software to a free alternative; it's kind of like how Ford, Dodge and Toyota will compare their vehicles to Ferrari, Porsche and Maserati, but the opposite is not the case.

    Those companies have many codecs and plugins, and some have full software GUI offerings in their portfolio, along with support and sales. i.e. it's not only an "AVC encoder" driving the sales. e.g. MainConcept licenses DV, MPEG2, HDV, XDCAM variants, and WMV along with AVC to software NLEs like Adobe, Sony, etc. Some of those companies sell hardware along with software, like Ateme. CCE-HD is a full BD encoding suite with segment encoding. In contrast, x264 is only an AVC encoder, with no GUI.

    And since the topic is h264 encoders - x264 was (still is?) widely viewed as the top AVC encoder, quality-wise. Video professionals know this. It's still the reference benchmark used for testing new codecs like HEVC and AV1. It's comparing against a metric, not necessarily against another encoder. If you score high dB PSNR, you're "king" in areas like broadcast, quality-wise - they still use PSNR widely. But "quality" is not the primary concern in broadcast, as mentioned earlier. BD has other priorities, such as segment encoding, which x264 does not offer. For the car analogy: if you can claim the fastest Nurburgring time, you can bet a company like Porsche will scream it from the rooftops - like they have - until they lose it. Similarly, if an encoder can claim the highest PSNR (or whatever metric) scores, you bet they will claim it.


    I have said that it has a reputation for being the best that is unjustified
    How is that reputation "unjustified" if it was at the top (or very near the top) in every comparison, every category, back when it was widely used and tested? Ateme's offerings were tested too, and MainConcept's. Hundreds of published tests, thousands of unpublished tests, dozens of different types of videos and genres - all showing the same trend. A mountain of solid evidence. Yet people are supposed to believe your "feelings"? Because you have no evidence to support your claim?

    In fact, I have repeatedly said that all encoders are pretty much the same so long as someone doesn't bit rate starve their encodes.
    And you've repeatedly been proven wrong. They are only the "same" to you. You can't see the difference between encodes where one has 2x the bitrate, or a YouTube encode - that's pretty bad, you must admit. Some encoders "starve" at much higher bitrates. That's the point of compression. If some encoder needs 1.5x more bitrate to reach a certain dB PSNR target, that encoder is clearly worse. Verify with other metrics too, like SSIM, VMAF, PSNR-HVS, etc. - pick your poison. Check with someone else's eyes too (subjective testing).
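    For reference, the "dB PSNR" figure being argued over is nothing mysterious: it's just a log-scaled mean squared error between source and encode. A minimal Python sketch (the sample values below are made up for illustration, not taken from any encoder test):

```python
import math

def psnr(ref, enc, peak=255):
    """PSNR in dB between two equal-length lists of 8-bit sample values."""
    mse = sum((a - b) ** 2 for a, b in zip(ref, enc)) / len(ref)
    if mse == 0:
        return float("inf")  # identical signals: no error at all
    return 10 * math.log10(peak ** 2 / mse)

# An encoder that needs 1.5x the bitrate to reach the same PSNR is simply
# leaving more squared error behind at any given rate.
print(psnr([16, 128, 235, 64], [16, 128, 235, 64]))  # identical -> inf
print(psnr([16, 128, 235, 64], [18, 126, 233, 66]))  # small deviations -> high dB
```

    Real comparisons run this per-frame over full video planes (and add SSIM/VMAF), but the principle is the same.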
  2. Video Restorer lordsmurf's Avatar
    Join Date
    Jun 2003
    Location
    dFAQ.us/lordsmurf
    Originally Posted by Cornucopia View Post
    It was superior in quality. It was never superior in speed unless you threw quality out the window.

    Soph actually did acquiesce and summed it up: for all low/no-cost, and all low-mid bitrate, and all varied-use and software-based encodes, x264 and x265 are the best in their class. I think that's usually enough for most users here.

    Scott
    I think the main difference is this:

    - MainConcept (and other costly H.264 encoders) are best with quality from master sources.
    - x264 is best for re-encoding smaller, especially at tiny bitrates.

    Most people here want to re-encode DVDs/BDs, downloads (torrents, YouTube, whatever), or even post-broadcast quality. But that's all pretty low-end compared to mastering sources. x264 doesn't even fare as well with lossless VHS encodes, and it often goes too soft and mushy without a high bitrate (much like TMPGEnc Plus was with MPEG encoding).

    x264 isn't bad, but the homebrew community puts it up on an undeserved pedestal.

    I like x264, especially using Hybrid. But I'll equally use MainConcept when needed.
    Want my help? Ask here! (not via PM!)
    FAQs: Best Blank Discs | Best TBCs | Best VCRs for capture | Restore VHS
  3. Originally Posted by lordsmurf View Post
    - MainConcept (and other costly H.264 encoders) are best with quality from master sources.
    Not what the tests show. Many of the tests are done with mastering sources: graded, ProRes, pre-BD.

    The higher the quality of the source, the less fine detail MainConcept or most other AVC encoders retain. x264 is known for its grain and detail retention, while other AVC encoders typically produce soft, smooth and mushy results, especially MainConcept. MainConcept will even drop complete objects, like rocks, buttons - this was demonstrated in a recent thread on a Netflix master source. At any bitrate range, the RD curves show x264 has the advantage.

    (To be fair, the newest MainConcept version has shown some improvements; it's not quite as blurry. But in the old tests, 8-12 years ago, the difference was massive: ~1.5x-2x the bitrate required.)

    - x264 is best for re-encoding smaller, especially at tiny bitrates.
    Yes, it excels at low bitrates. The delta between x264 and most other AVC encoders is even larger there. But the RD curves in almost every test show it retains the advantage at mid to high bitrates. It doesn't suddenly get worse or drop off.

    x264 doesn't even fare as well from lossless VHS encodes, and it often goes too soft and mushy without a high bitrate (much like TMPGEnc Plus was with MPEG encoding).
    Neither do the other encoders. They perform even worse than x264. It's essentially the same concept as grain and detail retention: x264 retains noise and grain better, because it keeps higher-frequency details better. (You can adjust it to drop details and blur like other encoders, by tuning it for PSNR.)

    High bitrates for a noisy source are really a relatively "low bitrate range" for a difficult-to-compress source, assuming you want to retain the details and noise (ie. stay similar to the source). x264 doesn't suddenly lose its advantage; it's actually at the sweet spot for low bitrates relative to content complexity.
  4. I told myself I was not going to be drawn into this, the OP's question was clearly a trolling attempt, flame bait designed to start a discussion such as this one, and I fell for it like a dumb ass.

    But, once you're in the dance, you may as well boogie down, so, I accept PDR's challenge.

    I decided to do a couple of test encodes with sources that are usually not used, in this case Black Magic's ProRes samples that can be found here:

    https://www.blackmagicdesign.com/products/blackmagicpocketcinemacamera/workflow

    One thing to note: because I'm working on Ubuntu, and ProRes and Intel's QS are a bitch to work with on Linux, I had to use a rather convoluted workflow for this test.

    Notes:

    The test sequence I used was Food Blogger Mov, which is 3840x2160 ProRes that needs to be cropped to 3840x1600. I wanted to use AviDemux for a variety of reasons, but this software does not play nice with ProRes: it "pretends" to open it, but when you try to produce a usable file, you get a file that appears to be the right size but has no video data in it.

    ShotCut will open these MOV files and you can scrub through the timeline and apply filters, but it will not crop them properly, it's difficult to get the DAR to be correctly encoded and ShotCut on Linux absolutely refuses to adhere to bit rate limits with regards to QS.

    I could have used ffmpeg, but while compiling ffmpeg with vaapi support is relatively easy, it is a big pain in the ass to get qsv working. And if you are going to compile ffmpeg from source, you may as well build it with qsv, because qsv exposes settings that substantially improve quality, such as trellis, mbbrc or extbrc - improvements I can notice under nearly all circumstances.

    In the end, I wanted to make this as fair and easy as possible, so I settled on this somewhat hacky methodology: load the source into Handbrake, let it crop, and export as lossless (CRF 0) x264+ultrafast. I then loaded that into AviDemux and did a bunch of encodes, all of them set for 10 mb/s. The x264 encode used the fast preset, which has all the psycho-visual BS enabled, but I only used a 1-pass constant bitrate setting. Why? Because Intel's encoder does not support 2-pass, and AviDemux does not expose 1-pass VBR for x264, nor does it expose a CRF+max bitrate setting. This may cripple the x264 encoder in some faithful's eyes, but Intel is also crippled, because AviDemux uses vaapi, which only allows a limited subset of settings to be used; AviDemux doesn't expose any quality settings at all for Intel, and QSV is capable of higher quality than VAAPI. I figured any supposed disadvantage bestowed on x264 was more than offset by the limitations imposed on Intel.
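    For anyone wanting to reproduce something like this two-stage pipeline without AviDemux, it could be sketched as a pair of ffmpeg command lines (command construction only; the filenames and exact crop geometry here are illustrative assumptions, not the ones used above):

```python
# Stage 1: lossless x264 intermediate (CRF 0, ultrafast), cropped to 3840x1600
stage1 = [
    "ffmpeg", "-i", "food_blogger.mov",        # hypothetical source filename
    "-vf", "crop=3840:1600",                   # crop the 2160 height to 1600
    "-c:v", "libx264", "-crf", "0", "-preset", "ultrafast",
    "intermediate.mkv",
]

# Stage 2: the 10 mb/s 1-pass CBR test encode described above
stage2 = [
    "ffmpeg", "-i", "intermediate.mkv",
    "-c:v", "libx264", "-preset", "fast",
    "-b:v", "10M", "-minrate", "10M", "-maxrate", "10M", "-bufsize", "20M",
    "out_cbr.mp4",
]

print(" ".join(stage1))
print(" ".join(stage2))
```

    Both stages use only standard ffmpeg/libx264 options; pinning minrate/maxrate/bufsize around the target is how CBR-like behavior is usually forced with libx264.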

    One more thing: I did so many encodes, and I walked away from my encoding PC, so I kind of forgot the settings I used for the Intel encodes. I know some of them used the default GOP=100 and 2 b-frames, but I also know I did one with GOP=250 and 3 b-frames (like x264), one with GOP=24 and no b-frames, and one with GOP=12 and 0 b-frames. I also know that for the last Intel AVC encode I added a denoise and sharpening filter.
  5. Originally Posted by lordsmurf View Post
    x264 doesn't even fare as well from lossless VHS encodes, and it often goes too soft and mushy without a high bitrate (much like TMPGEnc Plus was with MPEG encoding).
    I'm going to partially eat my words here.

    I revisited this today. The newest MC does slightly better with noisy MBAFF encoding, such as from VHS. It's better in some ways, and dramatically improved from previous MC generations. It's only a few tests, but from the trends I see, this appears to be the case now (I made the settings comparable: refs, b-frames, GOP size, etc.).

    The problem with x264's grain settings is that it tries to emulate the original "noise" pattern, but the result is slightly different, a slight deviation. Overall it can look quite close, but you can see differences when you go frame by frame or look closely. That's from the psy-rd and psy-trellis. It's almost as if it adds a bit of extra noise - maybe you could tweak it or turn it down a bit; I didn't play with it too much.

    MC is weaker and softer by default; more is lost (compared to x264 using grain settings; if you didn't use grain settings, then x264 is softer), but the type and position of what remains resembles the original pattern more closely - as if it was lightly denoised. MC is more mushy, especially in the shadows (you can change some of the AQ settings, but I didn't here).

    MC used to look like it just heavily denoised everything or blurred everything. I didn't measure metrics, but I'm quite surprised at the results, it's a lot closer now
    Last edited by poisondeathray; 3rd May 2020 at 16:22.
  6. Originally Posted by sophisticles View Post
    The x264 encode used the fast preset which has all the psycho-visual BS enabled but I only used a 1 pass constant bitrate setting.
    Why? Because Intel's encoder does not support a 2 pass and AviDemux does not expose a 1 pass vbr for x264 nor does it expose a CRF+max bitrate setting. This may cripple the x264 encoder in some faithful's eyes, but Intel is also crippled because AviDemux uses vaapi, which only allows for a limited subset of settings to be used, AviDemux doesn't expose any quality settings at all for Intel and QSV is capable of offering higher quality than VAAPI. I figured any supposed disadvantage bestowed on x264 was more than offset by the limitations imposed in Intel.
    Useless test. 1-pass CBR, fast preset?

    Don't use AviDemux if it cripples everything.

    Use settings that people would typically use, or design the test to show how the encoder is supposed to work. Not some artificial limitations.
  7. Originally Posted by sophisticles View Post
    Originally Posted by davexnet View Post
    I have a 2010 release Mainconcept Reference on an XP install. For lower bitrates (I chose 2400 kbps for 1280x720),
    it was clearly worse than today's X264, with noticeable difficulties in flat parts.

    However, I also downloaded a new demo of MC Totalcode and this is much improved. Which one is better now is not so obvious.

    The source is 1080p, movie preview, Bluray source. Resized in the encoders (MCTC 5.0 and Vidcoder X264 latest beta) to 720p.
    Thank you for this. If truth be told, I am unable to see a difference between any of them in the environment I am in now, sitting in my well-lit office in front of an AIO with overhead lights on and plenty of sunshine; all three look very similar to me.
    Did you get a chance to have a better look or are you hoping to ignore the result and nobody will notice? It's quite clear x264 retained more detail wherever there's movement.

    Originally Posted by sophisticles View Post
    I decided to do a couple of test encodes with sources that are usually not used, in this case Black Magic's ProRes samples that can be found here.
    I'm baffled as to what the samples you posted are supposed to show. That you can encode video at a decent bitrate and achieve a decent quality?

    Once again you chose to use settings for x264 that few people in the real world would ever use. At least not here. I only looked at the first two samples, as they don't have much to do with the discussion. The second one looked very marginally better, but all you've done is prove any encoder can produce a decent quality encode at a high enough bitrate, even after you crippled x264. You could've used 2-pass VBR encoding for x264, as you know it uses the same encoding method as CRF, and it'd be more relevant and closer to reality than the CBR encode you offered. Even ABR probably would've been better, although at least x264 still seemed to put the keyframes in sensible places, ie. the first frame of a new scene.

    How about some samples at low bitrates so we can see where the different encoders start to fall over, quality-wise?
    How about something that's not 4k? We're not all obsessed with it, the majority of people here probably don't have 4k monitors yet (I don't), and we know h265 is designed with 4k in mind, not h264.
    Last edited by hello_hello; 3rd May 2020 at 17:44.
  8. Dinosaur Supervisor KarMa's Avatar
    Join Date
    Jul 2015
    Location
    US
    Originally Posted by Mr_khyron View Post
    That's the first time I've seen that. Even Youtubers with millions of views don't get the x264 treatment. https://www.youtube.com/watch?v=MHaQETqabDg

    The same channel but with x264. It was uploaded in 2018 but has an x264 encode date of Jan 2020. Why the recent encoding? This content has already gotten most of the views it will ever get. https://www.youtube.com/watch?v=ywcY8TvES6c&t

    A much smaller channel without x264. https://www.youtube.com/watch?v=wME4_ozQ-LA

    The same small channel with one of their biggest videos, still no x264. https://www.youtube.com/watch?v=_sjZJ9i_mH0

    Code:
    General
    Complete name                  : C:\How to be an Online Voice Actor-_sjZJ9i_mH0.mp4.part
    Format                         : dash
    Codec ID                       : dash (iso6/avc1/mp41)
    File size                      : 805 KiB
    Duration                       : 5 s 380 ms
    Overall bit rate               : 1 227 kb/s
    Encoded date                   : UTC 2018-11-19 05:12:16
    Tagged date                    : UTC 2018-11-19 05:12:16
    IsTruncated                    : Yes
    
    Video
    ID                             : 1
    Format                         : AVC
    Format/Info                    : Advanced Video Codec
    Format profile                 : Main@L3.1
    Format settings                : CABAC / 3 Ref Frames
    Format settings, CABAC         : Yes
    Format settings, Reference fra : 3 frames
    Codec ID                       : avc1
    Codec ID/Info                  : Advanced Video Coding
    Duration                       : 5 s 380 ms
    Bit rate                       : 1 241 kb/s
    Width                          : 1 280 pixels
    Height                         : 720 pixels
    Display aspect ratio           : 16:9
    Frame rate mode                : Constant
    Frame rate                     : 23.976 (24000/1001) FPS
    Color space                    : YUV
    Chroma subsampling             : 4:2:0
    Bit depth                      : 8 bits
    Scan type                      : Progressive
    Bits/(Pixel*Frame)             : 0.056
    Stream size                    : 815 KiB
    Title                          : ISO Media file produced by Google Inc. Created on: 11/18/2018.
    Encoded date                   : UTC 2018-11-19 05:12:16
    Tagged date                    : UTC 2018-11-19 05:12:16
    Color range                    : Limited
    Color primaries                : BT.709
    Transfer characteristics       : BT.709
    Matrix coefficients            : BT.709
    Codec configuration box        : avcC
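    As a sanity check, the Bits/(Pixel*Frame) line in the MediaInfo report above is just the bitrate divided by pixel throughput. Recomputing it from the reported numbers (1241 kb/s, 1280x720, 23.976 fps):

```python
def bits_per_pixel_frame(bitrate_bps, width, height, fps):
    """MediaInfo's Bits/(Pixel*Frame): bitrate over pixels delivered per second."""
    return bitrate_bps / (width * height * fps)

# Values taken from the MediaInfo report above
bpp = bits_per_pixel_frame(1_241_000, 1280, 720, 24000 / 1001)
print(round(bpp, 3))  # matches the reported 0.056
```

    Low values like this are typical of streaming-tier encodes; BD-tier encodes run an order of magnitude higher.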
    If I myself upload something, I will NOT get the x264 treatment. Even videos dated back a few years are still non-x264. I got VP9 added recently, but not x264. The same thing will be true for most people who use the platform.

    So I take it back, they do currently use x264, just only for their very top-viewed content. Which the majority reading this will never attain, given Google's current gatekeeping of x264. Which then begs the obvious question: why would YouTube limit x264 usage to only their top-viewed content? It's almost like Google thinks x264 is better than their hardware encoding setup. Weird.
    Last edited by KarMa; 3rd May 2020 at 16:34.
  9. Originally Posted by KarMa View Post

    If I myself upload something, I will NOT get the x264 treatment. Even videos dated back a few years are still non-x264. I got VP9 added recently, but not x264. The same thing will be true for most people who use the platform.

    So I take it back, they do currently use x264, just only for their very top-viewed content. Which the majority reading this will never attain, given Google's current gatekeeping of x264. Which then begs the obvious question: why would YouTube limit x264 usage to only their top-viewed content? It's almost like Google thinks x264 is better than their hardware encoding setup. Weird.
    I'm not sure if that's the entire explanation; it doesn't quite fully explain it.

    I checked a few old videos on my test account and most of mine don't have the x264 writing library.


    But a few do. 40 views, unlisted, 4 years old. But "3D".
    eg.
    https://www.youtube.com/watch?v=5t46EQ9DLFI

    Code:
    Video
    ID                                       : 1
    Format                                   : AVC
    Format/Info                              : Advanced Video Codec
    Format profile                           : High@L4
    Format settings                          : CABAC / 2 Ref Frames
    Format settings, CABAC                   : Yes
    Format settings, Reference frames        : 2 frames
    Codec ID                                 : avc1
    Codec ID/Info                            : Advanced Video Coding
    Duration                                 : 46 s 713 ms
    Bit rate                                 : 2 599 kb/s
    Width                                    : 1 920 pixels
    Height                                   : 1 080 pixels
    Display aspect ratio                     : 16:9
    Frame rate mode                          : Constant
    Frame rate                               : 23.976 (24000/1001) FPS
    Color space                              : YUV
    Chroma subsampling                       : 4:2:0
    Bit depth                                : 8 bits
    Scan type                                : Progressive
    Bits/(Pixel*Frame)                       : 0.052
    Stream size                              : 14.5 MiB (100%)
    Title                                    : ISO Media file produced by Google Inc. Created on: 04/04/2019.
    Writing library                          : x264 core 155 r2901 7d0ff22
    Encoded date                             : UTC 2019-04-05 02:54:39
    Tagged date                              : UTC 2019-04-05 02:54:39
    Color range                              : Limited
    Color primaries                          : BT.709
    Transfer characteristics                 : BT.709
    Matrix coefficients                      : BT.709
    Codec configuration box                  : avcC
    Duration does not seem to play a role; that was 46 sec. I was thinking maybe only longer videos got the treatment.

    Something else - that was uploaded 4 years ago, but the encoded/tagged date is UTC 2019-04-05 02:54:39 according to MediaInfo. Did they "redo" some videos? Like they redid VP9 for some videos? And why this one? It's unlisted, and I disabled everything like adverts, comments, etc. It's a test account.

    But when I checked some older ones, also few years ago, they have UTC date/time basically right now (or at least today), but no x264. Probably a live encode.
    Last edited by poisondeathray; 3rd May 2020 at 17:03.
  10. Capturing Memories dellsam34's Avatar
    Join Date
    Jan 2016
    Location
    Member Since 2005, Re-joined in 2016
    Why are threads turning violent? Can't we just get along and help each other?
  11. Originally Posted by poisondeathray View Post
    Useless test. 1-pass CBR, fast preset?

    Don't use AviDemux if it cripples everything.

    Use settings that people would typically use, or design the test to show how the encoder is supposed to work. Not some artificial limitations.
    This is the type of crap that drives me absolutely crazy:

    I am not a mind reader and I do not know what settings "people would typically use"; I don't even know what that means. If you are referring to the highly over-rated CRF mode, we have gone over this ad nauseam, including posting source code that shows the mode some seem to think is some sort of magical, miraculous invention is really simply ABR mode with QP shoe-horned in.

    As for this silly statement "design the test that shows how the encoder is supposed to work", how is x264 supposed to work? The developers went through the effort to write the code that allows these settings to be used, if they didn't want their over-hyped encoder used in this fashion then why bother writing the code that allows it?

    But you just like outing yourself as a zealot every chance you get, like all the other x264 worshipers: if some test produces results they don't like, they have a fit.

    But I'm feeling very generous, so here's what I did for you and the rest: I cheated in x264's favor. Here's a 2-pass 10 mb/s x264 encode with preset slow and tune film. How does that grab you? Have I biased the test in x264's favor enough?

    @hello_hello: You consider 10 mb/s for 3840x1600 4k to be too much bit rate? This is where the x264 faithful and I differ. The Blu-Ray spec allows up to 54 mb/s, for video that can be a maximum of 1920x1080p60; UHD BD allows up to 144 mb/s, for video that can be a maximum of 3840x2160p60. Yet using more than 5 times less bitrate than BD, for a resolution with roughly 3 times as many pixels as BD, is somehow considered "a decent amount".

    People complain about the quality of Blu-Ray and UHD BD all the time and you guys want to use less than 10 mb/s for 4k?

    Just to prove you guys wrong, I also did more tests: x264 + crf 23 + tune film + medium, which I'm sure you will agree is closer to "what most people would use". Want to guess how much bit rate x264 used? 19.3 mb/s. So in the interest of fairness I did Intel AVC and Intel HEVC encodes at that bit rate. You will notice that the Intel encoder used less bit rate than x264 and still managed to match the quality of the vastly over-rated x264.

    Go ahead and tell me how x264 is vastly superior to all other encoders
    Image Attached Files
  12. Originally Posted by KarMa View Post
    If I myself upload something, I will NOT get the x264 treatment. Even videos dated back a few years are still non-x264. I got VP9 added recently, but not x264. The same thing will be true for most people who use the platform.

    So I take it back, they do currently use x264, just only for their very top-viewed content. Which the majority reading this will never attain, given Google's current gatekeeping of x264. Which then begs the obvious question: why would YouTube limit x264 usage to only their top-viewed content? It's almost like Google thinks x264 is better than their hardware encoding setup. Weird.
    Dude, I don't know what you are smoking, but I'm telling you as a man who has worked in ACDS: get some help. You actually think that Google goes through the trouble of using 2 different AVC encoders, a hardware encoder and x264, for content that is "top viewed"?

    You can't be serious, my God man, get a grip.
  13. Originally Posted by poisondeathray View Post
    Originally Posted by KarMa View Post

    If I myself upload something, I will NOT get the x264 treatment. Even videos dated back a few years are still non-x264. I got VP9 added recently, but not x264. The same thing will be true for most people who use the platform.

    So I take it back, they do currently use x264, just only for their very top-viewed content. Which the majority reading this will never attain, given Google's current gatekeeping of x264. Which then begs the obvious question: why would YouTube limit x264 usage to only their top-viewed content? It's almost like Google thinks x264 is better than their hardware encoding setup. Weird.
    I'm not sure if that's the entire explanation; it doesn't quite fully explain it.

    I checked a few old videos on my test account and most of mine don't have the x264 writing library.

    But a few do. 40 views, unlisted, 4 years old. But "3D".
    eg.
    https://www.youtube.com/watch?v=5t46EQ9DLFI

    Duration does not seem to play a role; that was 46 sec. I was thinking maybe only longer videos got the treatment.

    Something else - that was uploaded 4 years ago, but the encoded/tagged date is UTC 2019-04-05 02:54:39 according to MediaInfo. Did they "redo" some videos? Like they redid VP9 for some videos? And why this one? It's unlisted, and I disabled everything like adverts, comments, etc. It's a test account.

    But when I checked some older ones, also few years ago, they have UTC date/time basically right now (or at least today), but no x264. Probably a live encode.
    Are you also smoking something hallucinogenic? Are you familiar with Occam's razor? The most likely explanation is that they strip the metadata from some streams. My guess is that they use a massive encoding farm for all the videos that get posted and are probably running multiple instances of ffmpeg; someone probably used a slightly different command line for the ffmpeg on some servers vs what's used on other servers.

    They've been using x264 for years, I see no reason for them to use 2 different AVC encoders.
  14. Originally Posted by sophisticles View Post


    This is the type of crap that drives me absolutely crazy:

    I am not a mind reader and I do not know what settings "people would typically use"; I don't even know what that means. If you are referring to the highly over-rated CRF mode, we have gone over this ad nauseam, including posting source code that shows the mode some seem to think is some sort of magical, miraculous invention is really simply ABR mode with QP shoe-horned in.

    As for this silly statement "design the test that shows how the encoder is supposed to work", how is x264 supposed to work? The developers went through the effort to write the code that allows these settings to be used, if they didn't want their over-hyped encoder used in this fashion then why bother writing the code that allows it?

    But you just like outing yourself every chance you get as a zealot, like all the other x264 worshipers, if some test produces results that they don't like, they have a fit.

    But I'm feeling very generous, so here's what I did for you and the rest: I cheated in x264's favor. Here's a 2-pass 10 mb/s x264 encode with preset slow and tune film. How does that grab you? Have I biased the test in x264's favor enough?
    How's that "in favor"? That's using it under normal operation. You either use CRF or 2-pass. You compare directly at the same bitrates, or do multiple encodes and plot RD curves. This is basic comparison methodology 101. This is what every test uses. It's what our Netflix Meridian test used (2-pass for everything).

    1-pass ABR is always worse for any encoder; there is no reason to use it. 1-pass CBR is absurd. You don't handicap an encoder on purpose. You don't turn off a few cylinders in a V8 engine during a race.
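    The "plot RD curves" methodology boils down to: encode each contender at several bitrates, measure quality at each point, and see whose curve sits higher. A toy sketch with made-up (bitrate, PSNR) points, purely to illustrate the comparison, not real encoder data:

```python
def quality_at(curve, rate):
    """Linearly interpolate a quality metric on a sorted (bitrate, quality) curve."""
    pts = sorted(curve)
    for (r0, q0), (r1, q1) in zip(pts, pts[1:]):
        if r0 <= rate <= r1:
            return q0 + (q1 - q0) * (rate - r0) / (r1 - r0)
    raise ValueError("rate outside measured range")

# Hypothetical curves: encoder A needs ~1.5x the bitrate of B for the same PSNR
enc_a = [(3000, 34.0), (6000, 37.0), (12000, 40.0)]  # (kb/s, dB)
enc_b = [(2000, 34.0), (4000, 37.0), (8000, 40.0)]

# At the same 6000 kb/s, B's curve sits higher than A's
print(quality_at(enc_a, 6000), quality_at(enc_b, 6000))
```

    Real comparisons refine this into BD-rate numbers, but the principle is the same: same rate, compare quality, across the whole curve rather than one cherry-picked point.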

    You're just being plain silly now

    Just to prove you guys wrong, I also did more tests: x264 + crf 23 + tune film + medium, which I'm sure you will agree is closer to "what most people would use". Want to guess how much bit rate x264 used? 19.3 mb/s. So in the interest of fairness I did Intel AVC and Intel HEVC encodes at that bit rate. You will notice that the Intel encoder used less bit rate than x264 and still managed to match the quality of the vastly over-rated x264.
    What are you looking at ?

    The quality is significantly worse for Intel AVC; too many details and textures are blurred away.

    But it's not a fair direct comparison; the bitrate is too low for Intel AVC.



    Originally Posted by sophisticles View Post
    Are you also smoking something hallucinogenic? Are you familiar with Occam's razor? The most likely explanation is that they strip the metadata from some streams; my guess is that they use a massive encoding farm for all the videos that get posted and are probably running multiple instances of ffmpeg, and someone probably used a slightly different command line for the ffmpeg used by some servers vs what's used by other servers.

    They've been using x264 for years, I see no reason for them to use 2 different AVC encoders.
    You're just guessing. Why strip metadata for some but not others? It doesn't make sense.

    I thought/assumed they used x264 too, for everything AVC
  15. KarMa
    Originally Posted by poisondeathray View Post
    I'm not sure that's the entire explanation; it doesn't quite fully explain it.

    I checked a few old videos on my test account and most of mine don't have the x264 writing library.

    But a few do: 40 views, unlisted, 4 years old. But "3D".
    I know you can force Youtube to encode your videos with VP9 if you do 1440p or higher uploads. Maybe they treat 3D in a similar fashion; if they assume your content is 3D, they might treat it differently. Are all of these x264 videos of yours 3D?
  16. lordsmurf
    Originally Posted by KarMa View Post
    Which then begs the obvious question, why would Youtube limit x264 usage to only their top viewed content. It's almost like Google thinks x264 is better than their hardware encoding setup, weird.
    Good question, but the supposed answer is a leap. The main reason x264 is used is due to licensing costs, period.

    Originally Posted by sophisticles View Post
    I do not know what settings "people would typically use", I don't even know what that means.
    I equally have no clue what this means.

    H.264 has so many uses that there is not a "typical" usage, as there was with MPEG (and specs like DVD-Video).

    Originally Posted by poisondeathray View Post
    the less fine details
    You focus on detail almost 100%, and that's fine. But there's more to encoding than detail. (And detail isn't always detail anyway, but rather noise that is perceived as detail.)

    Originally Posted by poisondeathray View Post
    I revisited this today. The newest MC does slightly better with noisy MBAFF encoding, such as from VHS.
    MC is weaker and softer by default; more is lost (compared to x264 using grain settings; if you didn't use grain settings, then x264 is softer), but the type and position of the grain resemble the original pattern more closely, as if it was lightly denoised.
    MainConcept can appear softer at times, but it also appears more true/accurate to the source, whereas x264 appears more fake/processed. It's never a stark difference, but it's the little things, especially as bitrates increase into broadcast ranges.

    I always tweak both MC and x264 settings, starting from my custom templates and tuning to the sources.

    I actually do use x264 for encoding VHS quite a bit these days, but only because Hybrid includes Avisynth support. It's a huge time saver. So by the time the x264 encoder gets the video, it's been processed by Avisynth anyway (always QTGMC, others as needed).

    Only when not processing, or when needing other/specific profiles, do I use MainConcept.

    I'm not for or against x264, I just realize the strengths and weaknesses, and use as needed. It's just a tool for encoding. Sometimes it's the best tool, sometimes not.
    Want my help? Ask here! (not via PM!)
    FAQs: Best Blank DiscsBest TBCsBest VCRs for captureRestore VHS
  17. Originally Posted by KarMa View Post
    I know you can force Youtube to encode your videos with VP9 if you do 1440p or higher uploads. Maybe they treat 3D in a similar fashion; if they assume your content is 3D, they might treat it differently. Are all of these x264 videos of yours 3D?
    Yes, I now get the VP9 option on practically all of them, even 720p uploads.

    I'm looking, but I can't find any with the "x264" tag except 3D ones. So 3D might have something to do with it. I also checked around the same time period as the confirmed 3D uploads in case they were phasing it in/out during that time. I'm still looking.
  18. Originally Posted by poisondeathray View Post
    What are you looking at ?

    The quality is significantly worse for Intel AVC, too much details and textures blurred away.

    But it's not comparable for a direct comparison, the bitrate is too low for Intel AVC
    What are you looking at? I redid the Intel encodes to more closely approximate the x264 CRF one, and while the Intel encoder did undershoot the target bit rate, it was relatively close.

    And I have already said that I believed the Intel AVC encoder was broken on Linux, but it still surprised me in this test. The x264, Intel AVC, and Intel HEVC encodes all look very similar to me; I defy you to point out any glaring differences that swing the test in x264's favor in any meaningful way.
  19. If anyone has a Coffee Lake based CPU or a Navi based GPU and wants to add their encodes feel free to do it.
  20. What for? Just to hear that you cannot see it because it is too sunny, or you are on the laptop, or you cannot see it.

    Then you initiate a switcheroo and start to talk about something else.
  21. Originally Posted by lordsmurf View Post
    You focus on detail almost 100%, and that's fine. But there's more to encoding than detail. (And detail isn't always detail anyway, but rather noise that is perceived as detail.)
    Of course there is more to encoding.

    I emphasize it because this is a major point that separates encoders. You can adjust a good encoder to oversmooth, but you can't adjust a weak encoder to retain actual details (at some given bitrate).

    If an encoder drops entire objects (eyes, buttons, rocks, etc.), those are actual details dropped and a poor result. Obviously that's not enough bitrate for that encoder in that scenario.

    "Noise" is part of the original signal. Encoder A that preserves that noise at a given bitrate is a technically better encoder than Encoder B that does not

    When shadows are mushy and devoid of noise and detail but the source had them, that's quite dissimilar.

    But if the noise pattern is slightly different (x264), then that too is dissimilar.

    I'd rather keep the actual details with slightly more noise or a different grain pattern than drop real details like objects and textures. If I wanted something smooth I'd preprocess and denoise it properly; you'd get better results than letting an encoder denoise for you.


    What is "high definition" in layman's terms? Details. Small visible details like vellus hair, pores, fine fabric patterns. Clear individual hair strands and frizz, grass blades instead of blobby mush. That's what separates high vs. low quality: high frequency detail. If you have a real HD or UHD source vs. an SD upscale, how would you describe the difference? The high frequency details. They are missing in the upscale, which is smooth, devoid of details. Fine details are the first thing to go with low quality encoders.
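What "high frequency detail" means can be put in numbers with a crude sketch: sum the squared differences between neighbouring luma samples before and after smoothing. The pixel values and the box blur below are made up for illustration; a weak encoder doesn't literally box-blur, but dropped texture leaves the same numeric signature.

```python
def high_freq_energy(row):
    # Crude high-frequency measure: sum of squared differences
    # between neighbouring pixels in one row of luma samples.
    return sum((b - a) ** 2 for a, b in zip(row, row[1:]))

def box_blur(row):
    # 3-tap box filter standing in for the smoothing effect of a
    # detail-dropping encoder. Edge samples are left untouched.
    out = [row[0]]
    for i in range(1, len(row) - 1):
        out.append((row[i - 1] + row[i] + row[i + 1]) / 3)
    out.append(row[-1])
    return out

detailed = [100, 140, 90, 150, 80, 160, 95, 145]  # grain/texture
smoothed = box_blur(detailed)

print(high_freq_energy(detailed))   # 25725
print(high_freq_energy(smoothed))   # far lower: the "detail" is gone
```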
  22. Originally Posted by sophisticles View Post
    And I have already said that the I believed the Intel AVC encoder was broken on Linux but it still surprised me in this test, to me the x264 and both Intel AVC and Intel HEVC look very similar to me, I defy you to point out any glaring differences that swing the test in x264's favor in any meaningful way.


    Compare it to the original. Look at the missing details, like clothing textures, hair, stubble, even the noise (the source has that low-light small-sensor noise the BMPCC is known for).

    Both are softer, but Intel AVC more so

    It's not a proper direct comparison; x264 is using more bitrate. Acceptable is usually a +/-0.5% difference.
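That rule of thumb is easy to apply before declaring a winner. The helper below is hypothetical (not from any tool mentioned in this thread); it just encodes the +/-0.5% tolerance as a check:

```python
def fair_comparison(bitrate_a, bitrate_b, tolerance=0.005):
    # Encodes are directly comparable only if their bitrates match
    # within the tolerance (0.5% by default, per the rule of thumb).
    ref = max(bitrate_a, bitrate_b)
    return abs(bitrate_a - bitrate_b) / ref <= tolerance

print(fair_comparison(19_300_000, 19_250_000))  # ~0.26% apart -> True
print(fair_comparison(19_300_000, 10_000_000))  # ~48% apart -> False
```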


    The Intel AVC 2 encode doesn't look "broken"; it just looks softer and less similar to the source. The b-frames are noticeably lower in quality, visibly softer than the p's. I'm wondering if it was like that or you screwed with the settings?
  23. Originally Posted by sophisticles View Post
    This is the type of crap that drives me absolutely crazy

    I am not a mind reader and I do not know what settings "people would typically use", I don't even know what that means.
    That's a lie.

    Originally Posted by sophisticles View Post
    Just to prove you guys wrong, I also did more tests, x264+crf 23+tune film+medium, which I'm sure you will agree is closer to "what most people would use"
    You do know. Or was it just a lucky guess?

    Originally Posted by sophisticles View Post
    @hello_hello: You consider 10 mb/s for 3840x1600 4k to be too much bit rate? This is where the x264 faithful and I differ. The Blu-Ray spec allows for up to 54 mb/s for video which can be a maximum of 1920x1080p60, UHD BD allows for up to 144 mb/s which can be a maximum of 3840x1600p60, yet using greater than 5 times less bit rate than BD for a resolution that is 4 times bigger than BD is somehow considered "a decent amount".
    Wow, you certainly confabulated a lot from my statement that encoders should produce a decent encode at a decent bitrate.
    You didn't include the source, so I just assumed the quality was pretty high each time. BTW, your source is one third the frame rate of the maximum frame rates you quoted.

    Originally Posted by sophisticles View Post
    People complain about the quality of Blu-Ray and UHD BD all the time and you guys want to use less than 10 mb/s for 4k?
    The complaints don't necessarily have anything to do with the quality of the encoder. Do you have an example of a complaint about encoding quality?

    Originally Posted by sophisticles View Post
    So in the interest of fairness i did Intel AVC and Intel HEVC encode at that bit rate. You will notice that the Intel encoder used less bit rate than x264 and still managed to match the quality of the vastly over-rated x264.
    Go ahead and tell me how x264 is vastly superior to all other encoders
    Vastly superior?? Who said that?

    I'm not actually sure what we're supposed to be comparing any more, as your comparisons are never between apples and apples, but it's obvious the "BMPCC_Food_Blogger x264 cheat settings" encode from your last post has slightly more "fine" detail than the "BMPCC_Food_Blogger Intel AVC VA-API 1" encode from your previous post, and I'm viewing them both scaled down to 1080p on a 1080p monitor.

    For the encodes with twice the bitrate, I can see the x264 encode has been encoded differently compared to the "cheat" encode, but I can't say whether that's because of the extra bitrate or simply because you can't even manage to use the same encoder settings twice. Doubling the bitrate didn't help the new "BMPCC_Food_Blogger Intel AVC" encode retain as much fine detail as either of the x264 encodes. All screenshots are only part of the frame so as not to resize when viewing them on my monitor. Hover over the thumbnails for the tooltip and the names of the screenshots, as they're not labelled, although the file sizes might also give you a hint. I can even see the difference on my CRT.

    If the Intel AVC encoder is broken on Linux but looks very similar to the Intel HEVC encoder, what does it say about the latter?
    Image Attached Thumbnails: BMPCC_Food_Blogger x264 cheat settings.png, BMPCC_Food_Blogger x264 CRF.png, BMPCC_Food_Blogger Intel AVC VA-API 1.png, BMPCC_Food_Blogger Intel AVC.png

    Last edited by hello_hello; 4th May 2020 at 04:33.
  24. KarMa
    Originally Posted by lordsmurf View Post
    Originally Posted by KarMa View Post
    Which then begs the obvious question, why would Youtube limit x264 usage to only their top viewed content. It's almost like Google thinks x264 is better than their hardware encoding setup, weird.
    Good question, but the supposed answer is a leap. The main reason x264 is used is due to licensing costs, period.
    Interesting stance, that Google can't afford the best AVC encoder when their entire business model depends on efficiency. I'd think the owner of this mystical hidden AVC encoder would happily make a deal with Google. We are talking about a company that singlehandedly developed VP9 on their own and was the linchpin behind the development of VP10/AV1. If you want to start throwing in AVC samples to back up your claims, I'm happy to contribute. Otherwise you are just walking down the same convenient unanswerable logic of sophisticles, period.
  25. lordsmurf
    Originally Posted by KarMa View Post
    Originally Posted by lordsmurf View Post
    Originally Posted by KarMa View Post
    Which then begs the obvious question, why would Youtube limit x264 usage to only their top viewed content. It's almost like Google thinks x264 is better than their hardware encoding setup, weird.
    Good question, but the supposed answer is a leap. The main reason x264 is used is due to licensing costs, period.
    Interesting stance that Google can't afford
    It has nothing to do with the ability to afford.
    It has everything to do with willingness to pay money.

    Google would rather develop something in-house than pay licensing fees. I can't necessarily blame them. But that can run afoul of industry standards (what everybody else decides to do). It goes far beyond x264, or even video. With the non-H.264 formats, Google lost. So when it came to using H.264, Google is using the freeware. Google further feuded with MPEG-LA for years over licensing costs.

    I've not followed formats bickering for several years now, but it's always all about money and control.
  26. Originally Posted by lordsmurf View Post
    Originally Posted by KarMa View Post
    Originally Posted by lordsmurf View Post
    Originally Posted by KarMa View Post
    Which then begs the obvious question, why would Youtube limit x264 usage to only their top viewed content. It's almost like Google thinks x264 is better than their hardware encoding setup, weird.
    Good question, but the supposed answer is a leap. The main reason x264 is used is due to licensing costs, period.
    Interesting stance that Google can't afford
    It has nothing to do with the ability to afford.
    It has everything to do with willingness to pay money.

    Google would rather develop something in-house than pay licensing fees. I can't necessarily blame them. But that can run afoul of industry standards (what everybody else decides to do). It goes far beyond x264, or even video. With the non-H.264 formats, Google lost. So when it came to using H.264, Google is using the freeware. Google further feuded with MPEG-LA for years over licensing costs.

    I've not followed formats bickering for several years now, but it's always all about money and control.
    Youtube is not using the GPL version of x264; they are licensing it.
    x264 has a dual license: https://x264.org/en/
  27. Originally Posted by hello_hello View Post
    Vastly superior?? Who said that?
    You guys, the x264 faithful, nearly all the time. PDR is like an x264 cheerleader; he goes on and on about how "x264 has won every test every time" and with some supposedly huge margin.

    Originally Posted by hello_hello View Post
    I'm not actually sure what we're supposed to be comparing any more, as your comparisons are never between apples and apples, but it's obvious the "BMPCC_Food_Blogger x264 cheat settings" encode from your last post has slightly more "fine" detail than the "BMPCC_Food_Blogger Intel AVC VA-API 1" encode from your previous post, and I'm viewing them both scaled down to 1080p on a 1080p monitor.
    Do you know what the sad thing is? The x264 "cheat" encode was done with 2 pass vbr, preset slow, profile high, while the Intel AVC VA-API 1 is using 1 pass, because Intel's encoders do not support a 2 pass mode. So the highly vaunted x264 encoder, with the settings cranked up, managed to have slightly more fine detail than an encoder that's hobbled by the implementation in AviDemux and which I truly believe is fundamentally broken on Linux.

    Originally Posted by hello_hello View Post
    For the encodes with twice the bitrate, I can see the x264 encode has been encoded differently compared to the "cheat" encode", but I can't say that's because of the extra bitrate or simply because you can't even manage to use the same encoder settings twice, but doubling the bitrate didn't help the new "BMPCC_Food_Blogger Intel AVC" encode retain as much fine detail as either of the x264 encodes. All screenshots are only part of the frame so as not to resize when viewing them on my monitor. Hover over the thumbnails for the tooltip and the names of the screenshots as they're not labelled, although the file sizes might also give you a hint. I can even see the difference on my CRT.
    The "cheat" encode was done as a way of satisfying the people complaining about my using a 1 pass CBR for the first x264 encode. I "cheated" by using 2 pass vbr + preset slow + profile high, which are settings that should completely blow Intel's AVC encoder out of the water. The sad thing is, by your own admission, it doesn't; it simply retains slightly more fine detail. Those are your words, correct?

    Originally Posted by hello_hello View Post
    If the Intel AVC encoder is broken on Linux but looks very similar to the Intel HEVC encoder, what does it say about the latter?
    I based my "broken" comments on tests that I have done using the Netflix sources found on Xiph, specifically the drone clip with Handbrake. No matter what I do, the Intel AVC encoder falls apart; it macroblocks badly no matter how much bit rate I throw at it, and I have seen similar behavior with other clips.

    It could be that this particular source is very friendly to this encoder, which happens from time to time.

    Later this week I will have some free time. I am going to compile ffmpeg with QSV support, not vaapi, which allows the Intel encoder to show what it's capable of, and I will revisit this test.
  28. Originally Posted by sophisticles View Post

    You guys, the x264 faithful, nearly all the time. PDR is like an x264 cheerleader; he goes on and on about how "x264 has won every test every time" and with some supposedly huge margin.

    Later this week I will have some free time. I am going to compile ffmpeg with QSV support, not vaapi, which allows the Intel encoder to show what it's capable of, and I will revisit this test.
    I said it won almost every test back in the period when it dominated, or came near the top. How is that overrated?

    So finish the test. What bitrate is required by Intel to come close to x264 using those settings? Is it 1.2x, 1.5x? What is the margin in that test today, using those settings? Is an encoder that requires 1.2x more bitrate significant? 1.5x? It probably depends on the person and the situation.

    Or if you want to put it another way, how low can you drop x264's bitrate to match Intel's quality?

    Then add some other tests. Use some other settings like veryslow, more b-frames. Same for Intel; it might not be using ideal settings, as the b-frames look noticeably worse.

    You can add some PSNR runs too (using PSNR settings) if that's important to you. Do multiple encodes for PSNR, then plot the RD curves. Same for SSIM.
    That's what Intel uses to justify the encoder in its marketing materials (PSNR).

    Then check some other sources: cartoons, home video, sports, drone videos, etc.
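The methodology above boils down to collecting one (bitrate, PSNR) point per encode and plotting the points for each encoder as an RD curve. A minimal pure-Python PSNR for 8-bit samples, with made-up frame values (a real run would measure whole sequences, e.g. with ffmpeg's psnr filter):

```python
import math

def psnr(ref, test, peak=255):
    # PSNR in dB between two equally sized lists of 8-bit luma samples.
    mse = sum((r - t) ** 2 for r, t in zip(ref, test)) / len(ref)
    if mse == 0:
        return float("inf")   # identical frames
    return 10 * math.log10(peak ** 2 / mse)

# Made-up sample values standing in for a decoded frame vs the source.
ref_frame  = [100, 102, 98, 101, 99, 103]
test_frame = [100, 101, 99, 101, 98, 104]

# One (bitrate, PSNR) point per encode; several encodes at different
# bitrates per encoder give the RD curve to plot.
rd_point = (5_000_000, psnr(ref_frame, test_frame))
print(rd_point)   # roughly 49.9 dB at 5 mb/s
```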



    The "cheat" encode was done as a way of satisfying the people complaining about my using a 1 pass CBR for the first x264 encode
    Don't be silly

    How is that "complaining"? Does any encoder test in the past 20 years use CBR? Does any MPEG2 test?

    2pass is always used. CRF is only allowed because it gives similar quality at equivalent bitrates to 2pass
  29. Originally Posted by poisondeathray View Post

    How is that "complaining"? Does any encoder test in the past 20 years use CBR? Does any MPEG2 test?

    2pass is always used. CRF is only allowed because it gives similar quality at equivalent bitrates to 2pass
    Yes, CBR was used extensively back in the MPEG-2 days, and I think many of the issues people have with quality are because they moved away from CBR encoding. Think about what VBR does: in theory, it uses less bit rate in areas that "need" less bit rate so that it can be used in areas that "need" more bit rate. But this is idiotic, because encoders use higher quantizers for P frames than I frames, and higher still for B frames than P frames, with the theory that somehow the I frames become higher quality references for the other frames. When this didn't work, they started doing 2 and 3 pass vbr as a way of evening out the bit rate fluctuations and quality. They could have just stuck with CBR and avoided all the problems.

    But that wasn't enough. Some snot-nosed little punk who didn't even have his comp sci degree yet decided he was going to crap all over the video scene, took over the x264 project, and introduced that absurd CRF mode, which is nothing more than a Frankenstein marriage of single pass ABR with some QP pixie dust sprinkled on top, and the lemmings went gaga over it.

    Best part is that those that swear by CRF are the same people that swear that a computer can't be trusted to determine quality, "only your eyes can be trusted", but when someone's eyes disagree with theirs then there's something wrong with that person's setup or eyes or brain or whatever else. At the same time, the same people that don't trust PSNR, SSIM, VMAF, or anything else have no problem putting their faith in CRF mode, in which by definition you choose a quality and let the encoder decide. But no one can tell me how that quality is measured, nor how the encoder goes about determining if it's achieving that quality.

    Talk about a bunch of BS.
  30. Originally Posted by sophisticles View Post
    Originally Posted by poisondeathray View Post

    How is that "complaining"? Does any encoder test in the past 20 years use CBR? Does any MPEG2 test?

    2pass is always used. CRF is only allowed because it gives similar quality at equivalent bitrates to 2pass
    Yes, CBR was used extensively back in the MPEG-2 days, and I think many of the issues people have with quality are because they moved away from CBR encoding. Think about what VBR does: in theory, it uses less bit rate in areas that "need" less bit rate so that it can be used in areas that "need" more bit rate. But this is idiotic, because encoders use higher quantizers for P frames than I frames, and higher still for B frames than P frames, with the theory that somehow the I frames become higher quality references for the other frames. When this didn't work, they started doing 2 and 3 pass vbr as a way of evening out the bit rate fluctuations and quality. They could have just stuck with CBR and avoided all the problems.

    But that wasn't enough. Some snot-nosed little punk who didn't even have his comp sci degree yet decided he was going to crap all over the video scene, took over the x264 project, and introduced that absurd CRF mode, which is nothing more than a Frankenstein marriage of single pass ABR with some QP pixie dust sprinkled on top, and the lemmings went gaga over it.

    BS. Don't be absurd

    What retail DVD uses CBR encoding ? What Blu-ray uses CBR encoding ? What internet delivery stream uses CBR encoding ?

    CBR is less efficient than VBR. True or False ?

    When we're talking about compression efficiency tests, you never use CBR, let alone true CBR (adding stuffing packets).




    Best part is that those that swear by CRF are the same people that swear that a computer can't be trusted to determine quality, "only your eyes can be trusted", but when someone's eyes disagree with theirs then there's something wrong with that person's setup or eyes or brain or whatever else. At the same time, the same people that don't trust PSNR, SSIM, VMAF, or anything else have no problem putting their faith in CRF mode, in which by definition you choose a quality and let the encoder decide. But no one can tell me how that quality is measured, nor how the encoder goes about determining if it's achieving that quality.

    Talk about a bunch of BS.

    You're demonstrating your ignorance again

    CRF is a rate control method. That's it.

    PSNR, SSIM, VMAF are quality metrics

    CRF supposedly gives similar "quality" at the same bitrate as 2 pass. They use similar algorithms. That's what the developers claim (and it appears to be true, or very close). Use 2 pass if you want to. That's what I do for these comparisons. 2pass is always a valid method of rate control.

    CBR, however, is an invalid rate control method, because it's inefficient. That's the crux of the problem and why I objected. This is true for any encoder. It's not used for lossy encoding scenarios; it's not applicable and has limited relevance. If you used Mainconcept or encoder x, y, z in CBR mode, I would complain too. But you're making this only about x264, which is silly.

    These are simple concepts; you know this.

    Get over your little grudge already.


