VideoHelp Forum




  1. Originally Posted by sophisticles View Post
    Do you know what the sad thing is? The x264 "cheat" encode was done with 2 pass VBR, preset slow, profile high, while the Intel AVC VA-API 1 is using 1 pass, because Intel's encoders do not support a 2 pass mode. So, the highly vaunted x264 encoder, with the settings cranked up, managed to have slightly more fine detail than an encoder that's hobbled in the AviDemux implementation and which I truly believe is fundamentally broken on Linux.
    CRF and 2 pass encode the video the same way. 2 pass might have a slight advantage for short clips at stupidly low bitrates because it doesn't have to guess at I-frame bonus, but for the purpose of your tests you could've used 2 pass for x264 instead of CRF.
    CRF doesn't need to adjust the quality to hit the target bitrate, which 2 pass may need to do, but CRF and 2 pass are pretty interchangeable for these types of comparisons.

    x264 may have only retained some more fine detail, but you said the AVC and HEVC encodes look pretty similar (I haven't downloaded the other samples), so if that's the case, does it mean an AVC encoder, not designed for 4K, is doing better than an HEVC encoder based on the h265 spec, which was designed with 4K/8K in mind?

    Originally Posted by sophisticles View Post
    The "cheat" encode was done as a way of satisfying the people complaining about my using a 1 pass CBR for the first x264 encode. I "cheated" by using 2 pass VBR + preset slow + profile high, which are settings that should completely blow Intel's AVC encoder out of the water. The sad thing is, by your own admission, it doesn't; it simply retains slightly more fine detail. Those are your words, correct?
    So far, at a decent bitrate. I suspect that'll change at low bitrates. Why should x264 completely blow the others away at a decent bitrate though? Don't we all agree if you encode a clean source at a decent bitrate, any encoder should be capable of a decent quality encode? Because I assume that's what happened.
    Last edited by hello_hello; 4th May 2020 at 16:28.
  2. Originally Posted by poisondeathray View Post
    CRF is a rate control method. That's it.

    PSNR, SSIM, VMAF are quality metrics

    CRF supposedly gives similar "quality" at the same bitrate as 2 pass. They use similar algorithms. That's what the developers claim. (It appears to be true, or very close). Use 2 pass if you want to. That's what I do for these comparisons. 2pass is always a valid method of rate control.
    You enjoy demonstrating that the concept of rational, logical thought is alien to you.

    Here are beliefs you have expressed repeatedly, including in this thread:

    1) CRF gives similar quality at the same bit rate as 2 pass.

    2) A computer can't be trusted to determine quality, only your eyes can be trusted.

    3) With CRF you choose a quality and the encoder chooses a bit rate needed to achieve said quality.

    Given the above, the questions I have, which neither you, nor anyone else has ever been able to answer are:

    A) How does CRF go about determining quality?

    B) Why is that method of determining quality, which is a computer-based method, valid, while other computer-based methods are invalid?

    C) Why can't we use the same method CRF uses to determine quality as an objective metric for determining quality?

    CRF is a scam perpetrated on the gullible such as yourself and in an amazing turn of events has become gospel to people such as yourself, who will defend it to the end, despite the fact that it has been conclusively proven to be a single pass ABR mode with QP shoehorned in, nothing else.
  3. Member
    Join Date
    Mar 2008
    Location
    United States
    According to the MainConcept docs, compared to CQ, CRF takes motion into account and raises the Q's slightly for those scenes. CQ does not, so CQ mode works out slightly bigger. Why do you say it's a scam?
  4. Member Cornucopia's Avatar
    Join Date
    Oct 2001
    Location
    Deep in the Heart of Texas
    The answer to sophisticles' question is simple and obvious: because there is not, nor has there ever been, nor will there ever be, a single gauge of quality. There are many determiners of quality, some that are primarily statistical, some that give more or less weight to various psycho-visual priorities.
    Those that are primarily statistical are great for giving definitive numbers, but they have less connection to the HVS (human visual system).
    CRF is a numerical error adjustment, but it is weighted AFTER the HVS-based transforms have occurred, so in a sense, it is HVS-based. And thus, more closely matches MOST peoples' estimations of quality levels.

    Scott
  5. Video Restorer lordsmurf's Avatar
    Join Date
    Jun 2003
    Location
    dFAQ.us/lordsmurf
    Originally Posted by poisondeathray View Post
    What Blu-ray uses CBR encoding ?
    BD spec MPEG2 for SD is 15mbps, usually CBR.

    Originally Posted by poisondeathray
    What retail DVD uses CBR encoding ?
    Really crappy ones from about 20 years ago, especially the dollar-bin discs.

    I also did a lot of CBR back when MPEG-2 was resource heavy to encode, and the SL (DVD-R) disc was less than an hour in length. You had plenty of headroom for 8-9mbps.

    What internet delivery stream uses CBR encoding ?
    Many actually exist, especially in years past. I was forced to use CBR and CVBR on some stream encoding, and didn't like it. But that was the studio spec for some of the streams. There were reasons, though not overly good ones. I also didn't like the choice of resolution, bitrate, or format, for many streams. Much of my work from a decade ago was not future-proofed very well because of this.

    CBR is less efficient than VBR. True or False ?
    In terms of quality, false. CBR is just fixed. You can specify a high level, and VBR can't beat it. VBR just allocates bits better, but at some point high bitrate is just high bitrate.

    Originally Posted by sophisticles View Post
    Yes, CBR was used extensively back in the MPEG-2 days and I think many of the issues people have with quality are because they moved away from CBR encoding. Think about what VBR does: in theory, it uses less bit rate in areas that in theory "need" less bit rate so that it can be used in areas that "need" more bit rate. But this is idiotic, because encoders use higher quantizers for P frames than I frames, and higher still quantizers for B frames than P frames, with the theory that somehow the I frames become higher quality references for the other frames. When this didn't work, they started doing 2 and 3 pass VBR, as a way of evening out the bit rate fluctuations and quality. They could have just stuck with CBR and avoided all the problems.

    But that wasn't enough, some snot nosed little punk, that didn't even have his comp sci degree yet decided he was going to crap all over the video scene and took over the x264 project and introduced that absurd CRF mode, which is nothing more than a Frankenstein marriage of single pass ABR with some QP pixie dust sprinkled on top and the lemmings went gaga over it.

    Best part is that those that swear by CRF are the same people that swear that a computer can't be trusted to determine quality, "only your eyes can be trusted", but when someone's eyes disagree with theirs then there's something wrong with that person's setup or eyes or brain or whatever else. At the same time, the same people that don't trust PSNR, SSIM, VMAF, or anything else have no problem putting their faith in CRF mode, in which by definition you choose a quality and let the encoder decide. But no one can tell me how that quality is measured nor how the encoder goes about determining if it's achieving that quality.

    Talk about a bunch of BS.
    I actually agree with some of this.

    Some encoders trip all over the GOP structure, and the encodes look like they're "breathing". You can literally see the image degrade frame-by-frame until the next I-frame. Those are generally the crappiest encoders, mostly freeware, or Chinaware based on freeware.

    CRF mode was fairly lousy until more recent years. I do find myself using it now, with the latest x264, using Hybrid.

    As a human, I have eyes. I don't calculate visual data as numbers. So I've never been overly fond of arguments that use PSNR/whatever, whether or not it agrees with what I'm seeing. Computers are stupid. Humans program computers. The boxes only know what we instruct them to know. If I see X, and the computers insists it's Y, then the computer is a moron. Which is common. I get continuously tired of people who say "the computer says Y, so it must be Y" (be it store clerks, or whatever, not just video). For example, I remember the register/computer at a store would not add numbers properly last year. It took about 20 minutes of arguing with THREE employees before the error was confirmed. The manager finally got her phone (calculator app) to confirm what I did in my head in a few seconds.

    Some of the x264 developers, back in the 2000s, were indeed snot-nosed little punks/dicks. That attitude helped them in the underground/torrent community, but did nothing for them in professional circles. But it's been relatively mature for years now.

    Originally Posted by Cornucopia View Post
    The answer to sophisticles' question is simple and obvious: because there is not, nor has there ever been, nor will there ever be, a single gauge of quality. There are many determiners of quality, some that are primarily statistical, some that give more or less weight to various psycho-visual priorities.
    Those that are primarily statistical are great for giving definitive numbers, but they have less connection to the HVS (human visual system).
    CRF is a numerical error adjustment, but it is weighted AFTER the HVS-based transforms have occurred, so in a sense, it is HVS-based. And thus, more closely matches MOST peoples' estimations of quality levels.

    Scott
    Great post.
    Last edited by lordsmurf; 4th May 2020 at 21:59.
    Want my help? Ask here! (not via PM!)
    FAQs: Best Blank DiscsBest TBCsBest VCRs for captureRestore VHS
  6. Originally Posted by sophisticles View Post
    Here are beliefs you have expressed repeatedly, including in this thread:

    1) CRF gives similar quality at the same bit rate as 2 pass.
    Yes

    2) A computer can't be trusted to determine quality, only your eyes can be trusted.
    No. You take into account multiple methods, including a computer metric. But you should place less weight in some metrics, because of lower correlation to human perception

    3) With CRF you choose a quality and the encoder chooses a bit rate needed to achieve said quality.

    No , CRF is a rate control method only. It's not a measure of quality.



    A) How does CRF go about determining quality?
    It does not. It's only a rate control method. The "quality" you end up with is approximately the same quality as 2pass at the same bitrate (as measured by computer, by eyes, by x,y,z method), because it uses the same underlying algorithm as 2pass.

    B) Why is that method of determining quality, which is a computer-based method, valid, while other computer-based methods are invalid?
    It's not a measure of quality. It's only a rate control method.

    In contrast, SSIM, PSNR, VMAF, etc... are measures of quality, each with pros/cons. I never said they were not valid, only that they have limitations and have to be interpreted in context. I still use them.

    C) Why can't we use the same method CRF uses to determine quality as an objective metric for determining quality?
    See above

    CRF is a scam perpetrated on the gullible such as yourself and in an amazing turn of events has become gospel to people such as yourself, who will defend it to the end, despite the fact that it has been conclusively proven to be a single pass ABR mode with QP shoehorned in, nothing else.
    Only because you don't understand what it is
  7. Originally Posted by sophisticles View Post
    1) CRF gives similar quality at the same bit rate as 2 pass.

    2) A computer can't be trusted to determine quality, only your eyes can be trusted.

    3) With CRF you choose a quality and the encoder chooses a bit rate needed to achieve said quality.

    Given the above, the questions I have, which neither you, nor anyone else has ever been able to answer are:

    A) How does CRF go about determining quality?

    B) Why is that method of determining quality, which is a computer-based method, valid, while other computer-based methods are invalid?

    C) Why can't we use the same method CRF uses to determine quality as an objective metric for determining quality?
    We've been through this before. My old summary of x264's rate control methods:
    https://forum.videohelp.com/threads/381668-would-it-make-more-sense-to-use-1-pass-enco...65#post2470457

    The link for the full version in the post no longer works, but you can find it here:
    https://code.videolan.org/videolan/x264/-/blob/master/doc/ratecontrol.txt

    Originally Posted by sophisticles View Post
    CRF is a scam perpetrated on the gullible such as yourself and in an amazing turn of events has become gospel to people such as yourself, who will defend it to the end, despite the fact that it has been conclusively proven to be a single pass ABR mode with QP shoehorned in, nothing else.
    I'm not an x264 expert, but it sounds like you made that up. How can it be single pass ABR when there's no way to specify a bitrate?
    My understanding is it's more like a QP encode, where you pick the QP to determine the quality, and the bitrate will be what it'll be, except CRF aims to give you the same visual quality as an equivalent QP encode, only at a lower bitrate by increasing the Q where there's lots of motion, and/or employing whatever other cleverness it's capable of.
    As I said, I'm far from an expert, but isn't that the sort of thing 2 pass encoding was used for in the past, aside from controlling the file size? How is it a bad thing if CRF can do much the same in respect to distributing quality and spending the bits wisely, but without the first pass? Or are you saying x264's 2 pass encoding shouldn't try to spend the bits wisely either?
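    For what it's worth, the idea in ratecontrol.txt can be sketched in a few lines. This is a toy model, not x264's actual code: it only shows how a frame's quantizer scale grows with scene complexity (qcomp defaults to 0.6), which is the "increase the Q where there's lots of motion" behaviour. 2 pass solves for the scaling constant so the total size hits the target, while CRF fixes the constant up front and lets the bitrate fall out.

```python
QCOMP = 0.6  # x264's default qcomp

def qscale(complexity: float, rate_factor: float) -> float:
    """Toy quantizer scale: complexity^(1-qcomp) divided by a constant.

    In 2 pass the constant is solved so the total size hits the target;
    in CRF it is fixed up front, so the bitrate falls out instead.
    (Ignores I/P/B offsets, VBV, and everything else.)
    """
    return complexity ** (1 - QCOMP) / rate_factor

# A high-motion (complex) frame gets a coarser quantizer than a static one,
# which is the "raise the Q where there's lots of motion" behaviour:
assert qscale(2000.0, 1.0) > qscale(200.0, 1.0)
```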

    You must have forgotten the pesky evidence I posted in one of your threads, where you'd tried to show the Nvidia encoder compares favourably to x264. I uploaded pics showing how similarly x264 distributed the bits between each gop for CRF and 2 pass, and how x264's ABR distributed them differently. If I remember correctly, ABR didn't even distribute the quality well within each frame for that test.
    https://forum.videohelp.com/threads/371187-Testing-NVENC-with-the-GTX-960#post2384837
    Maybe you didn't see anything I posted in that thread though, given shortly after starting it you inexplicably vanished.
    Last edited by hello_hello; 5th May 2020 at 03:34.
  8. Originally Posted by lordsmurf View Post
    Originally Posted by poisondeathray View Post
    What Blu-ray uses CBR encoding ?
    BD spec MPEG2 for SD is 15mbps, usually CBR.

    Originally Posted by poisondeathray
    What retail DVD uses CBR encoding ?
    Really crappy ones from about 20 years ago, especially the dollar-bin discs.

    I also did a lot of CBR back when MPEG-2 was resource heavy to encode, and the SL (DVD-R) disc was less than an hour in length. You had plenty of headroom for 8-9mbps.

    What internet delivery stream uses CBR encoding ?
    Many actually exist, especially in years past. I was forced to use CBR and CVBR on some stream encoding, and didn't like it. But that was the studio spec for some of the streams. There were reasons, though not overly good ones. I also didn't like the choice of resolution, bitrate, or format, for many streams. Much of my work from a decade ago was not future-proofed very well because of this.
    ie. Some limited exceptions. Not really used today for end compression formats, or a very small %

    CBR is less efficient than VBR. True or False ?
    In terms of quality, false. CBR is just fixed. You can specify a high level, and VBR can't beat it. VBR just allocates bits better, but at some point high bitrate is just high bitrate.
    Of course VBR can beat it. VBR can allocate higher peaks that CBR does not allow for .


    And we're talking about the lossy compression context. If you had unlimited bitrates, or were talking about very high bitrates, you might as well use lossless compression.

    And in the very high bitrate, or lossless, compression context, VBR is still more efficient.



    In typical bitrate ranges for internet delivery, BD, personal usage, typical DVD length (not short run time) , CBR is always going to produce worse results


    Let's say his BMPCC test is around 20 Mb/s. Do you think 20 Mb/s CBR will produce better quality than 20 Mb/s 2pass VBR?
  9. Video Restorer lordsmurf's Avatar
    Join Date
    Jun 2003
    Location
    dFAQ.us/lordsmurf
    Originally Posted by poisondeathray View Post
    ie. Some limited exceptions. Not really used today for end compression formats, or a very small %
    Well, in terms of "today", it is content that still exists, either physically or online. None are defunct formats yet.
    But yes, exceptions to the rules.

    Of course VBR can beat it. VBR can allocate higher peaks that CBR does not allow for .
    And in the very high, or lossless compression context, VBR is still is more efficient
    How? Bitrate is bitrate. If CBR saturates up to peaks, then it peaks. There isn't a peak above peak (unless the encoder is terrible, and just does whatever it wants).

    And we're talking about the lossy compression context.
    Correct, lossy.

    In typical bitrate ranges for internet delivery, BD, personal usage, typical DVD length (not short run time) , CBR is always going to produce worse results
    Again, it's all about bitrate. VBR is to conserve space (thus bandwidth). It has nothing to do with quality. Max 10mbps VBR and 10mbps CBR is visually the same at peak bitrate scenes -- or at least should be, if the encoder is obeying the settings properly. At those exact settings (example: VBR min 1, avg 5, max 10), CBR would be better at lower-bitrate scenes. CBR would however waste bits and space/bandwidth, at least 200% or more. If you gave VBR min 8, avg 9, max 10, then less so. But it'd never exceed CBR quality. To do that, the bitrate would need to exceed 10, and then it's apples to oranges (though it was already).
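    To put rough numbers on the "wasted bits" point, here is a back-of-envelope sketch using the min 1 / avg 5 / max 10 example above:

```python
def size_megabits(avg_mbps: float, seconds: float) -> float:
    """File size in megabits: average bitrate times duration."""
    return avg_mbps * seconds

# 10 minutes of video, using the example settings above:
cbr = size_megabits(10, 600)  # CBR pinned at the 10 Mbps ceiling
vbr = size_megabits(5, 600)   # VBR min 1 / avg 5 / max 10 averages 5 Mbps
assert cbr == 2 * vbr         # CBR spends 200% of the VBR size
```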

    Lets say his BMPCC test around 20Mb/s . Do you think 20Mb/s CBR will produce better quality than 20Mb/s 2pass VBR ?
    20mbps avg or max bitrate?
    - avg 10, VBR better (unless the avg=max, which is user error, and the encoder may get confused), otherwise max obviously higher than 10
    - max 10, no real difference in CBR and VBR

    This is where math enters video.

    I get the feeling we're arguing the same thing, but I'm doing it with more words for exacting clarification.

    So I'm posting mostly for others here, I'm fairly certain you know all this.
    Last edited by lordsmurf; 5th May 2020 at 07:38.
  10. Originally Posted by lordsmurf View Post

    Of course VBR can beat it. VBR can allocate higher peaks that CBR does not allow for .
    And in the very high, or lossless compression context, VBR is still is more efficient
    How? Bitrate is bitrate. If CBR saturates up to peaks, then it peaks. There isn't a peak above peak (unless the encoder is terrible, and just does whatever it wants).
    A buffered peak is a peak above a peak. CBR will never go above some set constant bitrate. VBR can if the buffer is not empty.

    In typical bitrate ranges for internet delivery, BD, personal usage, typical DVD length (not short run time) , CBR is always going to produce worse results
    Again, it's all about bitrate. VBR is to conserve space (thus bandwidth). It has nothing to do with quality. Max 10mbps VBR and 10mbps CBR is visually the same at peak bitrate scenes -- or at least should be, if the encoder is obeying the settings properly. At those exact settings (example: VBR min 1, avg 5, max 10), CBR would be better at lower-bitrate scenes. CBR would however waste bits and space/bandwidth, at least 200% or more. If you gave VBR min 8, avg 9, max 10, then less so. But it'd never exceed CBR quality. To do that, the bitrate would need to exceed 10, and then it's apples to oranges (though it was already).
    We're talking about slightly different things

    You're referring to a constrained scenario , such as putting limitations on max bitrate. And/or a VBV model where you have a buffer

    So for 99.999% of retail BDs and DVDs (not short run time), VBR will yield higher quality, because the average DVD might be 5-7 Mbps and the average BD 20-30 Mbps. CBR will waste too many bits.

    And for BD and DVD, which use a VBV model, you can still have a peak above a peak. "Max" bitrate does not mean maximum instantaneous bitrate; it's the max rate at which bits can enter the buffer. You can still have a buffered VBR peak that goes above it, as long as the buffer is not empty.
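    A minimal leaky-bucket sketch of that buffer behaviour (toy numbers, not any particular spec's values): the buffer refills at "max" rate, each frame drains its coded size, and a single frame can be several times larger than maxrate/fps as long as the buffer never runs dry.

```python
MAXRATE = 10_000_000   # bits per second entering the buffer
BUFSIZE = 10_000_000   # buffer capacity in bits
FPS = 25

def simulate(frame_sizes):
    """Return buffer fullness after each frame, or None on underflow."""
    fullness = BUFSIZE  # start with a full buffer
    trace = []
    for bits in frame_sizes:
        fullness = min(fullness + MAXRATE / FPS, BUFSIZE) - bits
        if fullness < 0:
            return None  # underflow: this stream violates the VBV
        trace.append(fullness)
    return trace

per_frame_budget = MAXRATE / FPS                   # 400k bits at "max" rate
frames = [100_000] * 20 + [3_000_000] + [100_000] * 20
trace = simulate(frames)
assert trace is not None              # buffer never underflows...
assert max(frames) > per_frame_budget # ...despite a frame 7.5x over "max"
```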

    We're talking about an unconstrained scenario, such as his BMPCC test. Yes, there are peaks above the average bitrate. CBR will always yield lower quality in those scenarios.

    When you put a constraint on something, in general, quality is going to be lower.


    Lets say his BMPCC test around 20Mb/s . Do you think 20Mb/s CBR will produce better quality than 20Mb/s 2pass VBR ?
    20mbps avg or max bitrate?
    - avg 10, VBR better (unless the avg=max, which is user error, and the encoder may get confused), otherwise max obviously higher than 10
    - max 10, no real difference in CBR and VBR
    He used CRF 23, tune film, medium, not constrained, and got ~19.3 Mb/s average. So there are going to be peaks. There aren't any big action scenes, so you wouldn't expect something 3-4x higher. But he has some sections that go up to ~37 Mb/s. There are going to be significant differences.

    We're testing compression efficiency . He's using unconstrained settings. Does it make any sense to use CBR ?
  11. So, in the interest of full disclosure, I just had a couple of giant home made margaritas and I am lit up nicely. In trying to compile ffmpeg with qsv support I ended up just giving up and installing Manjaro, but despite it looking like I got it to work, it doesn't and in fact I managed to break AviDemux and Shotcut.

    I did get Handbrake with QSV working, and so here are some new tests. These are of x264 + medium + tune film + profile high + CRF 24 against Intel AVC + ICQ 25 + profile high; they ended up with bit rates very close to one another. The difference is that on this system (i3 7100 + 16 GB DDR4 2666 + NVMe source and target drives), the x264 encode took 7:59 to complete while the Intel took 1:26. Note this is with Handbrake being fed the original source from BM and cropping within the application prior to encoding. I will also be uploading some 10-bit encodes soon. To me, and maybe it's all the booze, but I can't tell the difference between these 2 encodes.
    Image Attached Files
    Here are 2 more Intel encodes, this time of HEVC. I need to look into this, but through Handbrake, Intel only supports ICQ (roughly analogous to CRF) for AVC; for HEVC only QP is supported, which leads to vastly different bit rates being used at the same numerical value. The 10-bit encodes will have to wait while I investigate what is going on.

    I also did a very low bit rate encode for Intel HEVC, just to see what it looks like.

    I think we can all agree that when encoding speed is taken into account, and the fact that Intel's encoders, problems on Linux and all, easily match x264's quality, Intel is the clear winner in this test.
    Image Attached Files
  13. Video Restorer lordsmurf's Avatar
    Join Date
    Jun 2003
    Location
    dFAQ.us/lordsmurf
    Originally Posted by poisondeathray View Post
    Does it make any sense to use CBR ?
    These days, not really.
    The only reason I can see is if some old hardware appliance required CBR, and I do know that some existed.

    Originally Posted by poisondeathray View Post
    We're talking about slightly different things
    Now that that's sorted, back to reading/lurking mode.
  14. Dinosaur Supervisor KarMa's Avatar
    Join Date
    Jul 2015
    Location
    US
    Originally Posted by sophisticles View Post
    To me, and maybe it's all the booze, but I can't tell the difference between these 2 encodes.





    With the last one, the girl's dark neck area just smears like crazy on the Intel QS AVC. There are other scenes with this smearing too with QS but it's hard to show with single frames.

    Code:
    LoadPlugin(".............LSMASHSource.dll")
    A=LWLibavVideoSource(".............BMPCC_Food_Blogger x264 CRF 24.mkv").crop(960,260,-960,-260).crop(962,0,0,-542).AddBorders(2,0,0,2).subtitle(align=9, "x264", text_color=$7CFC00)
    B=LWLibavVideoSource("....................BMPCC_Food_Blogger Intel ICQ 25.mkv").crop(960,260,-960,-260).crop(0,0,-960,-542).AddBorders(0,0,0,2).subtitle(align=7, "Intel AVC")
    
    AA=LWLibavVideoSource("...............BMPCC_Food_Blogger x264 CRF 24.mkv").crop(960,260,-960,-260).crop(0,540,-960,0).AddBorders(0,0,2,0).subtitle(align=1, "x264", text_color=$7CFC00)
    BB=LWLibavVideoSource("....................BMPCC_Food_Blogger Intel ICQ 25.mkv").crop(960,260,-960,-260).crop(962,540,0,0).subtitle(align=3, "Intel AVC")
    
    
    
    N=StackHorizontal(B,A)
    S=StackHorizontal(AA,BB)
    
    
    StackVertical(N,S)
    Last edited by KarMa; 6th May 2020 at 00:15.
  15. Originally Posted by sophisticles View Post
    So, in the interest of full disclosure, I just had a couple of giant home made margaritas and I am lit up nicely. In trying to compile ffmpeg with qsv support I ended up just giving up and installing Manjaro, but despite it looking like I got it to work, it doesn't and in fact I managed to break AviDemux and Shotcut.

    I did get Handbrake with QSV working, and so here are some new tests. These are of x264 + medium + tune film + profile high + CRF 24 against Intel AVC + ICQ 25 + profile high; they ended up with bit rates very close to one another. The difference is that on this system (i3 7100 + 16 GB DDR4 2666 + NVMe source and target drives), the x264 encode took 7:59 to complete while the Intel took 1:26. Note this is with Handbrake being fed the original source from BM and cropping within the application prior to encoding. I will also be uploading some 10-bit encodes soon. To me, and maybe it's all the booze, but I can't tell the difference between these 2 encodes.

    Slightly better, but there are still some significant quality differences. Intel has excessive blurring, and moderate degradation on some frames.

    Do I need to post screenshots or crops or highlights to point them out? Or is it a waste of time? Some of them are quite obvious, like blocky artifacts around the title screen. But it's more of the same: blurred textures like hair and clothing, missing objects like buttons (what's with some encoders and dropping buttons?)

    More Intel observations:

    1) Intel has issues with motion. Whenever there is large motion (object or camera), there is additional blurring compared to the baseline. It's probably inaccurate motion vectors, and you can see that in some frames where the objects and textures are skewed/tilted or warped in a different direction compared to the original.

    2) Intel has the b-frame deterioration issue. I think this is the larger issue, and perhaps it's "fixable". I think the bitrate weighting for P:B is not balanced or set ideally for Intel. But the Intel b-frames are quite a bit lower in quality than its p-frames. Maybe there are some settings that you can tweak? It looks like non-adaptive b-frames, it's fixed IPBBPBBPBB. Adaptive can help some encoders. And Intel's GOP length is 24 - is that "ideal" for Intel? I don't know. Maybe a larger GOP size would help? OTOH, maybe more b-frames in a GOP might make it worse.

    Since (2) occurs frequently (it's fixed placement), you get lots of low quality frames. But whenever 1+2 occur together, that Intel encode produces moderate degradation - ie. not just blurred details, but blocky compression artifacts as well.

    But the speed difference is nothing to sneeze at. If CRF 24 tune film was your "goal", what % bitrate would Intel need to get similar quality? 120%? 140%? Run some more tests. Using more bitrate might be an acceptable trade-off for a 3-4x faster encode - depends on the scenario and goals. OTOH, maybe you want higher compression efficiency from x264: use even slower settings, more b-frames, etc... if you can accept the slower encode times. Maybe Intel needs 160 or 180% filesize in that scenario? Do the tests and find out. But I suspect there are better Intel settings you should be using. If Handbrake can't do it, look at QSVEnc.
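    A back-of-envelope for that trade-off, using the timings and ~19.3 Mb/s average reported earlier in the thread. The 140% figure is purely assumed for illustration, not measured:

```python
# Numbers reported earlier in the thread:
x264_rate = 19.3                 # Mb/s average for the CRF 24 encode
x264_time, intel_time = 479, 86  # 7:59 vs 1:26, in seconds

# Purely assumed for illustration: suppose Intel needed 140% of the bitrate.
intel_rate = x264_rate * 1.40

extra_mbps = intel_rate - x264_rate
speedup = x264_time / intel_time
assert round(extra_mbps, 2) == 7.72  # ~7.7 Mb/s of extra bitrate spent
assert 5.5 < speedup < 5.6           # for a roughly 5.6x faster encode
```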



    Some screenshots of source/intel/x264

    View them at 1:1 or 100%, e.g. in a browser, and pan around if viewing on a smaller display. Or do I need to crop to areas and highlight some of the differences? They should be pretty obvious (but then again, you couldn't see the difference between a youtube encode and one with 2-3x higher bitrate and very slow encoding).

    0139 - moderate picture degradation and block artifacts start appearing, blocky hair artifacts on both male and female, jacket textures eroded. x264 is a b-frame too (that whole title section is pretty bad for Intel)

    0463 - girl's shirt lines blurred, jean jacket textures are blurred and tilted at the wrong angle (bad motion vectors?), guy's jacket folds are smoothed away, missing button (what is it with some encoders like MC and Intel dropping buttons!? are they not important or something?)

    2051 - hair strands are blurred away in the Intel encode. They look like they moved at a different angle than in the original (the motion vector problem).
    Image Attached Files
  16. Originally Posted by sophisticles View Post
    ….. To me, and maybe it's all the booze, but I can't tell the difference between these 2 encodes.
    How do you watch these? From what distance?
    Again, x264 is the clear winner IMO, with the exception of encoding speed of course.
    The ICQ is blocky, loses detail, softens the picture, has more banding, and produces more artefacts.

    If you are a bit familiar with Avisynth you could compare the 2 encodes using this simple script:

    Code:
    v1=DGSource("BMPCC_Food_Blogger x264 CRF 24.dgi")  #or a source filter of your choice
    v2=DGSource("BMPCC_Food_Blogger Intel ICQ 25.dgi") #or a source filter of your choice
    v1=v1.subtitle("x264")
    v2=v2.subtitle("Intel ICQ")
    Interleave(v1,v2)
    Save this as a text file with extension .avs (e.g. compare.avs). Then open this compare.avs in a player like MPC-HC and step through the pictures.

    If you still don't see any difference in your playback/viewing scenario, you may just go ahead with fast HW encodes, because the extra encoding time of x264 won't return any benefit to you, apparently. Or use x264 with some ultra fast settings.
    Last edited by Sharc; 6th May 2020 at 02:11. Reason: script added
    I have an older Intel Haswell, and its quality is supposed to be worse than the newer QS generations; some more options were introduced in later generations.

    But there are "quality" presets like - best, higher, high, balanced(default), fast, faster, fastest

    At that speed, why not use "best" if that wasn't already selected ?

    You can enter custom arguments with handbrake. It looks like default quality is "2" or "higher"
    https://handbrake.fr/docs/en/latest/technical/video-qsv.html

    Did you use LA-ICQ ? Look ahead ? Or just regular ICQ ? I suspect LA-ICQ will give better results, but it will be slower the larger the lookahead

    This is what Intel says
    https://software.intel.com/en-us/articles/common-bitrate-control-methods-in-intel-media-sdk
    Recommendations:

    Lookahead (LA) is often the best quality starting point when encoding to a long term target bitrate and individual frame latency is not the first order concern -- for example, file transcodes.


    You're using 2 ref frames; you can use more. It can help with compression at the expense of speed. You might as well, since x264 is using 3 by default.

    b-pyramid can help some encoders, but in some it makes it worse. It's usually beneficial for x264 at mid to lower bitrate ranges, since the b-frames are fairly good. Not sure about QS, but I suspect it will make it worse, because the QS b-frames are so low in quality.

    I seriously doubt adjusting some of the QS options will make it slower than x264. The way I see it is because it's so much faster, you should be using slower settings


