VideoHelp Forum
Results 1 to 13 of 13
  1. Member
    Join Date: Apr 2014
    Location: Pawtucket, RI
    Many applications talk about two-pass encoding as the way to get the best quality at a target size. I want to find out more about the process. What is the typical way of doing this?

    The AMD VCE H.264 hardware-based encoder does not provide native support for this, and offers just a one-pass option. How hard would it be for someone to write their own front end that uses what AMD provides in two passes?

    I can only guess that you'd have to have some way of reading the file and storing information about where compression could be applied most effectively, then somehow breaking the file into parts on a second pass and varying the bit rate to reflect this information.
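For what it's worth, that guess matches how two-pass rate control is usually described: the first pass records a complexity measure for each frame or segment, and the second pass spends the total bit budget in proportion to it. A minimal sketch of the allocation idea only (numbers are hypothetical, and this is not AMD's actual implementation):

```python
# Sketch of second-pass bit allocation: each segment gets a share of the
# total bit budget proportional to the complexity measured on pass 1.
# (Illustrative only; real encoders work per-frame with many more constraints.)

def allocate_bits(complexities, total_bits):
    """Split total_bits across segments in proportion to their complexity."""
    total_complexity = sum(complexities)
    return [total_bits * c / total_complexity for c in complexities]

# Example: three segments, the middle one twice as complex as the others.
shares = allocate_bits([1.0, 2.0, 1.0], total_bits=8_000_000)
```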

    Help me stop guessing and tell me what AMD would have done differently to make this as effortless / effective as possible.
  2. The following is the x264GUI in the video editor AviUtl, which implements encoding to a target file size and bitrate limit.
    Attached image: VH-20140429-x264FileSizeConstrain.png

    Other encoder GUIs probably do this as well.
    However, I haven't noticed any two-pass option for the QSV or VCE encoders.
  3. If the encoder doesn't support 2-pass (or multipass; more than two passes are possible too) encoding, you can't use it. You need to use an encoder and front end that support 2-pass encoding. I don't think any of the GPU encoders supports 2-pass encoding. If you have a quad-core or better CPU, you'll find that x264 at the veryfast preset delivers better quality and is faster than any of the free GPU encoders.
  4. Member
    Join Date: Oct 2004
    Location: Freedonia
    Your problem is that you are trying to have your cake and eat it too, and you don't realize it.

    You can encode to H.264 fast OR you can encode with the best quality. Choose ONE. Which do you want?

    Nobody seriously thinks that the hardware based encoders are the best for quality. They are the best for speed. That's probably why they only offer one pass - nobody who is really serious about quality would use those encoders. Hardware encoders are intended for the younger generation that thinks if anything takes longer than 1 minute then pieces of their very soul have been taken from them.
  5. Originally Posted by jman98 View Post
    You can encode to H.264 fast OR you can encode with the best quality. Choose ONE.
    No, you can choose any two of these three:

    1) fast
    2) high quality
    3) small file
  6. Member
    Join Date: Apr 2014
    Location: Pawtucket, RI
    jagabo: I'm looking for the best quality at a given file size. I'm trying (and succeeding) to make my own superbit backup copies from high-def sources. I'm writing UDF 2.50 discs on 8.5 GB media, and the quality doesn't suck at all.
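For anyone following along, the arithmetic behind "best quality at a given file size" is straightforward. The running time, audio bitrate, and overhead below are assumptions for illustration, not figures from this thread:

```python
# Rough target-bitrate calculation for a fixed disc size.
# All inputs below are assumed example values, not from the thread.
disc_bytes = 8.5e9        # "8.5 GB" dual-layer disc, taken as 8.5 * 10^9 bytes
movie_seconds = 2 * 3600  # assume a 2-hour movie
audio_kbps = 640          # assume one AC-3 audio track
overhead = 0.02           # assume ~2% container/muxing overhead

total_kbps = disc_bytes * 8 / 1000 / movie_seconds
video_kbps = total_kbps * (1 - overhead) - audio_kbps
# Works out to roughly 9.4 Mbps total, with ~8.6 Mbps left for video.
```

A two-pass encode aimed at that video bitrate should land very close to filling the disc.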

    I don't see why it should be impossible for hardware manufacturers to write tools that make it easier for other products, like those from VideoReDo, VSO Media Converter, and others, to get good results WITH dedicated hardware. Otherwise, what's the point of that hardware?
  7. Banned
    Join Date: Nov 2005
    Location: United States
    I see the BS is still being flung far and wide. It's true VCE does not offer 2-pass encoding, but there are CUDA encoder implementations that do, and the thing that is most annoying is I mentioned this in the 2 other GPU encoding related threads, yet you guys insist on lying to people.

    Furthermore, I have shown repeatedly that encoding quality is primarily influenced by the quality of the source material and the bit rate used; with a high enough quality source and enough bit rate, all encoders, software or hardware based, can give excellent results.

    Stop lying to people; by now you guys know better. It's stupid beyond belief to say things like "x264+veryfast is both faster and better quality than GPU encoding", that's pure grade A BULLoney and you know it!
  8. Banned
    Join Date: Nov 2005
    Location: United States
    Originally Posted by lasitter View Post
    I don't see why it should be impossible for hardware manufacturers to write tools that make it easier for other products, like those from VideoReDo, VSO Media Converter, and others, to get good results WITH dedicated hardware. Otherwise, what's the point of that hardware?
    It's not impossible. Unfortunately, the hardware makers (Nvidia, AMD, Intel) make the hardware and then make an SDK available to developers to allow them to leverage it.

    The thing is, for general desktop use, those who write the software seem loath to embrace the hardware capabilities, for a number of reasons:

    1) they don't want to limit the number of potential users of their software to just the users with supporting hardware.

    2) they don't want to learn a new programming technique.

    3) there's little to no financial incentive.

    4) they're just plain stupid.
  9. Member
    Join Date: Apr 2014
    Location: Pawtucket, RI
    Originally Posted by deadrats View Post

    3) there's little to no financial incentive.

    4) they're just plain stupid.
    Imagine for a moment that we're living in a fairy tale and that somehow there was an incentive to use what AMD / ATI has provided towards this end. What would a programmer / developer have to do just to come up with an "Alpha" version to try out?
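To hazard a guess at the shape of such an alpha: drive the one-pass hardware encoder twice yourself, once to probe and once for real. Everything below is hypothetical; encode_segment() stands in for whatever the vendor SDK actually exposes and is not a real AMD API:

```python
# Hypothetical two-pass front end wrapped around a one-pass encoder.
# encode_segment is injected so the vendor-specific part stays abstract.

def two_pass_frontend(segments, target_bits, encode_segment):
    # Pass 1: probe each segment (e.g. encode at constant quality) and
    # record how many bits it "wants".
    probe = [encode_segment(s, mode="probe") for s in segments]
    total = sum(probe)
    # Pass 2: re-encode each segment at a bitrate scaled by its share
    # of the measured complexity.
    return [encode_segment(s, bitrate=target_bits * p / total)
            for s, p in zip(segments, probe)]
```

The hard parts this glosses over are splitting the stream on clean boundaries (IDR frames) and stitching the re-encoded segments back together.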
  10. Member
    Join Date: Oct 2004
    Location: Freedonia
    Originally Posted by deadrats View Post
    Furthermore, I have shown repeatedly that encoding quality is primarily influenced by the quality of the source material and the bit rate used; with a high enough quality source and enough bit rate, all encoders, software or hardware based, can give excellent results.
    The only person who is lying here is you. If it please the court of public opinion, I offer the following as People's Exhibit A:
    http://forum.videohelp.com/threads/357361-proof-that-x264-HD-is-a-poor-benchmark

    Conclusive proof above that you have not "shown repeatedly" that "all encoders" "can give excellent results".

    Dude, your venom against x264, even if you do have a point, is so out of control that I have really begun to question your sanity and why you care so much about a free program. Nobody is lying except maybe you, in trying to misrepresent statements you've made in the past. If you don't agree with what's said, express a coherent counterargument rather than resorting to juvenile "Neener! Neener! Liar liar! Pants on fire!" type responses.
  11. Originally Posted by deadrats View Post
    I see the BS is still being flung far and wide. It's true VCE does not offer 2-pass encoding, but there are CUDA encoder implementations that do, and the thing that is most annoying is I mentioned this in the 2 other GPU encoding related threads, yet you guys insist on lying to people.

    Furthermore, I have shown repeatedly that encoding quality is primarily influenced by the quality of the source material and the bit rate used; with a high enough quality source and enough bit rate, all encoders, software or hardware based, can give excellent results.

    Stop lying to people; by now you guys know better. It's stupid beyond belief to say things like "x264+veryfast is both faster and better quality than GPU encoding", that's pure grade A BULLoney and you know it!
    Here's deadrats' recent "comprehensive" GPU encoding thread:
    http://forum.videohelp.com/threads/363900-deadrats-comprehensive-gpu-encoding-tests?p=2315950

    I provided an x264 encoding that looked better, had a slightly lower bitrate, and encoded 15x faster than his GPU encode.

    http://forum.videohelp.com/threads/363900-deadrats-comprehensive-gpu-encoding-tests?p=...=1#post2315927

    Of course, there was something wrong with his Mediacoder setup that made it much slower than it should have been. Others (including me) duplicated his Mediacoder settings and got much faster encodes, but they were still slower and lower quality than the x264 encode I made.

    Here's the other thread where deadrats claimed to have proved the superiority of CUDA:

    http://forum.videohelp.com/threads/363757-Affordable-transcode-options-with-OpenCL-GPU...=1#post2315284

    I duplicated his processing as best I could. I didn't bother uploading my result because there are too many unknowns in his processing (we don't know what resizing filter Sony used, why it pillarboxed the video, how long it took, etc.) and he had moved on to his "comprehensive" thread. Of his samples I only downloaded Sony AVC GPU.mp4. That looked better than the Mediacoder sample in the other thread, but x264 still looked better, especially during high-motion sequences.
    Last edited by jagabo; 29th Apr 2014 at 13:08.
  12. I wouldn't take anything deadrats says too seriously. Not taking it seriously at all would be the best option.
    He keeps thrashing the "hardware encoder" thing in threads but hasn't offered the slightest bit of evidence to support most of his claims; then, when others offer evidence to the contrary, he accuses them of lying. He seems to have an astounding ability to ignore reality.

    Originally Posted by deadrats View Post
    furthermore i have shown repeatedly that encoding quality is primarily influenced by the quality of the source material
    How funny is that? If the source is high quality, the encode will look better than if the source is low quality. Well, I never......

    Originally Posted by deadrats View Post
    and the bit rate used, with a high enough source and enough bit rate all encoders, software or hardware based, can give excellent results.
    And if you use a high enough bitrate you can make a poor quality encoder seem to match the output quality of a better encoder.

    Originally Posted by deadrats View Post
    stop lying to people, by now you guys know better, it's stupid beyond belief to say things like "x264+very fast is both faster and better quality than gpu encoding", that's pure grade A BULLoney and you know it!
    Attached image: hardware encoding.jpg
  13. Member Cornucopia's Avatar
    Join Date: Oct 2001
    Location: Deep in the Heart of Texas
    I see the difference between HW (or HW-assisted) encoders and SW encoders as usually one of using integer-type calculation shortcuts vs. not using them. Integer math has speed benefits and quality (accuracy) drawbacks, but both kinds can be anywhere from awful to amazing.
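A toy example of the integer-shortcut tradeoff Scott describes, in the fixed-point style codecs use (the coefficient and shift are illustrative, not taken from any real encoder):

```python
# Approximate multiplication by cos(pi/4) with a 13-bit fixed-point
# integer multiply plus a shift: fast on hardware without an FPU,
# but it rounds the result.
COS_PI_4 = 0.7071067811865476
FIX = round(COS_PI_4 * 2**13)   # the scaled integer coefficient

def float_mul(x):
    return x * COS_PI_4          # "accurate" floating-point version

def int_mul(x):
    return (x * FIX) >> 13       # integer shortcut; drops fractional bits

err = abs(float_mul(1000) - int_mul(1000))   # small but nonzero error
```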

    Scott
    "When will the rhetorical questions end?!" - George Carlin