VideoHelp Forum
  1. Member
    Join Date
    Dec 2013
    Location
    Danmark
    Hi. Maybe I'm totally wrong, but this is what I've come to think from what I've read and seen when testing myself, so just tell me if I'm doing something wrong. For GPU encoding I have tested Intel Quick Sync and Nvidia CUDA (I don't have an AMD GPU right now), and what matters most to me is quality.
    Especially in dark areas I don't think the results are that good with help from the GPU, and we're talking H.264. I hope you can read this; my writing isn't that good, even in my own language.
  2. best quality: CPU
    best speed: GPU (depending on CPU/GPU)
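    One way to see the tradeoff for yourself is to encode the same clip both ways at the same bitrate and compare the times and the results. A minimal Python sketch, assuming an ffmpeg build with libx264 and NVENC support (the clip name, bitrate and presets are placeholders; substitute h264_qsv for Quick Sync):
    Code:
    import subprocess, time

    SOURCE = "clip.mkv"  # placeholder test clip

    def encode(encoder, preset, out):
        """Encode SOURCE with the given encoder at the same target bitrate and time it."""
        cmd = [
            "ffmpeg", "-y", "-i", SOURCE,
            "-c:v", encoder, "-preset", preset,
            "-b:v", "5M",   # same target bitrate for both runs
            "-an",          # drop audio so only video encoding speed is measured
            out,
        ]
        start = time.time()
        subprocess.run(cmd, check=True)
        return time.time() - start

    cpu_secs = encode("libx264", "medium", "cpu_x264.mp4")    # software (CPU) encode
    gpu_secs = encode("h264_nvenc", "slow", "gpu_nvenc.mp4")  # NVENC hardware encode
    print(f"libx264: {cpu_secs:.1f}s   h264_nvenc: {gpu_secs:.1f}s")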
  3. Member
    Join Date
    Dec 2013
    Location
    Danmark
    Originally Posted by jagabo View Post
    best quality: CPU
    best speed: GPU (depending on CPU/GPU)
    Yes, but why is it that way? I use the CPU when I encode, but I was hoping for a more technical explanation of why it is like that.
  4. Originally Posted by tola5 View Post
    Yes, but why is it that way?
    All the free tools use the encoders provided by the hardware manufacturers (AMD, Nvidia, Intel). The manufacturers want to impress you with speed, and don't want to spend a lot of development time ($) on the encoders. So you get encoders that do the bare minimum to produce adequate quality (for the average viewer) but encode quickly.

    I've analyzed some Quick Sync encodings and found that it finds far fewer motion vectors than x264. I think the hardware encoders try to make up for that by blurring away noise and small, low contrast, details.
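    One way to check this on your own material is to measure how closely each encode matches the source. A rough sketch, assuming ffmpeg is available, using its psnr filter; the filenames are placeholders for a source and two encodes of it:
    Code:
    import subprocess

    SOURCE  = "clip.mkv"                          # the original source (placeholder)
    ENCODES = ["cpu_x264.mp4", "gpu_nvenc.mp4"]   # encodes to compare against it

    for enc in ENCODES:
        # The psnr filter compares the first input (the encode) against the second
        # (the source) and prints an average PSNR summary on stderr; higher means
        # the encode is closer to the original.
        cmd = ["ffmpeg", "-i", enc, "-i", SOURCE, "-lavfi", "psnr", "-f", "null", "-"]
        result = subprocess.run(cmd, capture_output=True, text=True)
        summary = [line for line in result.stderr.splitlines() if "PSNR" in line]
        print(enc, summary[-1] if summary else "no PSNR reported")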
  5. Member Krispy Kritter's Avatar
    Join Date
    Jul 2003
    Location
    St Louis, MO USA
    Development time and support.

    In general, a tool which only works with a specific gpu type or chipset, will have a smaller number of users. A tool which uses cpu, will work on most any computer and thus have a much larger number of users. It's more cost efficient to support an app with a large user base and one with a small user base. So the tool will have better support and fixes based on feedback.
    Google is your Friend
  6. Originally Posted by jagabo View Post
    I think the hardware encoders try to make up for that by blurring away noise and small, low contrast, details.
    Every encoder does this, even x264 and x265. Unless it's lossless encoding, at some point there will be some casualties in the detail department.
  7. Originally Posted by sophisticles View Post
    Originally Posted by jagabo View Post
    I think the hardware encoders try to make up for that by blurring away noise and small, low contrast, details.
    Every encoder does this, even x264 and x265. Unless it's lossless encoding, at some point there will be some casualties in the detail department.
    So you're claiming all encoders produce equal quality encodes for a given bitrate? I'm just trying to understand the point you're making.
  8. Originally Posted by sophisticles View Post
    Originally Posted by jagabo View Post
    I think the hardware encoders try to make up for that by blurring away noise and small, low contrast, details.
    Every encoder does this, even x264 and x265. Unless it's lossless encoding, at some point there will be some casualties in the detail department.
    The hardware encoders do it much more.
  9. Originally Posted by tola5 View Post
    Very simplistic explanation.
    A CPU is capable of dealing more efficiently with complex and at the same time less predictable code. A GPU is designed to deal efficiently with code that is complex but predictable (fewer conditions, fewer branches, etc.).
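    To make that concrete with a toy illustration (not real encoder code): the mode-decision side of an encoder is full of data-dependent branches, which suits a CPU, while a block SAD search runs the same arithmetic for every candidate position, which suits a GPU. A small Python sketch of the two shapes of work:
    Code:
    import numpy as np

    block = np.random.randint(0, 256, (16, 16))   # current 16x16 block
    ref   = np.random.randint(0, 256, (24, 24))   # search window in the reference frame

    def choose_mode(blk):
        # Branch-heavy, data-dependent decisions: the path taken changes from
        # block to block, which CPUs with good branch prediction handle well.
        if blk.std() < 5:
            return "skip"    # flat block, cheapest mode
        if abs(int(blk[:, :8].mean()) - int(blk[:, 8:].mean())) > 20:
            return "intra"   # strong edge, try an intra prediction mode
        return "inter"       # otherwise fall back to a motion search

    def sad_search(blk, window):
        # Uniform, predictable work: the same subtract/abs/sum runs for every
        # candidate position with no data-dependent branching, which maps well
        # onto the many simple parallel units in a GPU.
        best = None
        for dy in range(window.shape[0] - 16):
            for dx in range(window.shape[1] - 16):
                sad = int(np.abs(blk - window[dy:dy + 16, dx:dx + 16]).sum())
                best = sad if best is None else min(best, sad)
        return best

    print(choose_mode(block), sad_search(block, ref))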
  10. Member
    Join Date
    Dec 2013
    Location
    Danmark
    Thanks, I learned a lot.
  11. Originally Posted by hello_hello View Post
    So you're claiming all encoders produce equal quality encodes for a given bitrate? I'm just trying to understand the point you're making.
    Jagabo's statement could be misconstrued as him saying that hardware encoders sacrifice fine details in the pursuit of compression efficiency and that somehow this doesn't apply to software-based encoders. The truth is that all encoders, even x264 and x265, sacrifice small details in the pursuit of compression efficiency; it's the nature of the beast.

    Some encoders choose the path of eliminating fine details, some blur them, but in the end the small details are still gone.

    But with a sufficiently high bit rate, yes, it is well established that as bit rate goes up the quality difference between encoders disappears.

    As I have pointed out before, the most important factor in the quality of an encode is the quality of the source. If you have a very high quality source, shot with a high quality camera, lens, lighting, etc., framed and graded properly, and then mastered to a high quality intermediate codec, it will make very little difference which lossy encoder you use for the final delivery format.

    I have seen delivered content created with less than 6000 kb/s for 1080p, encoded with Apple's H.264 with no B-frames or CABAC, and the encode was stunningly clear and high quality.

    I'll take it one step further: more important than the encoder is the filtering chain used. There is a very well known "scene group" which takes commercial Blu-rays and encodes them at 1080p with less than 1500 kb/s, 2-pass x264 with one of the slower presets, and as much as I think that bit rate is too low, I have to admit that they get very impressive results. In fact, I tried to replicate what they did: I downloaded their release of a movie that I have on Blu-ray and transcoded my copy to the same resolution and bit rate as theirs, using both x264+placebo and x265+veryfast, and I couldn't get the quality they had achieved at such a low bit rate. The only conclusion is that they are using some custom filters and settings for their encodes, which means that the encoder used is far less important than the filtering chain used.
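    For anyone who wants to experiment with that idea, the usual first step is to put a cleanup filter (a denoiser, for example) ahead of the encoder so bits aren't wasted on noise. A rough Python sketch of a filtered 2-pass x264 encode driven through ffmpeg; the hqdn3d strength, bitrate, preset and filenames are illustrative assumptions, not any group's actual settings:
    Code:
    import subprocess

    SOURCE  = "source.mkv"        # placeholder source file
    FILTERS = "hqdn3d=2:1:3:3"    # mild denoise, example strength only
    BITRATE = "1500k"

    def ffmpeg_pass(pass_num, out=None):
        cmd = [
            "ffmpeg", "-y", "-i", SOURCE,
            "-vf", FILTERS,
            "-c:v", "libx264", "-preset", "slower", "-b:v", BITRATE,
            "-pass", str(pass_num), "-an",
        ]
        if pass_num == 1:
            cmd += ["-f", "null", "-"]   # first pass only writes the stats file
        else:
            cmd += [out]                 # second pass does the real encode
        subprocess.run(cmd, check=True)

    ffmpeg_pass(1)
    ffmpeg_pass(2, "filtered_2pass.mp4")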
  12. Banned
    Join Date
    Oct 2014
    Location
    Northern California
    It is actually very simple: to get good quality you must encode with a good codec and with a high bitrate, no exceptions!

    Using different implementations, using hardware or software, playing with encoding switches is all marginal tinkering.

    Some people seem to have the impression that they can get away with low bitrates and still have quality as long as they keep tinkering with options, switches, Avisynth filters, etc. All nonsense!

    The simple truth is encode with a high bitrate and you get quality!

    The KISS principle at work!

  13. Originally Posted by sophisticles View Post
    Jagabo's statement could be misconstrued as him saying that hardware encoders sacrifice fine details in the pursuit of compression efficiency and that somehow this doesn't apply to software based encoders.
    I didn't see how it could be taken that way.

    Originally Posted by sophisticles View Post
    As I have pointed out before, the most important factor in the quality of an encode is the quality of the source. If you have a very high quality source, shot with a high quality camera, lens, lighting, etc., framed and graded properly, and then mastered to a high quality intermediate codec, it will make very little difference which lossy encoder you use for the final delivery format.
    As I've pointed out before, I believe your premise to be wrong.
    A high quality encode is one which reproduces the source accurately. It doesn't matter if it's pristine digital 1080p or a capture from an old VCR tape; the higher the encoding quality, the more closely the encoded version will resemble the original. Obviously, the higher the picture quality of the source, the higher the picture quality of the encode is likely to be, but that's just a statement of the obvious and not necessarily directly related to encoding quality.
    All you appear to be doing there is defending lower quality encoders by attempting to invalidate certain sources.

    Originally Posted by sophisticles View Post
    I have seen delivered content created with less than 6000 kb/s for 1080p encoded with Apple's H264, with no b-frames or CABAC and the encode was stunningly clear and high quality.
    Me too. I wonder if that means we both agree newpball is talking nonsense again?

    Originally Posted by sophisticles View Post
    The only conclusion is that they are using some custom filters and settings to achieve their encodes, which means that the encoder used is way less important than the filtering chain used.
    Well, I'd agree with it being potentially as important, and it also points to newpball's previous post as being a ridiculous generalisation.

    Originally Posted by newpball View Post
    It is actually very simple: to get good quality you must encode with a good codec and with a high bitrate, no exceptions!

    Using different implementations, using hardware or software, playing with encoding switches is all marginal tinkering.
    It's like the world of quality-based encoding just passed you by, isn't it?
    So I can encode movie "A" using x264 at CRF 16 and the final file size might be around 3GB, and if I encode movie "B" at the same resolution at CRF 16 and the final file size is 10GB, movie "B" must be higher quality? Is that how it works? Nothing to do with how hard the video might be to compress? Higher bitrate = higher quality? End of story?

    And taking your argument to its illogical conclusion, that means all encoders must be created equal? If one can produce a high quality encode at a lower bitrate than another, I assume that doesn't mean one encoder is better than the other, just that a high enough bitrate wasn't used when comparing the two? Is that how it works?

    I guess, applying your logic to audio, there's no such thing as quality-based encoding there either. It's either CBR 320kbps or it's not high quality. Is that how it works?
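    For what it's worth, that's how quality-based (CRF) encoding works in practice: you fix the quality target and the file size falls out of how hard the content is to compress. A minimal sketch, assuming ffmpeg with libx264 and two hypothetical clips of similar length:
    Code:
    import os, subprocess

    CLIPS = ["movie_a.mkv", "movie_b.mkv"]   # hypothetical sources of similar length

    for clip in CLIPS:
        out = clip.replace(".mkv", "_crf16.mp4")
        # Same CRF (quality target) for both movies; the resulting file sizes will
        # differ because the encoder only spends the bits each source needs.
        subprocess.run([
            "ffmpeg", "-y", "-i", clip,
            "-c:v", "libx264", "-preset", "slow", "-crf", "16",
            "-an", out,
        ], check=True)
        print(out, f"{os.path.getsize(out) / 1e9:.2f} GB")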

    Originally Posted by newpball View Post
    Some people seem to have the impression that they can get away with low bitrates and have quality as long as they keep tinkering with options, swtiches, Avisynth filters etc. All nonsense!
    When I read silly generalisations like that I find myself wondering if you've ever encoded a video before.
  14. Originally Posted by newpball View Post
    It is actually very simple: to get good quality you must encode with a good codec and with a high bitrate, no exceptions!
    Given there are no exceptions, would you mind sharing the newpball-approved bitrates we must adhere to in order to achieve high quality encodes for each of the standard video resolutions, i.e. 1080p, 720p, PAL and NTSC? The special newpball-approved bitrates below which it's not possible to achieve high quality, no exceptions. Thanks.
  15. Banned
    Join Date
    Oct 2014
    Location
    Northern California
    For BD (using H.264) I consider the high twenties to low thirties (Mb/s) good quality. Anything below 15 Mb/s I consider work done by hackers.

    For DVD it is tough, because I have a hard time calling SD video "quality" regardless of the bitrate. But definitely above 7 Mb/s for it to look at least a bit palatable.
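    Assuming those figures are megabits per second, it's easy to see what they mean for file size: bytes are roughly bitrate times running time divided by eight. A quick sketch of that arithmetic for a two-hour movie:
    Code:
    def size_gb(mbps, minutes):
        """Approximate video stream size in GB: Mb/s * seconds / 8 bits per byte."""
        return mbps * 1e6 * minutes * 60 / 8 / 1e9

    for mbps in (30, 15, 7):
        print(f"{mbps} Mb/s over 120 minutes is roughly {size_gb(mbps, 120):.1f} GB")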

    For some people it is perfectly normal to take a mid 30s BD, downscale it to SD, cut the video aspect ratio "because they don't like black bars" and eventually put it on a DVD5.

    And there is no shame in their actions; on the contrary, they pride themselves on the "great job", because their intermediate processing was lossless and they tinkered with encoding options, so they consider themselves "quality conscious".

    However I consider them hackers.

  16. Originally Posted by newpball View Post
    For BD (using H.264) I consider the high twenties to low thirties (Mb/s) good quality. Anything below 15 Mb/s I consider work done by hackers.

    For DVD it is tough, because I have a hard time calling SD video "quality" regardless of the bitrate. But definitely above 7 Mb/s for it to look at least a bit palatable.
    Comedy gold!

    Have you informed Apple that all the video they sell on iTunes is work done by hackers?

    Are you sure you've encoded a video before?

    That's seriously funny stuff.

    Comedy gold!


