Hi, maybe I'm totally wrong, but this is what I've come to think from what I read and see when I test for myself, so please tell me if I'm doing something wrong. With GPUs I have tested Intel Quick Sync and NVIDIA CUDA (I don't have an AMD GPU right now), and what I'd most like to discuss is quality. Especially in dark areas I don't think the results are that good with help from the GPU, and I'm talking about H.264. I hope you can read this; my writing isn't that good, even in my own language.
All the free tools use the encoders provided by the hardware manufacturers (AMD, Nvidia, Intel). The manufacturers want to impress you with speed, and don't want to spend a lot of development time ($) on the encoders. So you get encoders that do the bare minimum to produce adequate quality (for the average viewer) but encode quickly.
I've analyzed some Quick Sync encodings and found that Quick Sync uses far fewer motion vectors than x264. I think the hardware encoders try to make up for that by blurring away noise and small, low-contrast details.
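As an aside, claims like "hardware encoders blur away detail" can be checked objectively by comparing an encode against its source with a distortion metric such as PSNR. Below is a toy, pure-Python sketch that operates on a flat list of 8-bit luma samples rather than real decoded frames; a real comparison would use something like FFmpeg's psnr/ssim filters on the actual video.

```python
import math

def psnr(reference, encoded, max_value=255):
    """Peak signal-to-noise ratio between two equal-length sequences
    of 8-bit samples. Higher means closer to the source."""
    if len(reference) != len(encoded):
        raise ValueError("frames must have the same number of samples")
    mse = sum((r - e) ** 2 for r, e in zip(reference, encoded)) / len(reference)
    if mse == 0:
        return float("inf")  # identical: lossless
    return 10 * math.log10(max_value ** 2 / mse)

# Toy example: a "frame" of 4 luma samples and a slightly distorted copy.
source = [16, 32, 64, 128]
decoded = [17, 31, 66, 126]
print(round(psnr(source, decoded), 2))  # 44.15
```

A hardware encode that blurs fine detail will typically score lower against the source than a software encode at the same bitrate, which is one way to quantify the difference instead of eyeballing it.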
Development time and support.
In general, a tool which only works with a specific GPU type or chipset will have a smaller number of users. A tool which uses the CPU will work on almost any computer and thus have a much larger number of users. It's more cost-efficient to support an app with a large user base than one with a small user base, so the CPU-based tool will tend to have better support and fixes based on feedback.
Jagabo's statement could be misconstrued as him saying that hardware encoders sacrifice fine details in the pursuit of compression efficiency and that somehow this doesn't apply to software based encoders. The truth is that all encoders, even x264 and x265, sacrifice small details in the pursuit of compression efficiency, it's the nature of the beast.
Some encoders choose the path of eliminating fine details, some blur them, but in the end the small details are still gone.
But yes, with a sufficiently high bit rate, it is well established that the quality difference between encoders disappears as the bit rate goes up.
As I have pointed out before, the most important factor in the quality of an encode is the quality of the source. If you have a very high quality source, shot with a high quality camera, lens, lighting, etc., framed and graded properly, then mastered to a high quality intermediate codec, it will make very little difference which lossy encoder you use for the final delivery format.
I have seen delivered 1080p content created at less than 6000 kb/s with Apple's H.264 encoder, with no B-frames or CABAC, and the encode was stunningly clear and high quality.
I'll take it one step further: more important than the encoder is the filtering chain used. There is a very well known "scene group" which takes commercial Blu-rays and encodes them at 1080p at less than 1500 kb/s, using 2-pass x264 with the slower preset, and as much as I think that bit rate is too low, I have to admit they get very impressive results. In fact, I tried to replicate what they did: I downloaded their release of a movie that I have on Blu-ray and transcoded my copy to the same resolution and bit rate as theirs, using both x264 with the placebo preset and x265 with the veryfast preset, and I couldn't get the quality they had achieved at such a low bit rate. The only conclusion is that they are using some custom filters and settings, which means that the encoder used is far less important than the filtering chain.
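For what it's worth, a 2-pass encode like the one described targets an average bitrate, which is usually worked backwards from a desired file size. A minimal sketch of that arithmetic in Python (the audio bitrate and file sizes here are illustrative assumptions, not recommendations):

```python
def target_bitrate_kbps(file_size_mb, duration_s, audio_kbps=128):
    """Average video bitrate (kb/s) needed to hit a target file size,
    after reserving room for the audio track. Uses decimal MB (1 MB = 8000 kb)."""
    total_kbits = file_size_mb * 8 * 1000          # MB -> kilobits
    video_kbits = total_kbits - audio_kbps * duration_s
    return video_kbits / duration_s

# e.g. fit a 2-hour movie into ~1.4 GB, like the low-bitrate releases above
print(round(target_bitrate_kbps(1400, 2 * 3600)))  # 1428
```

That is roughly where numbers like "1500 kb/s for a 2-hour 1080p movie" come from; the first pass then just tells the encoder how to distribute those bits.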
It is actually very simple: to get good quality you must encode with a good codec and with a high bitrate, no exceptions!
Using different implementations, using hardware or software, playing with encoding switches is all marginal tinkering.
Some people seem to have the impression that they can get away with low bitrates and still have quality as long as they keep tinkering with options, switches, Avisynth filters etc. All nonsense!
The simple truth is encode with a high bitrate and you get quality!
The KISS principle at work!
I didn't see it could be taken that way.
As I've pointed out before, I believe your premise to be wrong.
A high quality encode is one which reproduces the source accurately. It doesn't matter if it's pristine digital 1080p, or a capture from an old VCR tape, the higher the encoding quality the more closely the encoded version will resemble the original. Obviously if the picture quality of the source is higher, the higher the picture quality of the encode is likely to be, but that's just a statement of the obvious and not necessarily directly related to encoding quality.
All you appear to be doing there is defending lower quality encoders by attempting to invalidate certain sources.
Me too. I wonder if that means we both agree newpball is talking nonsense again?
Well I'd agree with it being potentially as important, and that it also points to newpball's previous post as being a ridiculous generalisation.
It's like the world of quality based encoding just passed you by, isn't it?
So I can encode movie "A" using x264 and CRF16 and the final file size might be around 3GB, and if I encode movie "B" at the same resolution with CRF16 and the final file size is 10GB, movie "B" must be higher quality? Is that how it works? Nothing to do with how hard the video might be to compress. Higher bitrate = higher quality? End of story?
And taking your argument to its illogical conclusion, does that mean all encoders must be created equal? If one can produce a high quality encode at a lower bitrate than another, I assume that doesn't mean one encoder is better than the other, just that a high enough bitrate wasn't used when comparing the two? Is that how it works?
I guess, applying your logic to audio, there's no such thing as quality based encoding there either. It's either CBR 320kbps or it's not high quality. Is that how it works?
When I read silly generalisations like that I find myself wondering if you've ever encoded a video before.
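To make the CRF point above concrete: constant-quality rate control spends bits where the content needs them, so two movies at the same CRF can end up with very different file sizes. Here is a deliberately simplified toy model in Python (not real rate control, just the proportionality idea):

```python
# Toy model: constant-quality allocates bits in proportion to scene
# complexity, so total size varies per movie; constant-bitrate fixes
# the size regardless of how hard the content is to compress.
def constant_quality_size(scene_complexities, bits_per_unit):
    """Total bits when each scene gets bits proportional to its complexity."""
    return sum(c * bits_per_unit for c in scene_complexities)

def constant_bitrate_size(num_scenes, bits_per_scene):
    """Total bits when every scene gets the same fixed budget."""
    return num_scenes * bits_per_scene

easy_movie = [1, 1, 2, 1]   # mostly static, easy-to-compress scenes
hard_movie = [4, 5, 3, 6]   # grain and fast motion

# Same "quality" setting, very different file sizes:
print(constant_quality_size(easy_movie, 100))  # 500
print(constant_quality_size(hard_movie, 100))  # 1800
```

In other words, a bigger CRF16 file doesn't mean higher quality, it means harder-to-compress content, which is exactly the point being made above.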
Given there are no exceptions, would you mind sharing the newpball-approved bitrates we must adhere to in order to achieve high quality encodes for each of the standard video resolutions, i.e. 1080p, 720p, PAL and NTSC? The special newpball-approved bitrates below which it's not possible to achieve high quality, no exceptions. Thanks.
For BD (using H.264) I consider the high twenties to low thirties (Mb/s) good quality. Anything below 15 Mb/s I consider work done by hackers.
For DVD it is tough, because I have a hard time calling SD video "quality" regardless of the bitrate. But definitely above 7 Mb/s for it to look at least a bit palatable.
For some people it is perfectly normal to take a mid-30s-Mb/s BD, downscale it to SD, crop the video aspect ratio "because they don't like black bars" and eventually put it on a DVD5.
And there is no shame in their actions, on the contrary, they pride themselves on the "great job" because their intermediate processing was lossless and they tinkered with encoding options so they consider themselves "quality conscious".
However I consider them hackers.
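For reference, converting bitrates like those into file sizes is simple arithmetic; a quick Python sketch (a 2-hour runtime is assumed purely for illustration):

```python
def size_gb(bitrate_mbps, duration_s):
    """Approximate file size in GB (decimal) for a given average bitrate."""
    return bitrate_mbps * duration_s / 8 / 1000

movie_s = 2 * 3600
print(round(size_gb(28, movie_s), 1))  # 25.2 GB: plausible on a BD-50
print(round(size_gb(7, movie_s), 1))   # 6.3 GB: already over a DVD5's ~4.7 GB
```

Which illustrates the tension in the post above: even the claimed minimum "palatable" SD bitrate of 7 Mb/s won't fit a 2-hour movie on a DVD5 without cuts or compromises.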