I'm deeply disturbed by the results of a few quality benchmarks I've been running to figure out the best option for archiving my video projects. Why do the colors seem to shift during H.264 encoding? I thought I'd share the results with this forum to better understand what's going on and how we can better control our workflow.
So I've taken the Belle Nuit HD test chart, created an uncompressed AVI in After Effects, and encoded it to MP4 with these programs:
- Adobe Media Encoder
- Sony Vegas
I tried my best to match the settings across the different encoders; bitrate was set to 10,000 kbps. I then opened all of the MP4 files back in After Effects, without any color space adjustment, and exported the timeline to a PSD.
Here's a crop with all the results
And here's a diff against the original, plus a levels filter set to white = 100 so we can better compare the differences.
You can download the source images, the videos, and the PSD with all the images.
As you can see, Adobe Media Encoder seems to be the most faithful to the original material, but it adds some dark lines around contrasting edges. The rest of the encoders show profound color differences from the source image. And CUDA was very disappointing indeed: just terrible quality overall.
I was really interested in using Handbrake's hardware acceleration with Intel QSV, and I think the detail definition plus the speed (it goes up to 200 fps) is pretty decent. However, there is still this color shift across the spectrum, most noticeably in the skin tones. Has anyone found a way of improving the color fidelity with it?
Curious to know what you guys think
This is a complex topic.
Your analysis has a number of issues in both methodology and interpretation. There are many variables you haven't controlled for that may lead you to draw the wrong conclusions. Basically, all your "problems" stem from YCbCr <=> RGB conversions and chroma subsampling. You should read up on "chroma subsampling", and on YCbCr (also called "YUV") vs. RGB color model / color space issues.
Some of your conversions and screenshots are incorrectly displayed (colors shifted because of a non-color-managed Rec. 709 vs. Rec. 601 matrix), and some are improperly done (Vegas's studio RGB vs. computer RGB, resulting in different levels).
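To illustrate the studio RGB vs. computer RGB point, here's a minimal Python sketch (illustrative 8-bit values, not any specific program's pipeline): studio range puts black at code 16 and white at 235, so if one stage in the chain skips the range expansion, blacks look lifted and whites look dim.

```python
# Studio-range (16-235) vs. full-range (0-255) luma levels.
# If the expansion below is skipped somewhere in the chain, black
# stays at 16 and white at 235 -- a visible level shift in a diff.

def studio_to_full(y):
    """Expand a studio-range 8-bit luma code (16-235) to full range (0-255)."""
    return round((y - 16) * 255 / 219)

print(studio_to_full(16))    # studio black -> full-range black
print(studio_to_full(235))   # studio white -> full-range white
print(studio_to_full(126))   # mid grey
```

That mismatch alone is enough to make two otherwise identical encodes diff differently against the source.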
You're using 4:2:0 subsampled YCbCr for H.264. That is the main cause of the lines in your difference tests. 4:2:0 means the color information is halved in each dimension, so for a 1920x1080 source image, it only contains 960x540 of color information. It is possible to encode RGB (same as the source, unsubsampled, full color) with x264. When encoding RGB (in x264 it's actually stored as YCbCr with a GBR matrix) there is no subsampling, so no line shifting; the tradeoff is more bandwidth and less compatibility (the RGB variant isn't accepted by much hardware and software, only the lesser 4:2:0 variant is).
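A toy sketch of what 4:2:0 does to a hard color edge (simple 2x2 box averaging, one of several possible downsampling filters): each 2x2 block of chroma samples collapses to one value, and when a block straddles the edge, it averages to a color that exists on neither side, which is exactly the kind of fringe that shows up around contrasting edges in a diff.

```python
# Toy 4:2:0 chroma downsampling: average each 2x2 block of a chroma
# plane (represented as a list of rows of 8-bit values).

def subsample_420(chroma):
    out = []
    for r in range(0, len(chroma), 2):
        row = []
        for c in range(0, len(chroma[0]), 2):
            block = (chroma[r][c] + chroma[r][c + 1] +
                     chroma[r + 1][c] + chroma[r + 1][c + 1])
            row.append(block // 4)
        out.append(row)
    return out

# A 4x4 Cr plane with a hard vertical edge aligned to the 2x2 grid:
cr = [[240, 240, 16, 16]] * 4
print(subsample_420(cr))          # edge survives: [[240, 16], [240, 16]]

# Shift the edge by one pixel so blocks straddle it:
cr_shifted = [[240, 16, 16, 16]] * 4
print(subsample_420(cr_shifted))  # left blocks average to 128: a new, wrong color
```

Real encoders use better filters than box averaging, but the block-straddling problem is inherent to any subsampling.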
When you take a screenshot (image), the YCbCr data is converted to RGB. The method and algorithm used for that conversion determine what you see, and thus your interpretation. That 960x540 of color information is essentially enlarged back to 1920x1080 (so-called chroma upscaling) on the way back to RGB. Of course no scaling would be best, but the reason behind chroma subsampling in the first place is bandwidth reduction.

As you know, there are different scaling algorithms (nearest neighbor, lanczos, bicubic, bilinear, spline36, dozens more), and many different conventions for chroma interpolation and siting (MPEG-1 vs. MPEG-2, DV, etc.). All of these decisions may shift the lines up or down, left or right. They may appear to blur the lines (chroma smoothing), or there may be chroma aliasing (jaggies) from being too "sharp".

It's even more complicated than that: there are two sets of decisions. First, for the chroma downsampling from the source RGB to YCbCr 4:2:0 (which algorithm and matrix are used), and second, for the chroma upscaling and matrix in your display device or in the software used for checking (e.g. After Effects). AME will use the same algorithms as AE for chroma up- and downsampling, so it's no wonder that it might appear "better". But you will see in real-world testing, with real-world content (not a static test chart) and other criteria like noise, compression artifacts, subjective image quality, compression ratio, and speed, that it falls somewhere in the upper middle of the pack.
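The matrix half of those decisions is easy to demonstrate numerically. A minimal sketch, using the full-range conversion equations for simplicity (real video is usually studio range, which adds the level scaling shown earlier): the same stored YCbCr pixel decoded with the Rec. 601 coefficients vs. the Rec. 709 coefficients lands on visibly different RGB values, and the sample here is deliberately a warm, skin-tone-like color where the shift is easy to see.

```python
# Decode one YCbCr sample with either the Rec.601 or Rec.709 matrix.
# Kr/Kb differ between the standards (0.299/0.114 vs. 0.2126/0.0722),
# so decoding with the "wrong" matrix shifts every color.

def ycbcr_to_rgb(y, cb, cr, matrix):
    kr, kb = {"601": (0.299, 0.114), "709": (0.2126, 0.0722)}[matrix]
    kg = 1.0 - kr - kb
    r = y + 2 * (1 - kr) * (cr - 128)
    b = y + 2 * (1 - kb) * (cb - 128)
    g = (y - (2 * kb * (1 - kb) / kg) * (cb - 128)
           - (2 * kr * (1 - kr) / kg) * (cr - 128))
    return tuple(round(v) for v in (r, g, b))

y, cb, cr = 150, 110, 160          # a warm, skin-tone-ish sample

print(ycbcr_to_rgb(y, cb, cr, "601"))  # intended decode
print(ycbcr_to_rgb(y, cb, cr, "709"))  # wrong-matrix decode: shifted RGB
```

A few code values of difference per channel doesn't sound like much, but across a whole image it reads as a global tint, strongest on saturated reds and skin tones.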
As an aside, your bitrates vary wildly, by roughly 1-10x. Just because you enter bitrate "x" doesn't mean it's actually achieved by encoder "y". That's not a huge limiting factor for the issues mentioned above on static content in this particular test, but for any other codec/video testing it will seriously invalidate the results.
In short, you have to convert to YUV with the matrix you want (usually Rec. 601 for SD, Rec. 709 for HD), then make sure the player converts back to RGB with the correct matrix. The best way to do the latter is to flag the matrix when you encode; for example, the x264 command line encoder takes a --colormatrix option (alongside --colorprim and --transfer) for exactly this.
As poisondeathray pointed out, there will still be errors due to chroma subsampling, and small RGB/YUV/RGB rounding errors.
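Those rounding errors are easy to see in a sketch. A minimal Python round trip, assuming full-range Rec. 709 with matching matrices on both ends (so none of the mismatch problems above apply): simply quantizing YCbCr to 8-bit integers means some RGB triplets don't come back exactly.

```python
# RGB -> YCbCr -> RGB round trip with 8-bit quantization in between.
# Matrices match on both ends, yet some channels still come back off by one.

KR, KB = 0.2126, 0.0722          # Rec.709 luma coefficients
KG = 1.0 - KR - KB

def clamp8(v):
    """Round and clip to the 8-bit range 0-255."""
    return min(255, max(0, round(v)))

def rgb_to_ycbcr(r, g, b):
    y = KR * r + KG * g + KB * b
    cb = 128 + (b - y) / (2 * (1 - KB))
    cr = 128 + (r - y) / (2 * (1 - KR))
    return tuple(clamp8(v) for v in (y, cb, cr))   # 8-bit quantization happens here

def ycbcr_to_rgb(y, cb, cr):
    r = y + 2 * (1 - KR) * (cr - 128)
    b = y + 2 * (1 - KB) * (cb - 128)
    g = (y - KR * r - KB * b) / KG
    return tuple(clamp8(v) for v in (r, g, b))

# Round-trip a few colors and track the worst per-channel error.
worst = 0
for rgb in [(200, 150, 100), (12, 34, 56), (1, 2, 3)]:
    back = ycbcr_to_rgb(*rgb_to_ycbcr(*rgb))
    worst = max(worst, *(abs(a - b) for a, b in zip(rgb, back)))
print("worst per-channel round-trip error:", worst)  # small but nonzero
```

These off-by-one errors are invisible on their own, but they show up in a diff against the source even for a "perfect" lossless pipeline, so a nonzero diff isn't automatically an encoder fault.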