I'm struggling with a problem that I haven't been able to solve so far. I really hope that someone here has the solution.
I'm working on a small project of digitizing VHS tapes. The tapes have already been captured using a simple EasyCap converter. I captured the videos using VirtualDub as uncompressed AVI files, 720x576 in YUY2 colorspace.
Next, I use an AviSynth script to perform some denoising and very modest color correction. At the end of the script I convert to YV12 colorspace using ConvertToYV12(interlaced=true). Now, some videos I would like to keep as computer files and encode with x264, but for a few videos I also want to make a video DVD so that family members can watch them conveniently on their TV.
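For reference, a minimal sketch of what such a processing script might look like; the filename, field order, and filter choices here are illustrative placeholders, not the exact settings used:

```avisynth
# Hedged sketch of the processing script; filter calls and values
# are placeholders, not the exact denoising/correction used.
AviSource("capture.avi")        # 720x576 YUY2 capture from VirtualDub
AssumeBFF()                     # typical field order for PAL VHS captures
Tweak(sat=0.95, coring=true)    # example of a modest color correction
ConvertToYV12(interlaced=true)  # interlaced-aware YUY2 -> 4:2:0 conversion
```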
For the DVD, I encode the video using HC encoder (pretty much standard settings). To speed up the encoding process (the denoising is slow), I first run the denoising AviSynth script and save the result as a new uncompressed AVI, which is now in YV12 colorspace. Then I serve the new AVI file to HC encoder via a minimal AviSynth script. From what I understand, HC encoder expects YV12 colorspace, so I should be all OK. Encoding proceeds without any issue and when I play the m2v files on my PC, they look just like the AVI file. I can see in VLC that the m2v files are in 4:2:0 YUV, to my understanding essentially equal to YV12. I then create the VOB & IFO files with MuxMan and burn to DVD using Nero Burning ROM. I checked and the VOB file is still in 4:2:0 YUV and looks just like the AVI file when I play it in VLC.
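The minimal serving script can be as short as the sketch below (the filename is a placeholder; setting the field order explicitly is an assumption on my part, so the encoder flags it correctly):

```avisynth
# Minimal script feeding the pre-processed YV12 AVI to HC encoder.
# "denoised.avi" is a placeholder filename.
AviSource("denoised.avi")  # already YV12, so no further conversion needed
AssumeBFF()                # make the bottom-field-first order explicit
```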
Now comes the issue: when I play the burned DVD on my TV (LCD Android smart TV), the colors are too saturated, much more saturated than when I evaluated them on my PC. Green grass looks almost fluorescent and a blue T-shirt looks unnaturally bright. Thinking that this might be due to an offset between PC screen and TV screen, I decided to evaluate a few clips using the video players on the TV itself. It's an Android smart TV and I use both the Kodi and VLC apps. The strange thing is: the m2v files again show oversaturated colors. However, when I look at the original AVI file on the TV, it looks OK! Both Kodi and VLC gave the same results. When I test these files on my PC, they look indistinguishable. On the PC I use both VLC and MPC-HC.
So the AVI file is in 4:2:0 YV12, and the encoded m2v file is in 4:2:0 YUV. As far as I understand these are essentially equivalent, so that should not be the issue. When I look at the histogram of the original AVI, the luminance seems to be limited to the range of 16-235 which would indicate that the clip uses TV levels and not PC levels. And since it is in SD (720x576), that means that it uses Rec.601 levels (correct?).
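One way to verify those levels, assuming AviSynth's built-in Histogram filter: its "levels" mode marks the 16-235 TV band on the histogram, so out-of-range luma or chroma is easy to spot visually.

```avisynth
# Levels check: Histogram in "levels" mode draws markers at the
# 16-235 TV range, so out-of-range values stand out.
AviSource("capture.avi")
ConvertToYV12(interlaced=true)
Histogram(mode="levels")
```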
Since the clip is captured in YUY2, converted to YV12 by AviSynth and encoded as YV12, there should be no RGB conversion involved. I was therefore under the impression that I did not have to deal with any colorspace issue (Rec.601 vs Rec.709). Still, for testing's sake, I also tried encoding two different versions using either:
ConvertToYV12(matrix = "Rec601", interlaced = true) or ConvertToYV12(matrix = "PC.601", interlaced = true). The result is the same: on the PC the encoded m2v files look really identical to the AVI file. But viewing them on the TV: the m2v files have again too saturated colors, while the AVI file looks ok.
Summary: AVI file and HC encoded m2v file appear identical on PC, but when viewed on a TV there is a difference: the m2v file has too saturated colors. How is this possible if the colorspace is the same, and there is no color conversion involved? Should I simply accept this and separately optimize colors for DVD, or is there something that I'm missing?
I have tried different options on the Settings 2 tab of HC encoder. I have tried all options for colorimetry: default, BT.709, BT.470-2M, BT.470-2BG, SMPTE170M and SMPTE240M. The result is the same: on my PC the encoded m2v files look identical to the original avi, but when I look at the files on my TV, the encoded m2v files have too saturated colors, while the avi looks ok. I cannot see any difference between the colorimetry settings.
I also checked both the original AVI and the encoded m2v (with default colorimetry) with MediaInfo. I cannot see anything unusual.
Original uncompressed AVI:
Format : YUV
Codec ID : YV12
Codec ID/Info : ATI YVU12 4:2:0 Planar
Duration : 30s 40ms
Bit rate : 124 Mbps
Width : 720 pixels
Height : 576 pixels
Display aspect ratio : 5:4
Frame rate : 25.000 fps
Standard : PAL
Color space : YUV
Chroma subsampling : 4:2:0
Compression mode : Lossless
Bits/(Pixel*Frame) : 12.000
Stream size : 446 MiB (98%)
HC encoded m2v:
Format : MPEG Video
Format version : Version 2
Format profile : Main@Main
Format settings, BVOP : Yes
Format settings, Matrix : Custom
Format settings, GOP : Variable
Format settings, picture structure : Frame
Duration : 16mn 39s
Bit rate mode : Variable
Bit rate : 6 002 Kbps
Maximum bit rate : 8 000 Kbps
Width : 720 pixels
Height : 576 pixels
Display aspect ratio : 4:3
Frame rate : 25.000 fps
Standard : PAL
Color space : YUV
Chroma subsampling : 4:2:0
Bit depth : 8 bits
Scan type : Interlaced
Scan order : Bottom Field First
Compression mode : Lossy
Bits/(Pixel*Frame) : 0.579
Time code of first frame : 00:00:00:00
Stream size : 715 MiB (100%)
Is the TV activating different settings based on the different file types?
Eureka! I really should have been able to figure this out by myself...
I decided to dive into the settings of Kodi on my TV and found settings related to hardware acceleration for decoding (MediaCodec). After I disabled hardware acceleration, the m2v files show colors identical to the AVI file. So apparently, using hardware acceleration for decoding gives slightly different colors than software decoding, and hardware decoding is not used for the (uncompressed) AVI file. I found the same when I disabled hardware acceleration in VLC on the TV. So, mystery solved!
I'm still a bit surprised that different decoding routes can give slightly different colors, but at least now I know where the difference comes from. I guess I'll have to give the videos a slightly different color treatment if they are intended to be played on hardware (DVD player or TV).
You should be able to set the video processing amplifier on the computer, TV, or media player to avoid the color/levels differences. There shouldn't be such differences on a properly set up system. If you modify your encodings to adjust for the problem on your current setup you will be making non-standard DVDs that will have the wrong colors/levels on other systems.
Indeed, that's true. However, I don't think there's any way for a user to adjust the settings related to hardware acceleration on a smart TV. So I guess it's an issue that I will have to live with. When I search online, it seems to be a relatively common issue that people see a color difference when they switch hardware acceleration on or off...
Why do you rule out simply turning it off as a solution?
I prefer not to disable hardware acceleration, because without it video playback is not entirely smooth. Besides, I don't know objectively whether the colors are slightly off with hardware acceleration or without it. I can just tell that they're slightly different.