VideoHelp Forum

  1. Member (Join Date: Feb 2013, Location: France)
    Hello,

    I usually find my answers on Google but I'm really stuck with this problem. Whether I use Vegas (x264vfw), Xmedia Recode, MediaCoder or even MeGUI, I can't properly encode my videos in x264. There is a little color shift. Here is a screenshot comparison. You can notice that the reds tend to be more orange. Yet, my x264 settings are pretty basic:

    Code:
    --profile high --bitrate 6000 --preset slow --ref 5 --subme 7 --analyse all --no-fast-pskip
    What am I doing wrong? When I encode my videos with the H.264 encoder in Adobe Media Encoder, the colors stay exactly the same. Unfortunately, you can't really tweak the parameters in AME, and that's why I'm trying to use x264 now.

    You can also download my 2 videos here (x264 MeGUI + h264 AME).
    Last edited by Ymerej; 19th Feb 2013 at 05:50.
  2. probably tv vs. pc scale
  3. Member (Join Date: Feb 2013, Location: France)
    Originally Posted by Selur View Post
    probably tv vs. pc scale
    It isn't, unfortunately. On PC scale, the image is darker/more contrasted but the colors remain unbalanced.
  4. It's not x264's fault. It's the difference between rec.601 and rec.709 YUV/RGB conversion.

    https://forum.videohelp.com/threads/329866-incorrect-collor-display-in-video-playback?p...=1#post2045830

    My guess is the source is RGB, and AME used a rec.709 matrix to convert to YV12 whereas MeGUI used a rec.601 matrix. Force MeGUI to use rec.709 and they'll look alike: ConvertToYV12(matrix="Rec709"). Or you can use --colormatrix in x264 to flag the video as rec.601, but some players may not pay attention to that.
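    For example, if the source really is RGB (say, a Lagarith AVI), the conversion can be done explicitly in the AviSynth script before x264 ever sees the video. A rough sketch, with a placeholder filename:

    Code:
    # hypothetical Lagarith AVI in RGB (needs the Lagarith VFW codec installed)
    AviSource("source_lagarith.avi")
    # convert to 4:2:0 with the Rec.709 matrix so it matches AME's conversion,
    # instead of AviSynth's Rec.601 default
    ConvertToYV12(matrix="Rec709")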

    By the way, the screen caps show a different problem (PC vs rec scaling) than the video samples.

    <edit>
    Weird, when I look at the OP's screenshotcomparison.com page with Internet Explorer the difference is obviously rec.601 vs rec.709. But when I use Firefox (which is what I used earlier) the difference looks more like rec.709 vs PC.709.
    </edit>
    Last edited by jagabo; 19th Feb 2013 at 12:44.
  5. -> colorMatrix
  6. Banned (Join Date: Oct 2004, Location: New York, US)
    One is encoded as PAL (?). Both images are too orange (the same deficit of blue in both samples); the main differences are in the red and green levels. Both have severe clipping.

    This is the first time I've seen 30fps for PAL. I guess it can be done; I've never tried it.
    Last edited by sanlyn; 25th Mar 2014 at 11:32.
  7. It's definitely an R.601 vs R.709 thing. I'm still using an older version of the MPC-HC "BT.601 -> BT.709" pixel shader as it works for HD video. If I enable it for the Adobe Media Encoder sample the two videos look exactly the same. So if the video is already YV12 before it gets to MeGUI then I assume this would also fix it:

    ColorMatrix(mode="Rec.601->Rec.709", clamp=0)

    Or would you do it the other way around? It always confuses me. Anyway, obviously you'd need to load the colormatrix plugin via the MeGUI script. I'm pretty sure MeGUI keeps it in its own AVISynth plugins folder.
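    Something like this, for instance (the plugin path is only a guess at where a MeGUI install keeps ColorMatrix.dll - adjust it to suit, or skip the LoadPlugin line if the plugin auto-loads):

    Code:
    # the source clip is assumed to be loaded earlier in the script, as YV12 or YUY2
    LoadPlugin("C:\Program Files\MeGUI\tools\avisynth_plugin\ColorMatrix.dll")  # example path only
    ColorMatrix(mode="Rec.601->Rec.709", clamp=0)  # convert the colours, no clamping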
    Last edited by hello_hello; 19th Feb 2013 at 08:59.
  8. Member (Join Date: Feb 2013, Location: France)
    Thanks for all the input, everybody. When I get home, I will definitely try what jagabo and hello_hello have suggested. And furthermore, I just realized that I might have misconfigured something with the Lagarith format (the source of the samples).
    Last edited by Ymerej; 19th Feb 2013 at 09:44.
  9. Originally Posted by hello_hello View Post
    So if the video is already YV12 before it gets to MeGUI then I assume this would also fix it:

    ColorMatrix(mode="Rec.601->Rec.709", clamp=0)
    Yes.

    The two videos look the same after this:

    Code:
    v1=ffVideoSource("Adobe.mkv") 
    v2=ffVideoSource("Megui.mkv").ColorMatrix(mode="Rec.601->Rec.709") 
    Interleave(v1,v2)
    Aside from minor encoder differences, obviously.
  10. IIRC, MeGUI does something stupid with the script creator - it adds colormatrix automatically. It's left over from the days when people erroneously thought DVDs used Rec709, but it was never removed.

    Don't use the script creator, or edit the script to remove that line.

    Originally Posted by Ymerej View Post
    And furthermore, I just realized that I might have misconfigured something with the Lagarith format (the source of the samples).
    If it's Lagarith YUY2 or YV12, just remove the colormatrix line when using MeGUI (input = output). If you are resizing to SD, the colormatrix line might be appropriate.

    If it's Lagarith RGB, use ConvertToYV12(matrix="Rec709").
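    In script form, the YUY2/YV12 case and the resizing-to-SD case would look something like this (the filename and the SD frame size are just placeholders):

    Code:
    # Case 1: Lagarith YUY2/YV12 kept at the same resolution - no colormatrix line,
    # just the plain 4:2:0 conversion x264 needs (a no-op if it's already YV12)
    AviSource("capture_lagarith.avi")
    ConvertToYV12()

    # Case 2: if instead you were downscaling an HD Rec.709 source to SD,
    # a Rec.709 -> Rec.601 conversion would be appropriate:
    # Spline36Resize(720, 576)
    # ColorMatrix(mode="Rec.709->Rec.601", clamp=0)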
  11. I forgot about the MeGUI colormatrix thing. There's a checkbox to disable it in the script creator. It only applies when opening mpeg2 video though (or maybe video which DGIndex has indexed), so it shouldn't be causing the problem here. In fact I think there's only one instance of converting mpeg2 video where it'd do the wrong thing, which would be when re-encoding HD R.709 mpeg2 to another HD format.

    I suggested to the MeGUI developer to remove it a while back, but for some reason he hasn't.
  12. Originally Posted by hello_hello View Post
    I forgot about the MeGUI colormatrix thing. There's a checkbox to disable it in the script creator. It only applies when opening mpeg2 video though (or maybe video which DGIndex has indexed), so it shouldn't be causing the problem here. In fact I think there's only one instance of converting mpeg2 video where it'd do the wrong thing, which would be when re-encoding HD R.709 mpeg2 to another HD format.

    I suggested to the MeGUI developer to remove it a while back, but for some reason he hasn't.
    Why don't you think it's happening here? jagabo said 601=>709 makes them almost even.

    If it's enabled, it does the wrong thing for an SD DVD source and the wrong thing for an HD source. The default setting is 709=>601. Again, this dates from many years ago, when people thought studios used Rec709 for the RGB master => YV12 conversion for DVDs.

    It's the right thing for an HD to SD conversion. Wrong thing for everything else.

    It shouldn't even be an option in MeGUI. People who know what to do won't rely on it, and I can't count the number of times this explains the "screwed up colors" reported by various posters. Definitely should be removed.
  13. The OP said he uses Lagarith but doesn't say how his source was compressed with Lagarith. Did Lagarith convert from RGB to YUV? Did the software that called Lagarith for compression do the conversion? Was Lagarith set to convert from YUV to RGB on output? There are too many unknowns to say exactly where the problem lies.
  14. I'm pretty sure that, the way MeGUI adds the colorimetry stuff to a script, the colormatrix plugin takes the colorimetry as reported by DGDecode to decide whether to convert the colors or not. DGDecode hasn't always done so, but if there's no colorimetry info in the mpeg2 video stream, it assumes R.601. The colormatrix line MeGUI adds to the script is this one, and it only adds it when the source has been indexed with DGIndex, and I'm pretty sure only for mpeg2 video. For all other video formats, or when indexing with something else, it's not added.

    ColorMatrix(hints=true, threads=0)

    So.... standard definition mpeg2 video is pretty much always R.601. If there's no colorimetry info, DGDecode assumes R.601 and colormatrix therefore doesn't convert anything. If it happens to be R.709, then it'll be converted to R.601.
    Even though it's fairly pointless, it's also fairly harmless when converting standard definition video. It pretty much can't go wrong. It's when converting HD mpeg2 that things mightn't work as they should. As the same rules still apply, it'll convert to R.601 if the source is R.709, which, if you happen to be resizing from HD to SD, is exactly what you'd want it to do. If you're not resizing to SD, though, that's when it might convert from R.709 to R.601 incorrectly. Once again, if an HD mpeg2 source is R.601 or there's no colorimetry info, R.601 is assumed, so the worst it'll do is not convert anything, as though the line wasn't there.

    That's my understanding of how it works, at least......
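    For what it's worth, the relevant part of the script the script creator generates for a DGIndex-indexed mpeg2 source looks roughly like this (paths and filenames are placeholders from memory, not copied from MeGUI):

    Code:
    LoadPlugin("C:\MeGUI\tools\dgindex\DGDecode.dll")  # example path only
    # info=3 makes DGDecode embed the colorimetry hints that ColorMatrix reads
    MPEG2Source("movie.d2v", info=3)
    # no mode is given, so the default Rec.709 -> Rec.601 applies, but with hints=true
    # it only converts when the hints say the source really is Rec.709
    ColorMatrix(hints=true, threads=0)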
  15. Originally Posted by poisondeathray View Post
    It shouldn't even be an option in MeGUI. People who know what to do won't rely on it, and I can't count the number of times this explains the "screwed up colors" reported by various posters. Definitely should be removed.
    To be honest, I think most of the time that'd be the renderer/decoder's fault for displaying video incorrectly rather than it having been converted incorrectly. At least when it comes to converting DVDs etc. I recall a couple of threads quite a while ago where posters were having issues with luminance levels. The DVD would display correctly while the encoded version would look washed out because the levels weren't being expanded. I kind of remember MadVR being guilty of that a long time ago, or maybe the video card drivers were guilty there. It's no doubt very out of date, but there's a page in the AVISynth wiki dedicated to the subject, which lists different video card, renderer and resolution combinations that can result in incorrect playback. http://avisynth.org/mediawiki/Luminance_levels
    Maybe it'd also explain why so many people thought DVDs originally used R.709 when they don't.

    Even today, on my XP PC the colorimetry isn't always correct. That drove me nutty for quite a while till I worked out what was going on. When I tested it, the Windows renderers switch colorimetry if the video width is greater than 1200 AND the height is greater than 587.... well, that's the way I remember it, but it won't be too far off. So encode a 720p video at 1280x720 and Windows will display it using R.709. Re-encode the same video again while removing the black borders so you're left with 1280x544 and it'll display using R.601. VMR9 or EVR, they're both the same. MadVR seems to make more logical choices, probably similar to the way ffdshow does it when converting to RGB, so it usually gets it right.
  16. Edit: I'm editing after pdr's response just below. I wrote this post contradicting you, hello_hello, and then spent 20 minutes reading the DGIndex and ColorMatrix docs and making scripts. And I decided you were right after all - that ColorMatrix as used in MeGUI doesn't really do anything wrong. So I removed what I wrote. And I agree with pdr below that the ColorMatrix line isn't needed in the MeGUI script.
    Last edited by manono; 19th Feb 2013 at 13:28. Reason: Edit:
  17. Originally Posted by hello_hello View Post
    Originally Posted by poisondeathray View Post
    It shouldn't even be an option in MeGUI. People who know what to do won't rely on it, and I can't count the number of times this explains the "screwed up colors" reported by various posters. Definitely should be removed.
    To be honest, I think most of the time that'd be the renderer/decoder's fault for displaying video incorrectly rather than it having been converted incorrectly.
    I agree that is more common, same with TV/PC levels issues and graphics card settings. This can be seen in the screenshots.

    But so many people inadvertently leave that colormatrix line in when it shouldn't be there. It does more harm than good. Even if it used hints in the manner you suggested, not having colormatrix would screw up fewer things. There is no potential benefit or upside to including it, only possible harm.

    The problem with encoding it that way is that it "damages" the video. If it's only a display issue, at least the encode is correct; the playback configuration just isn't set up properly, which can easily be fixed instead of having to redo an encode.

    jagabo is right, however - there are so many other potential workflow issues that the OP must clarify exactly what the sources were and what steps were taken.
  18. Member (Join Date: Feb 2013, Location: France)
    The conversation is very interesting but a little bit too technical for me. As I thought, I didn't check the Lagarith settings; I went too fast. To clarify, the steps from the source to the final sample were:
    AVC YUV 4:2:0 (from DSLR) > Lagarith lossless RGB (Vegas) > x264 (MeGUI)

    So I added the ColorMatrix line to the script and the colors are now right. Thanks a lot for the tip. Now in Lagarith, I can set the output to YUY2 or YV12 instead of the default RGB. My knowledge is limited; what would you suggest I set in order to limit the image degradation?
    Last edited by Ymerej; 19th Feb 2013 at 13:48.
  19. Originally Posted by manono View Post
    Edit: I'm editing after pdr's response just below. I wrote this post contradicting you, hello_hello, and then spent 20 minutes reading the DGIndex and ColorMatrix docs and making scripts. And I decided you were right after all - that ColorMatrix as used in MeGUI doesn't really do anything wrong. So I removed what I wrote. And I agree with pdr below that the ColorMatrix line isn't needed in the MeGUI script.
    Damn...... and I missed it all!
  20. Originally Posted by Ymerej View Post
    Now in Lagarith, I can set the output to YUY2 or YV12 instead of the default RGB. My knowledge is limited; what would you suggest I set in order to limit the image degradation?
    It depends on how you have Vegas set up and what you are doing.

    Generally, if you are working in RGB (e.g. if you used color filters or grading), it's best to output RGB.

    Colorspace conversions are lossy. You don't want to do too many unnecessary conversions.

    There are a lot of "gotchas" when using Vegas, because it doesn't use standard range RGB for imported footage when in 8-bit mode; it uses "Studio RGB" (in 32-bit full range mode it does use standard range). You might have to add some other things to the script, along the lines of the sketch below.
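    For example, if the Lagarith file really does hold Vegas "Studio RGB" (16-235) levels, one way to avoid squeezing the levels a second time is to convert with one of the "PC" matrices, which map the values straight across without rescaling (only a sketch, with a placeholder filename):

    Code:
    # 8-bit Vegas render to Lagarith RGB, assumed to contain Studio RGB (16-235) levels
    AviSource("vegas_render_lagarith.avi")
    # PC.709 converts RGB to YUV without rescaling the levels, so the 16-235 range
    # is preserved as-is in the YV12 output (use PC.601 if you want Rec.601 colours)
    ConvertToYV12(matrix="PC.709")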
  21. Originally Posted by hello_hello View Post
    Damn...... and I missed it all!
    Hehe, one doesn't like to look like a dunce or have to eat his words later but I was going by something said in the ColorMatrix doc:

    source:
    • Allows setting the source format in integer form. If the mode parameter is specified, then it will override source. Possible settings:
      • 0 - Rec.709
      • 1 - FCC
      • 2 - Rec.601
      • 3 - SMPTE 240M
      The source and dest parameters cannot be equal if inputFR and outputFR are also equal.
      default - 0 (int)


      And I figured that in the absence of anything definitive it chooses Rec.709. Well, even if there's no information about the colorimetry, the hints still report Rec.601 by default, and the hints coming from more recent versions of DGDecode prevent it from screwing up. And that's why I erased what I had written.
  22. Yeah, the info on how the hints setting works clears it up:

    "If ColorMatrix is unable to detect hints in the stream (for example because you are using an incorrect dgdecode version) it will output an error. When using hints=true, the conversion specified by mode (or by source and dest if mode is not set) will only be applied when the colorimetry of the input video does not match the destination colorimetry. For example, if mode is set to "Rec.709->Rec.601" (or mode is left blank and source=0 and dest=2), then the conversion will only be applied if the input video's colorimetry is not already "Rec.601".

    So not only does the hints parameter effectively override the default mode settings, it causes any conversion you specify via the mode parameter to only take place if it's necessary.

    What I don't understand is that I'm pretty sure AutoGK has its color correction option enabled by default, and I'm pretty sure it attempts to correct the colors the same way using colormatrix, yet for some reason AutoGK never seems to be blamed in "screwed up colors" threads the way it appears MeGUI is. Maybe that's because AutoGK hides its scripts away in a temp folder a little more than MeGUI does, which means they can't mess up the colors the same way.

    Mind you, I agree completely. The color correcting thing using colormatrix is basically just a waste of time. I disabled it in both AutoGK and MeGUI myself. When re-encoding HD mpeg2 video it could convert the colors when it shouldn't, but that's about the only time it can. I suggested to Zathor over at doom9 that maybe the color correction option could be replaced with something a little more clever.... such as an option to tell MeGUI you're converting HD video to SD, which would get MeGUI to add the correct color conversion to the script and which would also work for any type of video, but I'm not sure if anything will come of it. As far as I know there's not a single GUI on the planet which makes it easy to convert the colors correctly when necessary, and I thought it'd be kind of cool if MeGUI could fire the first shot in the war against converting HD video to SD with the wrong colorimetry. What were the "scene encoders" thinking when they decided SD encodes should be R.709, even if they write R.709 colorimetry to the video stream? I can't say I've bothered testing any hardware players to see if any do anything but completely ignore the colorimetry info, but most PC players/renderers seem oblivious to it.