VideoHelp Forum
  1. Member
    Join Date: Jan 2012
    Location: Budapest
    Hello!

    I use PotPlayer as my video player. Notice the difference in colors. Why does OpenGL look better?

    Compare the two uploaded PNG images:

    OPENGL: [Attachment: OPENGL.png]

    EVR: [Attachment: EVR.png]


    Thank you for your reply!
  2. Member Cornucopia's Avatar
    Join Date: Oct 2001
    Location: Deep in the Heart of Texas
    Assuming all other things in the render are equal (which I truthfully cannot do, not knowing the full render/playback chain), it looks to be a matter of OpenGL using the full RGB 0-255 contrast range vs. EVR using a restricted range. Neither is necessarily WRONG, depending on other factors (display device used, etc.). SHOULD it be doing this? I can't say without knowing more about your setup.

    But most people, without other cues as to differences in setup, would say that the higher contrast one is the better one.

    Notice the histograms:

    OPENGL histogram: [Attachment: OPENGL_Histogram.gif]

    EVR histogram: [Attachment: EVR_Histogram.gif]
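    A quick way to see what those histograms indicate is to check the pixel value range directly. A minimal sketch in Python (the function name classify_levels is mine, not from any player):

```python
def classify_levels(pixels):
    """Heuristic: guess whether 8-bit pixel values use the full 0-255
    range or the limited/TV range (16-235) common in video sources."""
    lo, hi = min(pixels), max(pixels)
    if lo < 16 or hi > 235:
        return "full (0-255)"
    return "possibly limited (16-235)"

print(classify_levels([0, 128, 255]))   # values reach 0 and 255 -> full range
print(classify_levels([16, 128, 235]))  # stays inside the TV range
```

    Note that a genuinely low-contrast frame also stays inside 16-235, so this is only a rough indicator; the histograms tell you more.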

    Scott
    "When will the rhetorical questions end?!" - George Carlin
  3. The OGL image has blown-out brights, and other shots probably have crushed blacks. Also, one of the images is being converted with the wrong color matrix (Rec.601 vs. Rec.709). I suspect the EVR image is the correctly displayed one. You'd have to upload a short segment of the source to say for sure. Is it MJPEG?
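    For reference, the 601/709 issue is just a question of which YCbCr-to-RGB matrix the renderer applies. A minimal sketch of limited-range decoding with both coefficient sets (the helper name is mine; real renderers do this in hardware or shaders):

```python
def ycbcr_to_rgb(y, cb, cr, rec709=True):
    """Decode one limited-range YCbCr pixel to full-range 8-bit RGB
    using either BT.709 or BT.601 luma coefficients."""
    kr, kb = (0.2126, 0.0722) if rec709 else (0.299, 0.114)
    kg = 1.0 - kr - kb
    yn = (y - 16) / 219.0    # luma:   16-235 -> 0.0..1.0
    pb = (cb - 128) / 224.0  # chroma: 16-240 -> -0.5..0.5
    pr = (cr - 128) / 224.0
    r = yn + 2.0 * (1.0 - kr) * pr
    b = yn + 2.0 * (1.0 - kb) * pb
    g = (yn - kr * r - kb * b) / kg
    clamp = lambda v: max(0, min(255, round(v * 255)))
    return clamp(r), clamp(g), clamp(b)

# Neutral grays decode identically; saturated colors do not:
print(ycbcr_to_rgb(126, 128, 128, rec709=True))   # (128, 128, 128)
print(ycbcr_to_rgb(81, 90, 240, rec709=True))     # differs from...
print(ycbcr_to_rgb(81, 90, 240, rec709=False))    # ...the 601 result
```

    Because only the luma coefficients differ, neutral tones barely change but strongly colored areas shift visibly, which is why a wrong matrix guess looks like a color cast rather than a levels problem.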
  4. The difference simply comes from different levels. Enabling the following option will remove the difference from EVR.

    [Attachment: 2014-08-24_7-57-34.jpg]


    It can also be done with a pixel shader or a built-in video processing filter.
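    The levels adjustment described here is just a per-pixel linear stretch, which is why a pixel shader can do it. A sketch of the computation, assuming 8-bit limited-range input (the function name is mine):

```python
def expand_levels(y):
    """Map a limited-range (16-235) 8-bit value to full range (0-255),
    clamping values that stray outside 16-235."""
    v = round((y - 16) * 255 / 219)
    return max(0, min(255, v))

print(expand_levels(16))   # 0   - black level moves down
print(expand_levels(235))  # 255 - white level moves up
print(expand_levels(128))  # mid-gray stays near the middle
```

    A GPU shader applies the same formula in normalized 0.0-1.0 terms, but the mapping is identical.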
  5. Originally Posted by sheppaul View Post
    The difference simply comes from a different level.
    No, there's also the 601/709 color decoding problem. One of them is wrong.
  6. I tested it. The OpenGL renderer seems to be broken.
  7. If in doubt, try MadVR (Madshi Video Renderer). I don't know if it comes with PotPlayer or whether you need to download it yourself. There are always places where things can go wrong, but 99.9% of the time MadVR will display video correctly with its default settings. Resetting the player might also make it even more likely MadVR is getting it right. Anything which displays the video differently probably isn't.
  8. Member
    Join Date: Jan 2012
    Location: Budapest
    Originally Posted by hello_hello View Post
    If in doubt, try MadVR (Madshi Video Renderer). I don't know if it comes with PotPlayer or whether you need to download it yourself. There are always places where things can go wrong, but 99.9% of the time MadVR will display video correctly with its default settings. Resetting the player might also make it even more likely MadVR is getting it right. Anything which displays the video differently probably isn't.
    However, MadVR has no 64-bit version.
  9. Why do you need 64 bits in a media player?
  10. Originally Posted by Stears555 View Post
    However the MadVR has no 64bit version.
    Well..... there's nothing quite like viewing video with incorrect levels and the wrong colorimetry, as long as the player is 64-bit.

    You could download the portable, 32-bit version of MPC-HC and run MadVR with it. At least then you can check whether the player/renderer you're running displays the same video the same way.

    I've never grown to like PotPlayer all that much. It appears to be built on MPC-HC code, and to me it's like MPC-HC with a ton of extra options I don't really need. And I can never find my way around its settings. They seem to be all over the place. Maybe I just need to spend more time with it.

    For the record, the Windows renderers sometimes get the colorimetry wrong. Or it's the Nvidia drivers, but I think it's the renderer (I'm not sure where the conversion to RGB actually happens). They use an odd formula for deciding whether to use HD or SD colorimetry. The upshot is that cropped 720p video with resolutions such as 1280x524 will display incorrectly (standard-definition colorimetry is used). If memory serves, the Windows renderers switch to HD colorimetry when the height is 578 or greater (don't quote me on that, I'd need to check). I have no idea what OpenGL does.

    Potplayer may include some cleverness to fix the problem (I don't know), such as reading any colorimetry info written to the h264 video stream. It does have a section in preferences for "colorspaces" but I've no idea how well it works or what the "auto selection" setting does.
    MadVR does use colorimetry info written to the h264 video stream and it also uses a better formula for deciding on colorimetry when it's not specified, which means it's likely to get it right (I think it uses the same formula as ffdshow does when converting to RGB).

    Your video card can potentially mess with things. It might pay to disable any image enhancing stuff to begin with. In my case I leave the player set to output the video "as-is" and I get the video card to expand the video levels to PC levels on playback. That way the levels are right for a PC monitor and they should be regardless of the player/renderer. If the player/render is set to output full-range (PC) levels the video card won't expand them a second time.
    Last edited by hello_hello; 26th Aug 2014 at 13:42.
  11. Member Cornucopia's Avatar
    Join Date: Oct 2001
    Location: Deep in the Heart of Texas
    @hello_hello, it would make sense for anything above 576 vertical to be considered HD and anything at or below to be considered SD, since 576 is the vertical resolution of the PAL spec (the highest resolution in the SD world). Horizontal resolution in broadcast might vary (greatly?), but vertical resolution has to be fixed to standards (otherwise interlacing would get screwed up), so going by the vertical resolution is the obvious choice.

    Scott
    "When will the rhetorical questions end?!" - George Carlin
  12. I think it'd be more clever to also take the width into consideration. "Anything above 576 vertical" only works for standard resolutions/aspect ratios and wouldn't take any cropping into account, but being a PC....

    That's the way ffdshow does it when converting to RGB (according to its tooltip). If the width is greater than 1024 or the height is greater than 600, it uses Rec.709. So it'd use Rec.709 for video with resolutions such as 1280x534.
    If I remember correctly, the Windows renderers (VMR9 and EVR at least) use the width and height too, but the formula is different. Rec.709 is used if the width is 1200 or more and the height is 578 or more. So 1280x534 is standard definition.

    I checked the above quite a while ago by playing a video with MPC-HC maximised so it filled the screen. I resized with ffdshow and watched for the point where the colours changed. That's the way I remember it and I'm pretty sure it's correct.
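    The two rules are easy to write out explicitly; a sketch using the thresholds as recalled in this thread (so treat them as approximate, not verified against the actual renderer code):

```python
def ffdshow_uses_709(width, height):
    """ffdshow rule (per its tooltip): Rec.709 if width > 1024 OR height > 600."""
    return width > 1024 or height > 600

def windows_renderer_uses_709(width, height):
    """VMR9/EVR rule as recalled in this thread: Rec.709 only if
    width >= 1200 AND height >= 578."""
    return width >= 1200 and height >= 578

# Both agree on standard frame sizes, but disagree on cropped 720p:
for w, h in [(1280, 720), (720, 576), (1280, 534)]:
    print((w, h), ffdshow_uses_709(w, h), windows_renderer_uses_709(w, h))
```

    The disagreement on 1280x534 is exactly the cropped-720p case: ffdshow's OR makes it HD, while the renderers' AND makes it SD.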
  13. Member Cornucopia's Avatar
    Join Date: Oct 2001
    Location: Deep in the Heart of Texas
    Cropping & re-encoding is a PC-centric, non-broadcast, non-professional way of looking at it.

    Example: a 2.39:1 AR in the active picture area of a superwidescreen 720p movie is 1280x720 with letterboxing, and 1280x535 without (cropped). So according to what you said, ffdshow would consider BOTH versions HD, while VMR9 & EVR wouldn't (former = HD, latter = SD). Sounds like ffdshow would be more correct...

    But,
    a cropped 720p movie would NEVER exist in proper broadcast/pro circles, and VMR9 & EVR know that.
    The whole cropping thing is something consumers do. It's an imperfect solution to a non-existent problem that creates its own set of problems. Much like deinterlacing 60i material even though it's still going to DVD. Cropping doesn't save bitrate, because the letterboxed black bars' effect is negligible. It doesn't prevent you from seeing black bars on playback, because they will be generated again on the fly in order to display correctly on 16:9 or 4:3 screens. All it does is make the consumer - who doesn't understand how these things work - more comfortable.

    So you could say VMR9 and EVR are following the more "professional" way of looking at things (at least in that context).

    Scott
    "When will the rhetorical questions end?!" - George Carlin
  14. Originally Posted by Cornucopia View Post
    So you could say VMR9 and EVR are following the more "professional" way of looking at things (at least in that context).
    I tried, but I couldn't get the words out.....

    ffdshow and VMR9/EVR get it correct when using "industry standard" resolutions and aspect ratios. The fact VMR9/EVR get it wrong on other occasions doesn't make them any more correct or professional, just correct less often.

    But,
    a cropped 720p movie would NEVER exist in proper broadcast/pro circles, and VMR9 & EVR know that.
    Microsoft's short-sightedness doesn't make them any less wrong either.

    I'm fairly sure you'll find iTunes video is mostly cropped.
    AVIs/MKVs aren't industry standard. Resolutions such as 640x480 or 720x400 aren't industry standard. Most of us play that type of video on a daily basis. Most of us crop. We do much of that using a PC.
    Lots of people play video using a method which doesn't overscan. Not being professionals, I guess we can predict how much of our encoded video will appear on the screen.

    It doesn't prevent you from seeing black bars on playback because they will be generated again on the fly in order to display correctly on 16:9 or 4:3 screens.
    And when the screen aspect ratio isn't exactly 4:3 or 16:9?
    I've still got some widescreen video which was originally encoded from 4:3 DVDs. I'm glad I cropped it when I re-encoded it. If ever I own a TV with a wider than 16:9 aspect ratio I'll be happy I cropped all my 16:9 video while encoding, or I'll have to invest in some nice curtains for the sides of the TV screen and look at black bars top and bottom.

    Anyway.... we could debate the pros and cons of cropping all day, but it won't make VMR9/EVR right more of the time.
    I've been meaning to test my Bluray player and TV's media player for quite a while, just to see if I can work out what they do. They could stick to a single colorimetry all the time for all I know. I mainly use my PC as my media player so I've never gotten around to it, but given they both happily play non-professional, non-industry standard video, it'd be nice if they were a little clever about it.
  15. Member
    Join Date: Nov 2013
    Location: Serbia
    VMR9/EVR come with the Windows OS, which runs on PCs, and a PC is not primarily used for watching broadcasts. So there's no justified reason to limit those renderers to working correctly only with standardized resolutions. The problem is, Microsoft doesn't want to waste time researching what people actually use and just goes with standardized solutions.

    On the other hand, madshi and other player and decoder developers listen to user feedback and make changes to do things right.

    P.S. I use the Intel IGPU on a Pentium G3220 and noticed that, even if I set levels in the control panel to "Application controlled", the levels still depend on the previously chosen manual settings. I noticed this with EVR-CP and with videos on YouTube using Firefox. Probably a bug, but it still affects the output.

    About encoding black bars: it's true that encoding the original black bars doesn't waste a lot of bits. But re-encoding a Blu-ray or DVD-Video with its black bars does, because those black bars contain compression artifacts that waste a lot of bits. So one would need to crop and add clean black bars again, or just crop. I prefer to just crop.

    As hello_hello mentioned, the standard Blu-ray aspect ratio is 16:9, but most of the time it contains video with a wider aspect ratio and thus has black bars. Removing the black bars won't change much if you watch the movie on a 16:9 TV, but if you watch that movie on a TV with an AR wider than 16:9 you will also get black bars on the left/right sides.