VideoHelp Forum




  1. Chicken McNewblet
    Don't have much substance to this question, just something that's puzzling the crap out of me. In almost all applications I've read about (which admittedly may not be that much), YPbPr or YCbCr are always derived from an RGB source at some point. So especially when it comes to analog, why did we end up with component video instead of literally red, green, and blue signals traveling down the red, green, and blue cables? Why do we do the extra conversion step, instead of simply making our common viewing equipment standardized to RGB?
  2. Originally Posted by CursedLemon View Post
    Don't have much substance to this question, just something that's puzzling the crap out of me. In almost all applications I've read about (which admittedly may not be that much), YPbPr or YCbCr are always derived from an RGB source at some point. So especially when it comes to analog, why did we end up with component video instead of literally red, green, and blue signals traveling down the red, green, and blue cables? Why do we do the extra conversion step, instead of simply making our common viewing equipment standardized to RGB?
    All computer equipment is RGB; for some strange reason the USA promoted YPbPr as the consumer component video format.
    Some people (videophiles?) support this because, according to them, YPbPr offers higher quality than RGB.
    Nowadays it is too late for such a dispute anyway, as analog video (component included) has died, replaced by HDMI and similar digital interfaces.
    This discussion might have had a place 15 years ago, but now it is too late.
  3. I think it's because of historical reasons. RGB requires three signals (obviously), but when encoded to a composite signal the 'Y' signal is created by summing R, G and B, then 'G' is disposed of. 'Y' is the luminance information, which generally needs higher resolution (and therefore more bandwidth) because our eyes have a far sharper impression of brightness than of color. It is also the only signal needed by monochrome monitors. For recording purposes, the color information can be stored at lower resolution (= easier) and the 'G' restored at final playback as G = Y-(R+B).

    When public demand was for higher resolution, it was more practical to keep using the luminance and color difference signals, as they already existed and were well catered for in consumer devices. They gave it the name "S-Video". With almost everything being digital these days, the need for difference signals is less pressing, but they are still used in legacy applications.

    I omitted a factor called 'weighting' in the calculations for simplicity; in real life the individual color signals are not summed or restored equally.
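    For anyone curious what that weighting looks like in practice, here is a minimal Python sketch using the standard BT.601 coefficients (values normalised to 0.0-1.0; purely illustrative, not any particular device's circuitry):

```python
# BT.601 luma weights - the "weighting" referred to above.
KR, KG, KB = 0.299, 0.587, 0.114

def rgb_to_ypbpr(r, g, b):
    y  = KR * r + KG * g + KB * b    # luma: a weighted sum, not a plain R+G+B
    pb = (b - y) / (2 * (1 - KB))    # scaled B-Y color-difference signal
    pr = (r - y) / (2 * (1 - KR))    # scaled R-Y color-difference signal
    return y, pb, pr

def ypbpr_to_rgb(y, pb, pr):
    r = y + 2 * (1 - KR) * pr
    b = y + 2 * (1 - KB) * pb
    g = (y - KR * r - KB * b) / KG   # green is reconstructed, never transmitted
    return r, g, b

print(ypbpr_to_rgb(*rgb_to_ypbpr(0.2, 0.5, 0.8)))  # round-trips to ~(0.2, 0.5, 0.8)
```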

    Brian.
  4. Putting it another way:
    Cameras use RGB sensors, and displays use RGB pixels, but in between, [almost] all video processing is done in YUV.
    https://en.wikipedia.org/wiki/YUV
    YUV is a color space typically used as part of a color image pipeline. It encodes a color image or video taking human perception into account, allowing reduced bandwidth for chrominance components... The scope of the terms Y′UV, YUV, YCbCr, YPbPr, etc., is sometimes ambiguous and overlapping...
    YPbPr cabling (instead of RGB) would eliminate unnecessary YUV->RGB->YUV conversions, which degrade the video slightly.
  5. Originally Posted by raffriff42 View Post
    Putting it another way:
    Cameras use RGB sensors, and displays use RGB pixels, but in between, [almost] all video processing is done in YUV.
    https://en.wikipedia.org/wiki/YUV
    YUV is a color space typically used as part of a color image pipeline. It encodes a color image or video taking human perception into account, allowing reduced bandwidth for chrominance components... The scope of the terms Y′UV, YUV, YCbCr, YPbPr, etc., is sometimes ambiguous and overlapping...
    YPbPr cabling (instead of RGB) would eliminate unnecessary YUV->RGB->YUV conversions, which degrade the video slightly.
    How can it eliminate this conversion if the source and display are RGB? And some processing is easier in RGB (graphics composition with alpha).
    In theory it is even more complicated: Pb/Pr should have half of the Y bandwidth, so you need to design the low-pass filters accordingly, but those filters must have the same group delay as the Y low-pass filter (and this is difficult) - without that you will introduce a time difference between color and luma...
    In RGB everything is simpler, as all three signals have the same bandwidth.
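    To illustrate the group-delay point in the digital domain (analogue filtering is where the real difficulty lies): a symmetric, linear-phase FIR low-pass delays every frequency by the same fixed amount, so the luma path only needs a plain delay of the same length. A small scipy sketch with illustrative parameters only, not any actual encoder's filter:

```python
import numpy as np
from scipy.signal import firwin

numtaps = 31                              # odd, symmetric taps -> exactly linear phase
chroma_lpf = firwin(numtaps, cutoff=0.5)  # half-band low-pass (cutoff relative to Nyquist)

# Symmetric taps give a constant group delay of (numtaps - 1) / 2 samples at every
# frequency, so chroma and luma stay aligned if luma is delayed by the same amount.
assert np.allclose(chroma_lpf, chroma_lpf[::-1])
luma_delay = (numtaps - 1) // 2
print("delay the Y path by", luma_delay, "samples to match the Pb/Pr filters")
```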
  6. >Some processing is easier in RGB (graphics composition with Alpha)
    Absolutely, that is why I said '[almost]'

    >How it can eliminate this conversion if source and display are RGB?
    I mean, the signals inside the boxes being connected (such as DVD players and VCRs) are YUV.

    The issues you talk about are real, but the efficiency of YUV processing more than makes up for the disadvantages.
  7. When color TV was first broadcast, engineers needed to make the signal compatible with black-and-white TVs. The solution was to convert the RGB signals to an intensity component (which B&W TVs could display) and two color-difference signals (which B&W TVs would ignore); hence YIQ and YUV (YPbPr, YCbCr) were born. The color signals had to be multiplexed onto a subcarrier with limited bandwidth, so they were broadcast at lower resolution. This was considered acceptable because the human eye has less resolution for color. As video progressed, engineers stuck with what they knew. For digital video it was considered useful to keep the color channels at lower resolution as a form of data compression.
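    The "lower resolution for the color channels" idea is easy to sketch with numpy: store each chroma plane at quarter resolution (4:2:0 style) and rebuild it at playback. A toy illustration of the principle only, not any broadcaster's actual filter chain:

```python
import numpy as np

def subsample_420(chroma):
    """Average each 2x2 block of a chroma plane (4:2:0-style subsampling)."""
    h, w = chroma.shape
    return chroma.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def upsample_nearest(chroma_small):
    """Nearest-neighbour upsample back to full resolution for display."""
    return chroma_small.repeat(2, axis=0).repeat(2, axis=1)

cb = np.random.rand(8, 8)            # stand-in for a full-resolution Cb plane
cb_small = subsample_420(cb)          # 4x fewer samples to store or transmit
cb_rec = upsample_nearest(cb_small)   # approximate reconstruction at playback
print(cb.size, "chroma samples stored as", cb_small.size)
```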
  8. Originally Posted by raffriff42 View Post
    YPbPr cabling (instead of RGB) would eliminate unnecessary YUV->RGB->YUV conversions, which degrade the video slightly.
    I'm still running XP, so I don't know how much better/worse things are now, but there's still info on Avisynth's site explaining how to fix colorimetry problems with old ATI cards.
    Nvidia have traditionally defaulted to limited range RGB (even via a VGA out), and the setting to correct it doesn't always survive a reboot. When you connect via DVI/HDMI, the drivers may or may not see the device as a TV depending on the resolution, which may have been a reasonable system for 10 or 15 minutes, but then it became common for PC monitors to have TV resolutions, so it could be a flip of a coin. I'm not sure.
    To add to the fun, my TV has one HDMI input specifically for connecting a PC, and when the refresh rate and resolution are just right, it automatically goes into PC mode. However, unlike the VGA input, when the HDMI/PC input is used you can still choose the input level. My TV will happily accept full or limited range YCbCr, but the Nvidia card outputs limited range YCbCr only. Maybe the video card drivers are making a mess of it, but the colours of XP itself aren't quite right when the output is YCbCr.

    Nvidia drivers for Win7 and newer apparently have an additional layer of confusion. Prior to Win7 you could only adjust the levels for video, but now you can adjust them for both video and Windows independently, possibly depending on the whim of the drivers and whether they decide your display is a monitor or TV. I haven't kept up with how that works.

    And Samsung, in their infinite wisdom, decided to call the luminance level setting "HDMI Black Level" and labelled the options "Low" and "Normal", which must give the Samsung techs something to laugh about at their Xmas parties, because "Low" tells the TV to expect limited range and "Normal" tells it to expect full range. Is there a version of reality where "Normal" meaning full range input makes sense for a TV?

    Some of the above may have been a contributing factor in Benq deciding the LCD monitor I purchased recently should default to expecting limited range levels. Maybe the video card is outputting YCbCr for that PC (it's running Linux at the moment and I haven't investigated), but it seems all upside down to me. Unless limited range is required to support "deep colour" or "xvYCC" or whatever the colorspace that attempts to make LCD picture quality as good as Plasma is called.

    In my opinion, dumping RGB as an output type for the PC would be a step in the right direction to clean up all those problems as then everything could be limited range. For 8 bit though, it might degrade the picture quality a little, so maybe it'll stay the way it is until 8 bit displays go the way of the Dodo.

    Obviously the limited to full range conversions aren't lossless, but how lossy is a conversion from RGB to YCbCr 4:4:4, if the PC output had to be YCbCr 4:4:4?
    Last edited by hello_hello; 16th Oct 2016 at 11:26.
  9. Originally Posted by hello_hello View Post
    Obviously the limited to full range conversions aren't lossless, but how lossy is a conversion from RGB to YPbPr 4:4:4, if the PC output had to be YPbPr 4:4:4?
    For 8 bit limited range YUV (16-235), the roughly 16.7 million RGB (0-255) values map to only about 3 million unique YUV values. That is, on average, about six different RGB values map to a single YUV value.
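    If you want to check that figure yourself, a brute-force numpy sketch along these lines will count the distinct triples (it assumes the commonly published 8-bit BT.601 studio-swing equations and simple rounding; other matrices or rounding rules will give slightly different counts):

```python
import numpy as np

# Full-range 8-bit RGB -> limited-range BT.601 YCbCr, then count distinct triples.
g, b = np.meshgrid(np.arange(256), np.arange(256), indexing="ij")
seen = np.zeros(256 ** 3, dtype=bool)

for r in range(256):
    y  = np.round( 16 + ( 65.738 * r + 129.057 * g +  25.064 * b) / 256).astype(np.uint32)
    cb = np.round(128 + (-37.945 * r -  74.494 * g + 112.439 * b) / 256).astype(np.uint32)
    cr = np.round(128 + (112.439 * r -  94.154 * g -  18.285 * b) / 256).astype(np.uint32)
    seen[(y << 16) | (cb << 8) | cr] = True   # pack Y,Cb,Cr into one 24-bit index

print("unique YCbCr triples:", int(seen.sum()), "from", 256 ** 3, "RGB values")
```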
  10. @ hello_hello, the question was about analog cables. Your post seems kinda off topic.
  11. Originally Posted by raffriff42 View Post
    @ hello_hello, the question was about analog cables. Your post seems kinda off topic.
    I'd agree the question had a distinct "especially for analogue" flavour, however.....

    Originally Posted by CursedLemon View Post
    YPbPr or YCbCr are always derived from an RGB source at some point
    Both types were included. https://en.wikipedia.org/wiki/YPbPr

    YPbPr = Analogue
    YCbCr = Digital

    Mind you I seem to have consistently used the wrong one in my previous post, so I fixed it. Unfortunately pretending it never happened isn't an option as jagabo has already quoted some of it.
    Last edited by hello_hello; 16th Oct 2016 at 11:30.
  12. Originally Posted by hello_hello View Post
    YPbPr = Component
    YCbCr = Digital
    Nope, YPbPr is the analog representation of digital YCbCr, and both are component (the same as RGB, which is also component).
    All the problems you mention with limited or full quantization range are purely software related.
    HDMI can use full or limited quantization range (DVI only full). Even in limited quantization mode, values lower than 16 and higher than 235 (240) are fully acceptable, as a natural consumer signal may have overshoot and undershoot.
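    As a side note on what limited versus full quantization range means for the numbers, here is a quick numpy check (8-bit luma only, plain scaling and rounding as an illustration - not how any particular driver or chip does it):

```python
import numpy as np

full = np.arange(256)                                   # full-range codes 0-255
limited = np.round(16 + full * 219 / 255).astype(int)   # squeezed into 16-235
restored = np.clip(np.round((limited - 16) * 255 / 219), 0, 255).astype(int)

print("distinct limited-range codes:", len(np.unique(limited)))   # 220 instead of 256
print("values changed by a full->limited->full round trip:",
      int(np.count_nonzero(restored != full)))
```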
  13. Originally Posted by pandy View Post
    Originally Posted by hello_hello View Post
    YPbPr = Component
    YCbCr = Digital
    Originally Posted by pandy View Post
    Nope YPbPr is analog representation for digital YCbCr and both are component (same as RGB which is also component).
    I originally labelled them Component and HDMI, but changed the latter to Digital. They're both politically correct now, I hope.

    Originally Posted by pandy View Post
    All mentioned by you problems with limited or full quantization range are purely related to software.

    HDMI can use full or limited quantization range (DVI only full), even in limited quantization mode values lower than 16 and higher than 235(240) are fully acceptable as natural consumer signal may have over and undershoot.
    That's a lucky coincidence, as every computer I've ever used has run software.
    Which part of my post gave you the impression I didn't think HDMI could use either full or limited range?

    I'd have to voice my disagreement over the "DVI only full" claim. Technically it's no doubt true, but it's not in practice. Nvidia don't seem to distinguish between VGA/DVI and HDMI outputs. Take my old 8600GT as an example. Two DVI/VGA outputs (either output can be VGA or DVI through the use of an adaptor), but it's a HDMI free zone. Not a single HDMI output to be found. It's connected to my TV via a DVI to HDMI cable and behaves HDMI-like just as I described. I guess it's possible, if the card can determine the type of connector on the display end of the cable, when it's a DVI connector the card could change its behaviour and become DVI compliant, but I find that a little unlikely. VGA is also technically full range, but an Nvidia card outputs limited range over VGA by default. You have to tell it not to. That behaviour may have changed in recent times. I don't know.

    I remember why the full range setting for the output connected to my TV doesn't survive a reboot. It's because the Nvidia drivers don't care about the type of output. They determine the type of display according to the resolution and refresh rate and so they reset the TV output to limited range each time. That's a true story. This PC is connected to a CRT monitor via the primary output, and to the TV via VGA and the secondary output. It remembers the full range setting for the CRT but not the TV.
    Over on the other desk is the second PC connected to the TV via DVI/HDMI and the primary output, and to a CRT via VGA and the secondary output. It also remembers the full range setting for the CRT but resets the TV output each reboot.

    I kind of remember there's a registry setting that supposedly fixes the problem, but I never got around to finding it. I'm so in the habit of changing the setting each boot I rarely forget. It only takes a couple of seconds and on average I only reboot once a week anyway.

    I should add.... XP can only connect to the TV at 50/60Hz. There's no 59.940Hz option. I believe newer Windows/drivers understand both. I strongly suspect if I was running XP and newer drivers I could connect the PC to the TV at 60Hz and the full range setting would stick. Connecting at 59.940Hz... well that's a TV rate so it'd probably reset. The TV goes into PC mode when you connect to the special HDMI input at 60Hz/1080p. It won't at 50Hz. It probably won't at 59.940Hz either. It'd insist on remaining a TV. That's not a theory I can test just now, but what's a bit of wild speculation between friends.
    Last edited by hello_hello; 16th Oct 2016 at 12:59.
  14. Originally Posted by hello_hello View Post
    I originally labelled them Component and HDMI, but changed the latter to Digital. They're both politically correct now, I hope.
    YPbPr = YCbCr (both are component; one is just the analog signal after conversion from YCbCr, which is the digital representation of the same signal).

    Originally Posted by hello_hello View Post
    That's a lucky coincidence, as every computer I've ever used has run software.
    Which part of my post gave you the impression I didn't think HDMI could use either full or limited range?
    Well, if I misunderstood you then please accept my apologies - I was only referring to the HDMI specification.
    Originally Posted by hello_hello View Post
    I'd have to voice my disagreement over the "DVI only full" claim. Technically it's no doubt true, but it's not in practice. Nvidia don't seem to distinguish between VGA/DVI and HDMI outputs. Take my old 8600GT as an example. Two DVI/VGA outputs (either output can be VGA or DVI through the use of an adaptor), but it's a HDMI free zone. Not a single HDMI output to be found. It's connected to my TV via a DVI to HDMI cable and behaves HDMI-like just as I described. I guess it's possible, if the card can determine the type of connector on the display end of the cable, when it's a DVI connector the card could change it's behaviour and become DVI complaint, but I find that a little unlikely. VGA is also technically full range, but an Nvidia card outputs limited range over VGA by default. You have to tell it not to. That behaviour may have changed in recent times. I don't know.
    I can only tell you what the specification says - DVI is unable to distinguish between full and limited quantization; DVI always works on the assumption of full quantization range - but DVI displays (formally called a Sink) are frequently capable of working in HDMI mode, in which case all the functionality unique to HDMI applies to them.
    I can tell you that I have an NVidia card connected to a TV and I can select between full and limited quantization mode if the RGB output color space is selected; when YCbCr is selected it is not possible to select full quantization output.


    Originally Posted by hello_hello View Post
    I remember why the full range setting for the output connected to my TV doesn't survive a reboot. It's because the Nvidia drivers don't care about the type of output. They determine the type of display according to the resolution and refresh rate and so they reset the TV output to limited range each time. That's a true story. This PC is connected to a CRT monitor via the primary output, and to the TV via VGA and the secondary output. It remembers the full range setting for the CRT but not the TV.
    Over on the other desk is the second PC connected to the TV via DVI/HDMI and the primary output, and to a CRT via VGA and the secondary output. It also remembers the full range setting for the CRT but resets the TV output each reboot.
    I don't have this kind of issue - unless it is changed in the settings, NVidia keeps the chosen quantization range. Dell T5500 with NVidia Quadro 2000 (Win 7 Pro Dell OEM 64 bit).

    Originally Posted by hello_hello View Post
    I kind of remember there's a registry setting that supposedly fixes the problem, but I never got around to finding it. I'm so in the habit of changing the setting each boot I rarely forget. It only takes a couple of seconds and on average I only reboot once a week anyway.

    I should add.... XP can only connect to the TV at 50/60Hz. There's no 59.940Hz option. I believe newer Windows/drivers understand both. I strongly suspect if I was running XP and newer drivers I could connect the PC to the TV at 60Hz and the full range setting would stick. Connecting at 59.940Hz... well that's a TV so it'd probably reset. The TV goes into PC mode when you connect to the special HDMI input at 60Hz/1080p. It won't at 50Hz. It probably won't at 59.940hz either. It'd insist on remaining a TV. That's not a theory I can test just now, but what's a bit of wild speculation between friends.
    IMHO the OS is not responsible for this - I have done lots of weird tests on XP and the OS perfectly accepts new custom modes with non-standard refresh rates (like 280Hz interlaced) - I would rather consider this a graphics card driver limitation or some other problem (normally HDMI does not distinguish between 59.94 and 60Hz - they share the same short video ID - the difference is in the pixel clock, but that means the HDMI source is responsible for proper timing).
  15. Chicken McNewblet
    Originally Posted by betwixt View Post
    I think it's because of historical reasons. RGB requires three signals (obviously), but when encoded to a composite signal the 'Y' signal is created by summing R, G and B, then 'G' is disposed of. 'Y' is the luminance information, which generally needs higher resolution (and therefore more bandwidth) because our eyes have a far sharper impression of brightness than of color. It is also the only signal needed by monochrome monitors. For recording purposes, the color information can be stored at lower resolution (= easier) and the 'G' restored at final playback as G = Y-(R+B).

    When public demand was for higher resolution, it was more practical to keep using the luminance and color difference signals, as they already existed and were well catered for in consumer devices. They gave it the name "S-Video". With almost everything being digital these days, the need for difference signals is less pressing, but they are still used in legacy applications.

    I omitted a factor called 'weighting' in the calculations for simplicity; in real life the individual color signals are not summed or restored equally.

    Brian.
    Originally Posted by pandy View Post
    All computer equipment is RGB; for some strange reason the USA promoted YPbPr as the consumer component video format.
    Some people (videophiles?) support this because, according to them, YPbPr offers higher quality than RGB.
    Nowadays it is too late for such a dispute anyway, as analog video (component included) has died, replaced by HDMI and similar digital interfaces.
    This discussion might have had a place 15 years ago, but now it is too late.
    Interesting, I suspected it was a matter of convenience more than anything.

    So I'm guessing that the computer industry decided to go with straight RGB since, at the time, they didn't have any crossover between television technologies and computer technologies, and therefore had no reason to try to adapt to some kind of human perception coding?
  16. Member
    PC video output did not need to modulate colors onto a single analogue signal; color monitors with RGB "phosphors" could receive discrete signals per component at least since VGA (DE-15 connector; with limited variety in EGA too, DE-9 connector).

    Displaying "TrueColor" video even required the development of "SuperVGA" graphics cards. Before that, there was hardly any reason to encode videos for PCs in more than 256 palette colors; even displaying MPEG-1-like video would have been a challenge for the video hardware available for IBM-compatible PCs.

    The era of video on a PC started with the development of the VESA standard to create a common technology, making dozens of chipset-specific graphics mode enhancements obsolete, especially by providing a common "memory-mapped" access method for video card memory in protected mode (64 KB banking would have been far too slow for fluid video output at any remarkable resolution).

    Until then, think rather of animated GIFs or even simpler technologies (most famously Autodesk FLIC), usually with cartoon-like content, because a limited color palette is more efficient to store at small sizes (remember, a floppy disk had less than 1.5 MB capacity). Of course, color palettes were defined as RGB triplets. There was no need for YUV, as there was not even a TrueColor (RGB24) graphics mode yet.
  17. Originally Posted by pandy View Post
    I can tell that i have NVidia card connected to TV and i can select between full and limited quantization mode if RGB output color space is selected, when YCbCr is selected there is no possible to select full quantization output.
    It works the same way for me too. The TV lets me change its YCbCr levels but the PC won't.

    Originally Posted by pandy View Post
    Have no this kind of issue - unless changed in settings, NVidia keep chosen quantization range. Dell T5500 with NVidia Quadro 2000 (Win 7 Pro Dell OEM 64 bit).
    I can't check the other PC at the moment as I've put Linux on it which makes it hard to determine what's happening in respect to Windows, but I'll probably do some cable swapping later today or tomorrow so I can connect this one via HDMI.
    It's very likely you don't have the same issue because you're not running XP and the world has changed a bit since then. It's possibly just Nvidia drivers. I'm not a gamer, so after a couple of bad driver upgrade experiences I stopped upgrading them. The drivers I'm using are version 6.14.12.9573 dated February 2012. The last time I tried more recent drivers, which was a couple of years ago, I'm pretty sure Nvidia messed up the limited to full range conversion. The blacks were extremely crushed with or without it enabled, so I went back to the older drivers, which have never given me a problem aside from the full range setting not surviving a reboot, and I decided to stick with that until I build a new PC.

    Once this PC is connected via HDMI I'll play around, but yes, a custom resolution might make a difference.
    I did try creating a custom resolution refreshing at 59Hz over VGA, and much to my surprise it appears to have worked. ReClock claims the TV is currently refreshing at 59.093Hz. And that brought some more memories back.... whether they be real or imagined..... but the resolution/refresh rate combinations are divided into sections in the Nvidia control panel. PC, TV, and Custom (if you create one). And I recall they have something to do with the way things behave... depending on whether you select a resolution from the PC or TV list.
    I can't remember if I've ever seen the TV list while running XP. There's definitely not one at the moment with both outputs being VGA. Although if the card thinks the TV is a PC, logically it shouldn't keep switching the levels back to limited range. Both PCs do it, and for one of them the TV is the secondary monitor and for the other it's the primary. Do you have both PC and TV resolutions in the Nvidia Control Panel and is the card using any of the TV resolutions?

    [Attachment 38934 - nvidia.gif]
  18. Once again, I have no issues and the settings are stored and used after a reboot, so no additional action is required on my side. I have a list of video modes from 23.97 to 60Hz, all of them supported by the TV; I can select between YCbCr and RGB, and for RGB I can select between full and limited quantization mode (normally I use full quantization mode). Bit depth is only 8 bit.

    [Attachment 38942 - Click to enlarge]
  19. "3. Apply the following settings".
    The "display" section doesn't have output level options for XP, so there's no problem with the setting sticking because you can't change it. You can change the refresh rate and bit depth, but that's it. I've been referring to the levels setting in the "video" section. Maybe it's possible for the video card to see a TV as a TV and choose limited range for displaying Windows, but I don't think I've experienced it. The "display" section will let me switch to YCbCr if the output is DVI/HDMI, but when it's VGA the RGB/YCbCr setting disappears from the control panel.

    The video section is where I can adjust levels, but it only affects video. The levels used by Windows itself are full range, and for the TV the VGA input is fixed at full range..... yet for NVidia the video levels have always defaulted to limited range even for VGA, and that's never made sense to me.

    [Attachment 38949 - nvidia.gif]

    Can you set levels independently for video or has it been moved to the display section? My recollection is originally the "display" and "video" levels could be adjusted independently, but it was a bit messy. I remember forum posts where people were complaining about the video levels option not working as expected. You may be lucky and the drivers see your TV as a monitor and don't argue about running full range, but I still suspect the 1080p resolution convinces the drivers I'm using to switch the video back to limited range each boot even though Windows itself is full range. Or it's just a very long standing bug.

    There's an article here explaining how the drivers determine the default levels based on resolution and refresh rate and how to work around bad choices. I'm fairly sure it's a few years old and maybe Nvidia have found ways to make it more reliable. I don't know as I haven't played with drivers for newer Windows much.
    https://pcmonitors.info/articles/correcting-hdmi-colour-on-nvidia-and-amd-gpus/

    I tried switching to YCbCr444 at one stage but went back to RGB. I think video looked fine and I don't recall there being a difference in video quality, but the colours of Windows itself were a bit funky. The article I linked to mentions colour differences between RGB and YCbCr444 for Nvidia cards.

    When my TV sees YCbCr444 it switches to expecting limited range and there's nothing I can do about it, unless I'm connected to the TV's PC/HDMI input and I coax it into switching to PC mode. When it does, the picture options are restricted to the same as for the VGA input. Most of the picture enhancing stuff is disabled, overscanning is disabled, you can only connect at 60Hz (I'm pretty sure), and in PC mode it's the only input on the TV that'll let you select full range for YCbCr444. I've never understood why.

    Tomorrow I'll try to experiment a little via HDMI to see if slight refresh rate or resolution adjustments can cause the video card to output different levels. I'll try to do it after installing Win7 on a spare drive so I'm not testing with old drivers and an older OS.
    Last edited by hello_hello; 17th Oct 2016 at 15:27.
  20. This post may be a little too late, but RGB has always been 4:4:4 (sometimes with an alpha too), which is very fat in terms of bitrate and hugely inefficient for viewing purposes. There just isn't a lot to be gained from broadcasting 4:4:4. Like it or not, we live in a bandwidth limited world, so trade-offs must be made, even for the cables. If you don't believe me, try editing an HD 16-bit TIFF sequence and you will quickly understand what bandwidth limitations are all about. And back in the day, NTSC/PAL video didn't have the bandwidth to carry a full 4:4:4 signal, so 4:1:1 was born - all the luma and only a quarter of each chroma channel. Believe it or not, this actually maps pretty well to how humans see. Our rods are a lot more sensitive than our cones. So engineers merely took advantage of this to create a highly compressed format that looked acceptable. It is the same reason interlaced video was invented: bandwidth limitations. When HD came along they switched to 4:2:0, but it is still bandwidth limited. Maybe there is a future where we won't be limited by bandwidth, and we will watch all our favorite films in 4:4:4 uncompressed 16K raw glory.
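    To put rough numbers on that bandwidth argument, here is a back-of-the-envelope Python calculation for uncompressed 1080p60 at 8 bits per sample (raw payload rates only; real links add blanking, audio and protocol overhead):

```python
# Uncompressed data rates for 1080p60 at 8 bits per sample.
w, h, fps = 1920, 1080, 60

rate_444 = w * h * fps * 3     # Y + Cb + Cr all at full resolution (bytes/s)
rate_420 = w * h * fps * 1.5   # Y at full resolution, Cb/Cr at quarter resolution

print("4:4:4 raw: %.0f MB/s" % (rate_444 / 1e6))   # ~373 MB/s
print("4:2:0 raw: %.0f MB/s" % (rate_420 / 1e6))   # ~187 MB/s, half the bandwidth
```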