VideoHelp Forum




  1. Member
    Join Date
    May 2015
    Location
    Dublin, Ireland
    Hi guys,

    Can someone tell me why some uncompressed video formats employ chroma subsampling and are still referred to as uncompressed? I'm thinking of YUV 4:2:2 10-bit uncompressed codecs like v210. I've never heard a satisfactory explanation and I can never find the right Google terms to give an answer!

    Many thanks,

    Kieran.
    Last edited by kieranjol; 16th May 2015 at 06:05.
  2. Because it's not compression in the sense of removing redundancy.
  3. Member
    Join Date
    May 2015
    Location
    Dublin, Ireland
    Thanks Jagabo. Is it a case of specific definitions of subsampling vs compression? It would seem that subsampling is a form of redundancy removal as well. In the case of 4:2:2, is it that the 50% chroma loss is not easily discernible vs 4:4:4, so it can be removed?
  4. Why do you care whether it's called compression or not? It is what it is. And chroma subsampling is easily visible with the right source material.

    https://forum.videohelp.com/threads/294144-Viewing-tests-and-sample-files?p=1792760&vie...=1#post1792760

    What's gone is gone and cannot be fully restored. Every time you view a 4:2:2 or 4:2:0 video you are upsampling to 4:4:4 RGB.

    If I cropped half the frame away would you want to consider that compression too?
    Last edited by jagabo; 16th May 2015 at 08:04.
  5. Member
    Join Date
    May 2015
    Location
    Dublin, Ireland
    The reason I care is that I simply want to understand the distinction. I had a naive view that using an uncompressed codec meant that nothing was lost when ingesting, for example, a tape or film. I'm still trying to wrap my brain around what you're saying. In my limited understanding, I would have thought that for a codec to be classed as uncompressed, it would have to be 4:4:4, as reducing the chroma information is compressing the signal.
  6. Uncompressed means raw. Compression turns raw information into numbers, coefficients, symbols, words, etc. using various algorithms. Lossless compression allows us to restore the original data. Lossy compression is not precise; during compression we lose some of the original information, so during decompression we cannot restore the original data.

    A raw video signal can be in various formats: RGB, YV12 (4:2:0), YV16 (4:2:2). It can be 8-bit, 10-bit, 16-bit, etc.

    So, discarding chroma resolution and/or luma resolution has nothing to do with compression, as the data will still be in a raw format.

    The reason you don't understand very well is that you don't understand the term "compression". I cannot explain it better, as my English is a bit limited. Just search for the term "compression" to understand better.
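To put that distinction in concrete numbers, here is a minimal sketch (not from any poster) of how much space one uncompressed frame occupies at different subsamplings. The plane math follows the usual planar YUV conventions; note that v210's real byte layout is more involved (it packs three 10-bit samples into 32-bit words), so this is only the idealized sample count. The data shrinks, but only because fewer samples are stored, not because anything is encoded:

```python
# Idealized size of one uncompressed planar YUV frame, ignoring
# padding and packing. Subsampling reduces the sample count; no
# transform or entropy coding is involved.

def frame_bytes(width, height, subsampling, bits_per_sample):
    # (horizontal, vertical) chroma subsampling factors
    factors = {"4:4:4": (1, 1), "4:2:2": (2, 1), "4:2:0": (2, 2)}
    sx, sy = factors[subsampling]
    luma_samples = width * height
    chroma_samples = 2 * (width // sx) * (height // sy)  # Cb + Cr
    return (luma_samples + chroma_samples) * bits_per_sample // 8

# PAL SD frame, 8-bit per sample:
print(frame_bytes(720, 576, "4:4:4", 8))  # 1244160 bytes
print(frame_bytes(720, 576, "4:2:2", 8))  # 829440 bytes -> 2/3 of 4:4:4
print(frame_bytes(720, 576, "4:2:0", 8))  # 622080 bytes -> 1/2 of 4:4:4
```

The 4:2:2 frame is two-thirds the size of 4:4:4 simply because the Cb and Cr planes hold half as many samples each.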
  7. Banned
    Join Date
    Oct 2014
    Location
    Northern California
    Chroma subsampling is obviously a form of compression; in fact it is lossy compression. It exploits the fact that human vision is less sensitive to color resolution.

    Compression techniques in general use various psycho-visual principles, e.g. some things we just notice more than others, whether in color, intensity, movement or combinations thereof. That color is compressed separately from other compression is understandable for historical reasons. However, in the 21st century it is definitely not the optimum. Color compression should be handled by the CODEC, not by the chroma subsampling machete that cuts color resolution regardless of context.

    Of course if some people simply repeat the mantra "it's not compression, it's not compression.....( repeat 100 times) they eventually believe it themselves.

    "It's not true, it's not true, newpball is wrong, he is wrong" (repeat 1000 times).



    Last edited by newpball; 16th May 2015 at 11:58.
  8. Formerly 'vaporeon800', Brad
    Join Date
    Apr 2001
    Location
    Vancouver, Canada
    My take would be that interlacing and chroma subsampling are both forms of "compression" but not in the same sense as "data compression". And generally it's understood that "uncompressed" refers to the latter sense, so there's no need to throw in more words.

    Of course, if you make your definition too broad then downsizing the entire video could also be considered "compression".

    Ultimately I suppose the reason for the terminology is that when 4:2:2 was first introduced, there was no need to call it "compressed" to differentiate it from anything else. If you were a producer in 1986 mastering on the bleeding-edge D-1 tape format, you didn't have a 4:4:4 tape somewhere that was considered the unmolested source material. I think you were just happy that you could do nth-generation dubs without losing quality.
  9. Member
    Join Date
    May 2015
    Location
    Dublin, Ireland
    Originally Posted by Detmek View Post
    Uncompressed means raw. Compression turns raw information into numbers, coefficients, symbols, words, etc. using various algorithms. Lossless compression allows us to restore the original data. Lossy compression is not precise; during compression we lose some of the original information, so during decompression we cannot restore the original data.

    A raw video signal can be in various formats: RGB, YV12 (4:2:0), YV16 (4:2:2). It can be 8-bit, 10-bit, 16-bit, etc.

    So, discarding chroma resolution and/or luma resolution has nothing to do with compression, as the data will still be in a raw format.

    The reason you don't understand very well is that you don't understand the term "compression". I cannot explain it better, as my English is a bit limited. Just search for the term "compression" to understand better.
    Thank you Detmek, I do have difficulty understanding compression. I think I've reached the limits of my brainpower! This might be semantics, but I thought "RAW" video was something different from uncompressed. As in, RAW has no processing whatsoever, and may not even be readable without some sort of filtering.

    http://en.wikipedia.org/wiki/Raw_image_format
  10. Originally Posted by Detmek View Post
    Uncompressed means raw. Compression turns raw information into numbers, coefficients, symbols, words, etc. using various algorithms. Lossless compression allows us to restore the original data. Lossy compression is not precise; during compression we lose some of the original information, so during decompression we cannot restore the original data.

    In the professional video and photography world, "RAW" has a different meaning than "uncompressed". "RAW" usually means undebayered, unfiltered sensor data.

    It's not the same thing - these are independent terms - you can have "compressed" raw or "uncompressed" raw. For example, REDCODE RAW is an example of lossy compressed raw.



    Originally Posted by newpball View Post
    Color compression should be handled by the CODEC not by using the chroma subsampling machete where it cuts color resolution regardless of context.

    It should be handled by the person, not the codec. A codec might make the wrong choice for a certain situation.

    Yes, in that imaginary world you would ideally have control over everything. But in the real world, cost is part of the consideration.

    Even up to a couple of years ago, the lowest cost of entry for full 4:4:4 was about $20,000 for the Sony F3 with the RGB upgrade (it eventually became a free upgrade). And that isn't even full 4:4:4 when measured on chroma zone plates (the sensor could not deliver it). Nowadays it's a lot cheaper, because UHD 4:2:0 has 1920x1080 CbCr color information: you can just downscale the Y' and you have 4:4:4 1080p. That's one of the huge benefits of oversampling.
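The oversampling trick described above can be sketched as follows. This is a hypothetical illustration using a plain 2x2 box filter in place of a proper resampler: a UHD 4:2:0 frame already carries 1920x1080 Cb and Cr planes, so only the Y' plane needs to be downscaled to yield 1080p 4:4:4.

```python
import numpy as np

# Sketch: UHD 4:2:0 -> 1080p 4:4:4 by downscaling only the luma plane.
# A 2x2 box average stands in for a real resampling filter.

def uhd420_to_1080p444(y_uhd, cb, cr):
    # y_uhd: 2160x3840; cb/cr: 1080x1920 (as stored in UHD 4:2:0)
    y_1080 = y_uhd.reshape(1080, 2, 1920, 2).mean(axis=(1, 3))
    return y_1080, cb, cr  # all three planes now 1080x1920 -> 4:4:4

y = np.random.randint(0, 256, (2160, 3840)).astype(np.float64)
cb = np.random.randint(0, 256, (1080, 1920)).astype(np.float64)
cr = np.random.randint(0, 256, (1080, 1920)).astype(np.float64)
y2, cb2, cr2 = uhd420_to_1080p444(y, cb, cr)
print(y2.shape, cb2.shape, cr2.shape)  # (1080, 1920) three times
```

The chroma planes are passed through untouched; the "upgrade" to 4:4:4 comes entirely from shrinking the luma to match them.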
  11. Member
    Join Date
    May 2015
    Location
    Dublin, Ireland
    Originally Posted by newpball View Post
    Chroma subsampling is obviously a form of compression; in fact it is lossy compression. It exploits the fact that human vision is less sensitive to color resolution.

    Compression techniques in general use various psycho-visual principles, e.g. some things we just notice more than others, whether in color, intensity, movement or combinations thereof. That color is compressed separately from other compression is understandable for historical reasons. However, in the 21st century it is definitely not the optimum. Color compression should be handled by the CODEC, not by the chroma subsampling machete that cuts color resolution regardless of context.

    Of course if some people simply repeat the mantra "it's not compression, it's not compression.....( repeat 100 times) they eventually believe it themselves.

    "It's not true, it's not true, newpball is wrong, he is wrong" (repeat 1000 times).



    Love the Tom Green gif, and thanks for the reply! It's interesting that you say that the codec isn't handling the colour compression. By this do you mean that colour compression could be part of the redundancy removal that is already carried out by a codec? This is painting a clearer picture in my head of how colour compression is handled differently.
  12. Member
    Join Date
    May 2015
    Location
    Dublin, Ireland
    Originally Posted by poisondeathray View Post


    In the professional video and photography world, "RAW" has a different meaning than "uncompressed". "RAW" usually means undebayered, unfiltered sensor data.

    It's not the same thing - these are independent terms - you can have "compressed" raw or "uncompressed" raw. For example, REDCODE RAW is an example of lossy compressed raw.


    This was my understanding as well.


    Originally Posted by poisondeathray View Post
    Nowadays, it's a lot cheaper, because UHD 4:2:0 has 1920x1080 CbCr color information. You can just downscale the Y' and you have 4:4:4 1080p . That's one of the huge benefits of oversampling
    I've seen a blog where someone converted 4k 8-bit to 1080p 10-bit from a GH4. It was EOSHD I believe.
  13. Originally Posted by kieranjol View Post

    I've seen a blog where someone converted 4k 8-bit to 1080p 10-bit from a GH4. It was EOSHD I believe.
    UHD, not 4K.

    Mathematically you do get 10-bit, at least in greyscale. But in practice it's not so simple, and you actually end up with slightly less than true 10-bit values (though very close). Part of the problem is that the UHD CbCr channels are 8-bit at 1920x1080, which causes quantization into discrete steps. The conversion has to be dithered and done properly for you to see a difference.
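A rough illustration of why averaging 2x2 blocks of 8-bit samples approaches, but doesn't quite reach, 10-bit precision. This is assumed arithmetic only: real conversions also need proper dithering, as noted above, and a flat 8-bit area (four identical samples) stays quantized to 8-bit steps.

```python
import numpy as np

# The sum of four 8-bit samples (each 0..255) ranges over 0..1020,
# i.e. 1021 distinct levels -- just under the 1024 levels of true
# 10-bit. Averaging a 2x2 block therefore recovers intermediate
# values, provided noise or detail varies the four samples.

block = np.array([[100, 101],
                  [100, 102]], dtype=np.int64)
sum_levels = int(block.sum())  # on a 0..1020 scale
print(sum_levels)              # 403 -> between 8-bit codes 100 and 101

# Distinct output levels from averaging four 8-bit samples:
print(4 * 255 + 1)             # 1021 (vs 1024 for real 10-bit)
```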
  14. Member Cornucopia
    Join Date
    Oct 2001
    Location
    Deep in the Heart of Texas
    There was a thread over at Doom9 with Ben Waggoner & Dark Shikari in it that discussed what newpball is proposing. After much tangential bantering, the upshot was that, as far as they could discern (and it seemed like the opposing camps came to agreement), using the anti-redundancy tools in a codec such as x264 did indeed reduce the bitrate/filesize better for a given quality (or improve quality for a given bitrate/filesize) than chroma subsampling.
    But, and here is the important part: it was more efficient ONLY at high/very high bitrates.
    At medium/low/very low bitrates (which covers 90%+ of the users here), chroma subsampling was more efficient!

    Maybe that will put it to rest (but I doubt it).

    ******************

    As far as the OP is concerned with definitions, I think you're going to have to just go along with the professionally accepted categorizations, or you'll always be at odds with the full understanding of what's being discussed.

    There are (or can be) various stages video goes through in its journey (from scene to audience):
    Optical capture
    Analog Electrical transform
    Raw Digitizing (Sampling), and optional color subsampling
    Convert to Standardized forms (RGB, YUV), and optional color subsampling
    Intraframe Transform & Bitrate reduction (aka "compression")
    Interframe compression

    And then it reverses the process.

    Note that with the "standardized form" of RGB or YUV, you could still do a direct "paint by numbers" on a grid and see a visible, understandable image. Once the next transform stage is done, the numbers that describe the image are no longer directly human-eye representational.
    I believe this is the demarcation you might be looking for between color-subsampling & compression. In a broad sense there are similarities, but much of what is done with video & audio must dig deeper than broad generalities in order to be useful.

    Scott
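Scott's "paint by numbers" demarcation can be illustrated with a toy transform stage. This is a hand-rolled, unnormalized DCT-II on one row of samples, purely for illustration (real codecs use 2D transforms plus quantization and entropy coding): before the transform, each value is a pixel you could paint directly; after it, the numbers are no longer directly representational.

```python
import math

# Unnormalized 1D DCT-II: turns a row of samples into frequency
# coefficients. After this stage you can no longer read pixel
# values off the list -- the demarcation between "resized raw"
# and "transform compression".

def dct_ii(samples):
    n = len(samples)
    return [sum(x * math.cos(math.pi * k * (2 * i + 1) / (2 * n))
                for i, x in enumerate(samples))
            for k in range(n)]

row = [100, 100, 100, 100, 200, 200, 200, 200]  # a simple edge
coeffs = dct_ii(row)
print([round(c, 1) for c in coeffs])
# Most of the signal collapses into the DC term (coeffs[0] == 1200)
# and a few odd harmonics; several coefficients are zero and can be
# discarded cheaply -- which is where the actual compression happens.
```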
  15. Originally Posted by newpball View Post
    Chroma subsampling is obviously a form of compression; in fact it is lossy compression. It exploits the fact that human vision is less sensitive to color resolution.
    If I take a colour image and remove all the chroma information so what I'm left with is black and white, have I compressed it, or have I created something different by removing the colour information?

    If I take a 16 bit wave file and re-sample it as 8 bit, obviously I've compressed it in exactly the same way subsampling didn't. I think you're confused about the difference between the amount of data available for compression and the compression itself.

    Originally Posted by newpball View Post
    Of course if some people simply repeat the mantra "it's not compression, it's not compression.....( repeat 100 times) they eventually believe it themselves.
    I saw something similar in another thread recently where people were claiming turning the volume up and down automatically was compression and not normalisation. Like then, what constitutes compression here is probably largely semantics too.

    Originally Posted by newpball View Post
    "It's not true, it's not true, newpball is wrong, he is wrong" (repeat 1000 times)

    A moderator should make that a sticky. Would you mind if I quoted you and used it as a forum signature?
  16. Originally Posted by kieranjol View Post

    Thank you Detmek, I do have difficulty understanding compression. I think I've reached the limits of my brainpower! This might be semantics, but I thought "RAW" video was something different from uncompressed. As in, RAW has no processing whatsoever, and may not even be readable without some sort of filtering.

    http://en.wikipedia.org/wiki/Raw_image_format
    Originally Posted by poisondeathray View Post


    In the professional video and photography world, "RAW" has a different meaning than "uncompressed". "RAW" usually means undebayered, unfiltered sensor data.

    It's not the same thing - these are independent terms - you can have "compressed" raw or "uncompressed" raw. For example, REDCODE RAW is an example of lossy compressed raw.
    Yes, "raw" is not really the correct term, but I could not find a better one. So, trying to explain:

    In the audio world, PCM is the uncompressed format. It does not need to be processed in any way before applications can work with it. And every lossless (FLAC, ALAC, WavPack) or lossy (MP3, AAC, Vorbis) compressed format has to be decompressed into PCM in order to be played back or edited.

    In the video world, video needs to be in an uncompressed format before you can play it back or edit it without further modification. So, every lossless or lossy compressed video needs to be decompressed in order to be played back or edited.

    Chroma subsampling is related to YUV video formats, where luma and chroma are separated so that they can have different resolutions. In YUV 4:4:4, luma and chroma have the same resolution. In YUV 4:2:0, they have different resolutions. To convert YUV 4:4:4 to YUV 4:2:0 you need to resize the chroma. And that is not compression: you can resize by discarding pixels or by interpolating pixels, which creates something new, but it is not compression.

    hello_hello's example also works as an explanation.

    It's not true, it's not true, newpball is wrong, he is wrong. x1000
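A minimal sketch of what that post describes: 4:4:4 to 4:2:0 as nothing more than a resize of the two chroma planes. Box averaging is assumed here for simplicity; real converters use better filters and defined chroma siting. No transform or entropy coding is involved, which is why the result is still called uncompressed.

```python
import numpy as np

# 4:4:4 -> 4:2:0: halve the chroma planes in both directions
# (2x2 averaging); the luma plane is left untouched.

def to_420(y, cb, cr):
    h, w = cb.shape
    cb2 = cb.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    cr2 = cr.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    return y, cb2, cr2

# PAL-sized planes:
y = np.zeros((576, 720))
cb = np.zeros((576, 720))
cr = np.zeros((576, 720))
y2, cb2, cr2 = to_420(y, cb, cr)
print(y2.shape, cb2.shape, cr2.shape)  # (576, 720) (288, 360) (288, 360)
```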
  17. DECEASED
    Join Date
    Jun 2009
    Location
    Heaven
    Originally Posted by Detmek View Post
    .............

    Chroma subsampling is related to YUV video format where luma and chroma are separated.
    Yes, it started with / was born of the YUV video format, but it can be used in the 'RGB world' as well.
    For example, there are JPG images whose sources were not converted to YCbCr before being compressed;
    in that case, the red and blue channels may be subsampled.

    It's not true, it's not true, newpball is wrong, he is wrong. x1000
    Couldn't agree more
  18. Originally Posted by Detmek View Post

    Chroma subsampling is related to YUV video formats, where luma and chroma are separated so that they can have different resolutions. In YUV 4:4:4, luma and chroma have the same resolution. In YUV 4:2:0, they have different resolutions. To convert YUV 4:4:4 to YUV 4:2:0 you need to resize the chroma. And that is not compression: you can resize by discarding pixels or by interpolating pixels, which creates something new, but it is not compression.
    The analog video world is the original source of the reduced color (chroma) bandwidth idea - the digital world just recreates it with various flavors of sampling (frequency, position, etc.).
  19. Member
    Join Date
    May 2015
    Location
    Dublin, Ireland
    Originally Posted by Detmek View Post

    Chroma subsampling is related to YUV video formats, where luma and chroma are separated so that they can have different resolutions. In YUV 4:4:4, luma and chroma have the same resolution. In YUV 4:2:0, they have different resolutions. To convert YUV 4:4:4 to YUV 4:2:0 you need to resize the chroma. And that is not compression: you can resize by discarding pixels or by interpolating pixels, which creates something new, but it is not compression.

    hello_hello's example also works as explanation.

    It's not true, it's not true, newpball is wrong, he is wrong. x1000
    I think this has really clarified it. It does appear to be a resizing. To be very crude about it, one could say that with a PAL Uncompressed 4:2:2 video, the Y channel is 720x576, but U and V are both effectively stored in the container before decompression as 360x288? I'm probably being too literal here, but that's the gist of it?
  20. Member Skiller
    Join Date
    Oct 2013
    Location
    Germany
    Originally Posted by kieranjol View Post
    To be very crude about it, one could say that with a PAL Uncompressed 4:2:2 video, the Y channel is 720x576, but U and V are both effectively stored in the container before decompression as 360x288?
    Yes, although for 4:2:2 it's 360x576. 360x288 is the size of the chroma planes in 4:2:0 subsampling.
  21. @El Heggunte
    I didn't know that RGB could be subsampled. I knew it could be full or limited range, but not subsampled.

    @pandy

    Yep, subsampling was the main way to save bandwidth in analog, but it was useful even with digital video in the early days. Now that we deal with 4K resolutions and high-bitrate video streams, it may be time to stop using chroma subsampling. Unfortunately, the new UHD Blu-ray standard does not support 4:4:4 chroma (unless they changed something in the last few months).
  22. Member
    Join Date
    May 2015
    Location
    Dublin, Ireland
    Originally Posted by Skiller View Post
    Originally Posted by kieranjol View Post
    To be very crude about it, one could say that with a PAL Uncompressed 4:2:2 video, the Y channel is 720x576, but U and V are both effectively stored in the container before decompression as 360x288?
    Yes, although for 4:2:2 it's 360x576. 360x288 is the size of the chroma planes in 4:2:0 subsampling.
    Of course, thank you!
  23. Originally Posted by Detmek View Post
    Yep, subsampling was the main way to save bandwidth in analog, but it was useful even with digital video in the early days. Now that we deal with 4K resolutions and high-bitrate video streams, it may be time to stop using chroma subsampling. Unfortunately, the new UHD Blu-ray standard does not support 4:4:4 chroma (unless they changed something in the last few months).
    Well, yes and no. Don't forget that video has a three-dimensional structure: X, Y and time. A few video standards have exploited this; none of them were popular and they are now outdated, but time and bandwidth can be exchanged (at least within some limited constraints).

    Why should UHD Blu-ray be better than HDMI 2.0?
    HDMI 2.0 is limited to 60 fps, 8-bit and sometimes to 4:2:0, so the UHD Blu-ray spec looks to be beyond HDMI's capabilities anyway.

    To be honest, I can accept consumer 4:2:2, though I prefer 4:4:4. However, most people won't see a difference even with 4:1:0 (or perhaps 2:1:0). The truth is that most people don't see a significant difference between 320x240 and 1920x1080, so... why bother?
  24. Member
    Join Date
    Jan 2012
    Location
    Budapest
    Originally Posted by jagabo View Post
    Because it's not compression in the sense of removing redundancy.
    Due to color subsampling, 1080p and 720p do not really exist. What we see on the screen from a good-quality Blu-ray movie is only around 900x700 of visible resolution.
  25. Originally Posted by Stears555 View Post
    Originally Posted by jagabo View Post
    Because it's not compression in the sense of removing redundancy.
    Due to color subsampling, 1080p and 720p do not really exist. What we see on the screen from a good-quality Blu-ray movie is only around 900x700 of visible resolution.
    Well, this statement is plainly unfair... it depends purely on source quality, and that can vary.
  26. Member Cornucopia
    Join Date
    Oct 2001
    Location
    Deep in the Heart of Texas
    Since the great majority of humans' perception of resolution is based on luma/luminance (aka "Y" in YUV), and since the Y is NOT subsampled, you are quite incorrect in your assessment.

    Scott
  27. Originally Posted by Stears555 View Post
    Due to color subsampling, 1080p and 720p do not really exist. What we see on the screen from a good-quality Blu-ray movie is only around 900x700 of visible resolution.
    If that were completely accurate, shouldn't I be able to take a 1080p subsampled picture, convert it to RGB, resize the RGB version down to 720p, then upscale it to 1080p again, and have it always look pretty much the same?
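That thought experiment can be sketched on synthetic data. A factor-of-2 resize (540p rather than 720p) is assumed here to keep the arithmetic simple, and box filtering stands in for a real resizer, but the point survives: subsampling chroma costs zero luma detail, while downsizing the whole frame does not, so a 4:2:0 1080p picture is not equivalent to a ~720p one.

```python
import numpy as np

# Compare luma loss from (a) chroma subsampling and (b) downscaling
# the whole frame and scaling it back up.

rng = np.random.default_rng(0)
y = rng.integers(0, 256, (1080, 1920)).astype(np.float64)  # busy luma detail

def box(img, f):
    """Downscale by integer factor f with box averaging."""
    h, w = img.shape
    return img.reshape(h // f, f, w // f, f).mean(axis=(1, 3))

def up(img, f):
    """Upscale by integer factor f with nearest-neighbour repetition."""
    return img.repeat(f, axis=0).repeat(f, axis=1)

# 4:2:0 path: the luma plane is untouched -> zero luma error.
err_420 = np.abs(y - y).mean()
# Full-frame 1080p -> 540p -> 1080p path: luma is blurred.
err_resize = np.abs(y - up(box(y, 2), 2)).mean()
print(err_420, err_resize)  # 0.0 vs a clearly nonzero error
```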
  28. Originally Posted by Cornucopia View Post
    Since the great majority of humans' perception of resolution is based on luma/luminance (aka "Y" in YUV), and since the Y is NOT subsampled, you are quite incorrect in your assessment.

    Scott
    Because the eye is far more sensitive to luma than to chroma, right? You can get away with subsampling chroma.

    I've seen members at AVS say (more than once) that what they want is UHD downconverted to 1080p for display on a 1080p OLED (to get more chroma information per pixel). Not quite sure what to make of that. If I get UHD Blu-ray, I want a UHD TV as well.


