VideoHelp Forum

  1. Member
    Hi people,

    I love reading your forum. It is very useful.

    Is there a fast way to check whether a video is truly 10-bit (not just encoded as 10-bit from an 8-bit source)?

    Regards,
    Sotee
  2. Member
    AFAIK the only way is to check the source (i.e. off-camera) file, because once it has been transcoded the metadata comes from the writing application.
  3. Not without detailed inspection; a quantization histogram would probably be the best approach. The rule is simple: converting 8-bit to 10-bit multiplies each sample value by 4, so the 8-bit black level of 16 becomes 64 in 10-bit, the next 8-bit level of 17 becomes 68, and so on. So you are looking for a simple dependency: if all your sample values are divisible by 4 without remainder, your source was 8-bit encoded as 10-bit; if you see sample values that are not divisible by 4, there is a chance your source was 10-bit or higher. Bear in mind, though, that 8-bit can be processed in a 10-bit pipeline, so 10-bit values may also appear as a by-product of signal processing. Personally I would look at areas with smooth gradients - banding is one of the most obvious signs that 8-bit quantization was used.
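    A minimal sketch of that divisibility check (Python with NumPy; assumes ffmpeg is available, and the file name, frame size, and pixel format are placeholders):
    Code:
    # Extract one frame as raw 10-bit-in-16-bit little-endian planar YUV first:
    #   ffmpeg -i input.mkv -frames:v 1 -pix_fmt yuv420p10le -f rawvideo frame.raw
    import numpy as np

    WIDTH, HEIGHT = 1920, 1080        # set to the actual frame size

    # yuv420p10le stores each sample (0..1023) in a 16-bit little-endian word
    samples = np.fromfile("frame.raw", dtype="<u2")
    luma = samples[:WIDTH * HEIGHT]   # inspect the luma plane only

    # Values scaled up from 8 bits are exact multiples of 4; anything else
    # hints at genuine >8-bit precision (or at least 10-bit processing)
    print("samples not divisible by 4:", np.count_nonzero(luma % 4), "of", luma.size)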
  4. Member
    Thank you for the answers; it seems like visual inspection and comparison with an 8-bit source is the only way.

    Are there any other tells besides banding, for example in shading, lighting, or faces?

    P.S. Generally, from my tests so far, 10-bit is always a little grainier when not denoised. So scenes with a lot of grain in the background but a sharp foreground seem like a good choice for comparison.
  5. Originally Posted by Sotee
    Thank you for the answers; it seems like visual inspection and comparison with an 8-bit source is the only way.

    Are there any other tells besides banding, for example in shading, lighting, or faces?

    P.S. Generally, from my tests so far, 10-bit is always a little grainier when not denoised. So scenes with a lot of grain in the background but a sharp foreground seem like a good choice for comparison.

    Noise can hide the real nature of a source... 8-bit can easily be converted to 10-bit by adding noise and averaging the outcome...
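    A small sketch of that trick (Python with NumPy; the ramp and the noise parameters are made up for illustration):
    Code:
    import numpy as np

    rng = np.random.default_rng(0)

    # a hypothetical 8-bit luma ramp
    src8 = np.linspace(16, 235, 1920).astype(np.uint8)

    # scale to the 10-bit range, add small noise to several copies, then
    # average: the results land between the 4-step levels, so they are no
    # longer all exact multiples of 4
    frames = src8.astype(np.float64) * 4 + rng.uniform(-2, 2, (8, src8.size))
    out10 = np.clip(np.round(frames.mean(axis=0)), 0, 1023).astype(np.uint16)

    print("multiples of 4:", np.count_nonzero(out10 % 4 == 0), "of", out10.size)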
  6. Member
    Originally Posted by pandy
    8-bit can easily be converted to 10-bit by adding noise and averaging the outcome...
    Yeah, but you can't add more information/quality, which is present in the noise of true 10-bit.
  7. Originally Posted by Sotee
    Is there a fast way to check whether a video is truly 10-bit (not just encoded as 10-bit from an 8-bit source)?
    I'll make it very easy for you: if you have 10-bit video in a delivery format, such as x264 10-bit, then the source was 8-bit.

    There are cameras that record to AVC Intra 4:2:2 10-bit, but no one distributes footage that comes straight from the camera. The footage is always edited in some way (cuts, transitions, grading, something), and that is always output as a master for archiving, along with the source footage. The master is never in a delivery format; it will be in AVC Intra, HEVC Intra, ProRes, or JPEG2000, which, depending on the camera, may also be the acquisition format.

    The master is then used to create the delivery format product meant for public consumption, and this needs to be in a form that can be played back on as wide an array of hardware as possible, including tablets, smartphones, standalone players, HTPCs, and video game consoles, whether on optical media or streaming. The only way to ensure that is to aim for the lowest common denominator, which means 4:2:0 8-bit MPEG-2 or AVC, though VP9 and HEVC are slowly creeping into the market.

    The only exception to this may be something like a commercial UHD Blu-ray, which may be encoded from the master as 4:2:0 10-bit.
  8. Member
    Originally Posted by sophisticles
    x264 10-bit, then the source was 8-bit.
    What about x265 10-bit? How can we know the source is not 8-bit?
  9. Originally Posted by Sotee
    Originally Posted by sophisticles
    x264 10-bit, then the source was 8-bit.
    What about x265 10-bit? How can we know the source is not 8-bit?
    The same thing applies to anything you see as x265 10/12-bit.

    Even simpler: if you are downloading videos from a torrent site encoded as x264 10-bit or x265 10/12-bit, and the source was not a UHD BD, then you can be 100% sure the source was 4:2:0 8-bit.

    Having said this, 10/12-bit encoding from an 8-bit source does have its uses, such as reducing or eliminating banding and producing smoother color gradients in the resulting re-encode.
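    For what it's worth, a minimal sketch of such an encode (Python driving ffmpeg with libx265; the file names and the CRF value are placeholder assumptions):
    Code:
    import subprocess

    # Re-encode an 8-bit source at 10 bits; the extra precision helps the
    # encoder preserve smooth gradients instead of adding banding of its own
    subprocess.run([
        "ffmpeg", "-i", "input_8bit.mkv",
        "-c:v", "libx265",
        "-pix_fmt", "yuv420p10le",  # 10-bit 4:2:0 output
        "-crf", "18",
        "output_10bit.mkv",
    ], check=True)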
  10. Member
    Originally Posted by sophisticles
    if you are downloading videos from a torrent site encoded as x264 10-bit or x265 10/12-bit, and the source was not a UHD BD, then you can be 100% sure the source was 4:2:0 8-bit.
    That's the problem exactly - I don't have the source, so I can't know whether it is a UHD BD. But this brings other thoughts:
    - One may check whether a UHD BD of the video exists at all
    - One may look for 2160p releases, since these came from a UHD source

    Very fruitful discussion, thank you all.
  11. Originally Posted by Sotee
    Yeah, but you can't add more information/quality, which is present in the noise of true 10-bit.
    But the perceived picture will look sharper after adding noise... that is how the brain and eyes work... adding noise linearizes the quantizer and increases perceived sharpness - all at the cost of increased entropy and therefore reduced compressibility.
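    A quick way to see that compressibility cost (a toy demonstration with Python's zlib rather than a real video codec; the ramp and the noise level are made up):
    Code:
    import zlib

    import numpy as np

    rng = np.random.default_rng(0)

    # a smooth ramp compresses extremely well; the same ramp with a little
    # noise carries more entropy and compresses noticeably worse
    gradient = np.linspace(0, 255, 1 << 16).astype(np.uint8)
    noisy = np.clip(gradient + rng.normal(0, 2, gradient.size), 0, 255).astype(np.uint8)

    print("clean:", len(zlib.compress(gradient.tobytes())), "bytes")
    print("noisy:", len(zlib.compress(noisy.tobytes())), "bytes")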