VideoHelp Forum
  1. I have seen claims in PC forums that HDR is nonsense on a PC, that it only works for movies played from a Blu-ray player, and that the PC already does what HDR does for games and the desktop.
    What are the thoughts of users on this site? There are more video experts here.
  2. usually_quiet (Member)
    Originally Posted by ChasVideo
    I have seen claims in PC forums that HDR is nonsense on a PC, that it only works for movies played from a Blu-ray player, and that the PC already does what HDR does for games and the desktop.
    What are the thoughts of users on this site? There are more video experts here.
    Although it is easier to use a Blu-ray player, it is certainly possible to play HDR video files on a PC; I have done it. However, a specific set of technical requirements must be met:

    - Windows 10 or Windows 11 is required, with the Windows settings related to HDR turned on.
    - The video must be encoded with HDR10 (a quick ffprobe check follows this list).
    - The software player must support HDR10.
    - Your PC's GPU must support HDR10.
    - The display connected to your PC must support HDR10.
    - The PC and the monitor/TV must be connected via HDMI 2.0 (or better) or DisplayPort 1.2 (or better).
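    One way to check the HDR10 requirement on the video itself is with ffprobe. This is a minimal sketch, assuming ffprobe (part of FFmpeg) is on your PATH; the file name is a placeholder. HDR10 video is normally tagged with the SMPTE 2084 (PQ) transfer function and BT.2020 primaries.

    ```python
    import json
    import subprocess

    def looks_like_hdr10(path: str) -> bool:
        """Return True if the first video stream carries typical HDR10 color tags."""
        out = subprocess.run(
            ["ffprobe", "-v", "error", "-select_streams", "v:0",
             "-show_entries", "stream=color_transfer,color_primaries",
             "-of", "json", path],
            capture_output=True, text=True, check=True,
        ).stdout
        stream = json.loads(out)["streams"][0]
        # HDR10 = PQ transfer (smpte2084) + BT.2020 primaries
        return (stream.get("color_transfer") == "smpte2084"
                and stream.get("color_primaries") == "bt2020")

    print(looks_like_hdr10("movie.mkv"))  # "movie.mkv" is a placeholder name
    ```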

    [Edit] I forgot: to play HEVC, VP9, or AV1 (especially at UHD resolutions), your GPU should also support hardware decoding for the format you want to play. A quick way to list the available acceleration APIs is shown below.
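    As a rough starting point, ffmpeg can list the hardware acceleration APIs its build was compiled with (d3d11va, dxva2, cuda, qsv, and so on). A sketch, assuming ffmpeg is on your PATH; note this reports what the ffmpeg build can use, not whether your particular GPU generation actually decodes a given codec.

    ```python
    import subprocess

    out = subprocess.run(["ffmpeg", "-hide_banner", "-hwaccels"],
                         capture_output=True, text=True, check=True).stdout
    # The first line of the output is the header "Hardware acceleration methods:"
    accels = [line.strip() for line in out.splitlines()[1:] if line.strip()]
    print("hwaccel APIs:", ", ".join(accels))
    ```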

    If you don't have a monitor or TV that supports HDR10, VLC and the Windows Movies & TV app can tone-map HDR10 content so that it still looks reasonable on a non-HDR display.
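    For a file-based equivalent of what those players do on the fly, ffmpeg can tone-map HDR10 down to SDR BT.709. A sketch, assuming an ffmpeg build that includes the zscale (zimg) and tonemap filters; the file names are placeholders.

    ```python
    import subprocess

    # Linearize the PQ signal, convert primaries to BT.709, tone-map,
    # then return to a normal 8-bit SDR pixel format.
    VF = ("zscale=t=linear:npl=100,format=gbrpf32le,"
          "zscale=p=bt709,tonemap=hable,"
          "zscale=t=bt709:m=bt709:r=tv,format=yuv420p")

    subprocess.run(["ffmpeg", "-i", "hdr10_in.mkv", "-vf", VF,
                    "-c:v", "libx264", "-crf", "18", "-c:a", "copy",
                    "sdr_out.mkv"], check=True)
    ```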
    Last edited by usually_quiet; 25th Feb 2022 at 23:27.
  3. Does Windows 10 or Windows 11 have a true 10-bit graphics infrastructure, as opposed to faking 10 bits by stretching 8 bits?

    How would one activate 10-bit video in Windows? Of course you'd need a 10-bit graphics card and monitor. And which software players handle true 10 bits?

    This looks interesting:

    https://www.bhphotovideo.com/c/product/1044784-REG/blackmagic_design_bdlkstudio4k_deck...studio_4k.html
  4. usually_quiet (Member)
    The drivers for some AMD and NVIDIA discrete video cards provide a user interface that allows the card to be set to output 10-bit color. My guess is that on an 8-bit + FRC monitor, the monitor itself, rather than the GPU, is responsible for any dithering that takes place.

    My understanding is that when Windows 10/11 detects that the attached display accepts 10-bit input and that the GPU can output 10-bit color, it automatically outputs video at 10-bit color depth whenever the video being played is 10-bit. You can confirm that a given file really is 10-bit with ffprobe, as below.
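    A sketch, assuming ffprobe is on your PATH and using a placeholder file name: the pixel format of the first video stream tells you the stored bit depth; formats ending in "10le", such as yuv420p10le, are 10-bit.

    ```python
    import subprocess

    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=pix_fmt",
         "-of", "default=noprint_wrappers=1:nokey=1", "movie.mkv"],
        capture_output=True, text=True, check=True).stdout.strip()
    print("pixel format:", out)  # e.g. "yuv420p10le" means 10-bit 4:2:0
    ```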

    [Edit]
    I have read that VLC, PotPlayer, and PowerDVD support 10-bit color. Most of the well-known software players that are still being updated probably support 10-bit color by now.
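    mpv is another well-known player, and its Direct3D 11 backend can be asked for a 10-bit swap chain explicitly, which makes it handy for testing the full 10-bit path on Windows. A sketch, assuming mpv is installed and using a placeholder file name; the --d3d11-output-format option comes from mpv's manual.

    ```python
    import subprocess

    # Request mpv's D3D11 video output with a 10-bit (rgb10) swap chain.
    subprocess.run(["mpv", "--vo=gpu", "--gpu-context=d3d11",
                    "--d3d11-output-format=rgb10", "movie.mkv"])
    ```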

    [Edit 2]
    Information about Windows and 10/12-bit deep color support with Intel graphics (an Intel iGPU): https://www.intel.com/content/www/us/en/support/articles/000057279/graphics.html
    Last edited by usually_quiet; 22nd Sep 2022 at 21:10.


