I have seen claims in PC forums that HDR is nonsense on a PC: it only works for movies, such as from a Blu-ray player, and the PC already does what HDR does for games and the desktop.
What are the thoughts of users on this site? There are more video experts here.
Although it is easier to use a Blu-ray player, it is certainly possible to play video files with HDR on a PC. I have done it. However, there are specific technical requirements that need to be met.
Windows 10 or Windows 11 is required, and the HDR-related settings in Windows need to be turned on. In addition:
- The video must be encoded as HDR10.
- The software player must support HDR10.
- Your PC's GPU must support HDR10 output.
- The display connected to your PC must support HDR10.
- The PC and the monitor/TV must be connected via HDMI 2.0 (or better) or DisplayPort 1.2 (or better).
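One way to confirm that a file actually carries HDR10 video is to inspect it with ffprobe (part of FFmpeg): HDR10 streams are flagged with BT.2020 color primaries and the SMPTE ST 2084 (PQ) transfer function. A minimal sketch in Python, assuming ffprobe is on the PATH (the helper names are my own, not from any library):

```python
import json
import subprocess

def probe_streams(path):
    """Run ffprobe (assumed installed) and return the parsed stream list."""
    out = subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json",
         "-show_streams", path],
        capture_output=True, text=True, check=True,
    ).stdout
    return json.loads(out)["streams"]

def is_hdr10(streams):
    """True if any video stream uses BT.2020 primaries and the PQ transfer."""
    for s in streams:
        if s.get("codec_type") != "video":
            continue
        if (s.get("color_primaries") == "bt2020"
                and s.get("color_transfer") == "smpte2084"):
            return True
    return False

# Example with a stream list as ffprobe would report it for an HDR10 file:
sample = [{"codec_type": "video", "codec_name": "hevc",
           "color_primaries": "bt2020", "color_transfer": "smpte2084"}]
print(is_hdr10(sample))  # True
```

For a real file you would call `is_hdr10(probe_streams("movie.mkv"))`; splitting the probe from the check keeps the logic testable without ffmpeg present.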
[Edit] Oh, I forgot: to play HEVC, VP9, or AV1 (especially at UHD resolutions), your GPU should support hardware decoding for the format you want to play.
If you don't have a monitor or TV that supports HDR10, VLC and the Windows Movies & TV app can tone-map HDR10 video so it displays reasonably on a non-HDR display.
Last edited by usually_quiet; 25th Feb 2022 at 23:27.
Does Windows 10 or Windows 11 have a true 10-bit graphics pipeline, as opposed to faking 10 bits by dithering 8 bits?
How would one activate 10-bit video output in Windows? Of course you'd need a 10-bit graphics card and monitor. And which software players handle true 10-bit output?
This looks interesting:
https://www.bhphotovideo.com/c/product/1044784-REG/blackmagic_design_bdlkstudio4k_deck...studio_4k.html
The drivers for some AMD or NVIDIA discrete video cards provide a user interface that allows them to be set up to output 10-bit color. I am guessing that an 8-bit + FRC monitor itself would be responsible for any dithering that takes place rather than the GPU.
My understanding is that when Windows 10/11 detects that the attached display accepts 10-bit color input and that the GPU can output it, Windows automatically outputs 10-bit video whenever the content being played is 10-bit.
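As for checking whether the content itself is 10-bit: ffprobe reports each stream's pixel format, and 10-bit HEVC is typically `yuv420p10le`. A small sketch (my own hypothetical helper, operating on a parsed ffprobe stream entry):

```python
def bit_depth(stream):
    """Infer the bit depth of a video stream from ffprobe fields.

    Prefers the explicit bits_per_raw_sample field when present;
    otherwise falls back to the pixel-format name (e.g. yuv420p10le).
    """
    if "bits_per_raw_sample" in stream:
        return int(stream["bits_per_raw_sample"])
    pix = stream.get("pix_fmt", "")
    return 10 if "p10" in pix else 12 if "p12" in pix else 8

print(bit_depth({"pix_fmt": "yuv420p10le"}))  # 10
print(bit_depth({"pix_fmt": "yuv420p"}))      # 8
```

Note that ffprobe reports `bits_per_raw_sample` as a string, hence the `int()` conversion.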
[Edit]
I have read that VLC, PotPlayer, and PowerDVD do support 10-bit color. Most of the well-known software players that are still being maintained probably support 10-bit color by now.
[Edit 2]
Information about Windows and deep color (10/12-bit) support using Intel Graphics (an Intel iGPU): https://www.intel.com/content/www/us/en/support/articles/000057279/graphics.html
Last edited by usually_quiet; 22nd Sep 2022 at 21:10.