I have two HDR400 monitors, Philips 328P6AUBREBs. I thought I'd connected them both with DisplayPort 1.4 cables. I also have a PS3 connected which I rarely use, and somehow I'd mixed up the cables, so one of the monitors is on DP and the other on HDMI.
I tried to enable HDR in Windows and it would only enable on one monitor, which is how I found out that monitor was connected via HDMI. So when both are connected via DP, I can't enable HDR. I thought DP cables supported HDR. I guess not, unless I'm doing something wrong.
The content looks the same on both monitors with HDR on or off: faded, washed out, dull bluish grey.
I'm trying some HDR10 content MKV files for testing purposes.
The graphics card is an MSI RTX 2080 Ti Gaming X (latest driver). Both monitors are 2K HDR400 32" IPS panels. Is my hardware the problem, or is it Windows configuration, or something else?
My HDR-capable monitor has no DP connections and my PC has no HDMI 2.0 ports, so the monitor is connected to my PC's DisplayPort 1.2a connection using a Monoprice DisplayPort 1.2a to 4K at 60Hz HDMI Active HDR Adapter #33125. If DisplayPort 1.2a supports HDR then DisplayPort 1.4 does too.
However, the review of the Philips 328P6AUBREB that I read says that particular monitor only supports HDR10 when using its HDMI connection.
[Edit]After further investigation, it appears that DisplayPort 1.4 is the first DP version with the ability to provide HDR10 metadata defined in CTA-861.3, including static and dynamic metadata and the Rec. 2020 color space, for HDMI interoperability.
However, my computer's DP 1.2a connection can supply HDR10 metadata because Intel added the ability to provide HDR10 metadata to its implementation of DP 1.2a for Kaby Lake and Coffee Lake Core i processors. This change allowed the HDMI 2.0a connections that are available on some motherboards for these processor families to work correctly because the HDMI 2.0a connections are provided via an LSPCON using the DP path.
Last edited by usually_quiet; 15th Aug 2020 at 16:29. Reason: removed unintentional link to software
Thanks for the reply.
I have an MSI Creator X299 motherboard with a 10940X CPU. The Graphics card is the MSI RTX 2080 Ti Gaming X.
OK, based on the review, that would explain why HDR will only enable when I select the monitor that is connected via HDMI. The fact that DP 1.4 might support it is irrelevant to me because of limitations built into the monitor. Have I got that right?
So I have to get rid of DisplayPort and only use HDMI? I can live with that. I'm getting sick of DP moving my open windows to one screen after the monitors go to sleep, and I don't use the PS3 that much anyway.
But back to image quality: even on the HDMI-connected monitor, with HDR turned on or off in Windows, the image looks the same - dull, washed out, faded bluish-grey.
That's a lot of information, which I must admit went over my head. For example, I don't know what LSPCON is, but I shall now go and research it.
I tracked down a whitepaper on HDR from Intel. It's a bit dated and refers to the iGPU, but it explains LSPCON...
My graphics card only has one HDMI 2.0b port. Looks like I'll have to buy an adapter.
Just placed an order for two adapters. Sheesh, Amazon must not want to sell to people: I had to enter my password, then a captcha, then a one-time password. Oh well, all done. Amazingly, Amazon was the cheapest for this item; normally Amazon AU is much, much more expensive than everyone else. I should receive the adapters in a month or so.
In the meantime, if you can recall how you got it working, I have one monitor I can play around with, so I'd really appreciate any guidance.
PotPlayer, except for 3D. But VLC couldn't do 3D either, the last time I tried (which was a few years ago).
Anyway, I get that all of those "come into play". Do you have some tips on how to configure things so it works?
GPU drivers are the latest. Other than the HDR settings in the Settings app, are there other tweaks you can suggest that might help?
VLC is not my preferred player either, but recent versions do work a lot better than you seem to believe.
You can install multiple video players without issues, in particular if you go for portable versions.
Maybe this is relevant?:
If there is a good guide on PC HDR pass-thru, someone will post a link to it.
First of all, my monitor and Intel 630 graphics can only work with HDR10. Other kinds of HDR don't work at all with my system. Also, my Dell monitor isn't a real 4K 10-bit display, or even IPS. It is a 1920x1080 8-bit panel (actually 6 bits+FRC dithering) which can accept 2160p video at 59.97 fps, downscale it to 1920x1080, and perform tone mapping internally to approximate real HDR. It isn't an ideal setup, but it does let me watch commercial UHD Blu-ray discs using PowerDVD Ultra 18.
I tried to figure out what I did to enable HDR and this is all I can come up with.
My monitor has a Movie HDR profile, which I selected to put the monitor in UHD HDR mode. Windows 10's Settings->Display->Resolution is set to 3840x2160. In Windows 10's Settings->Display->Windows HD Color settings, I have "Play HDR Games and apps" turned on and I have "Stream HDR Video" turned on. I have PotPlayer 200730(1.7.21280) installed, which I set up to allow the use of hardware decoding and encoding.
I right-clicked "The World In HDR 4K Demo.mkv" (downloaded from https://4kmedia.org/the-world-in-hdr-uhd-4k-demo/) and then selected Play With->PotPlayer. PotPlayer played the video with HDR applied. However, I noticed there is an "HDR" tag near the elapsed time display at the bottom left corner of the screen which was highlighted. There is also a "H/W" tag beside it which turns the Built-in DXVA video decoder on or off when clicked. (When the tags are on, they are highlighted.)
[Edit] I just tried VLC to see if HDR works and it does. However, HDR color in PotPlayer is eye-searing. HDR color in VLC is more natural, but still quite colorful. I wonder if one or both of these players is doing its own simulated HDR tone-mapping. I will try to find out. FWIW "The World In HDR 4K Demo.mkv" looked the same with PowerDVD Ultra and VLC, so PotPlayer's settings produce the problem I'm seeing.
I just remembered an old thread that I posted in https://forum.videohelp.com/threads/393534-Can-someone-guide-how-to-set-video-player-f...x265-HDR-films. There is another setting in PotPlayer that is needed for HDR, Preferences > video > surface format > 10-bit, also tick 10-bit output. Color in PotPlayer is still eye-searing (too much red) even so. I guess I need to try adjusting the color controls.
Last edited by usually_quiet; 16th Aug 2020 at 22:46. Reason: grammar
I haven't read all of this yet, just the first thing I saw... Intel 630... HDR can't be viewed off a discrete video card?
That's going to be problematic. My motherboards are almost entirely HEDT. As in no IGP.
I'll go through the rest shortly.
Last edited by usually_quiet; 16th Aug 2020 at 22:36.
Yay, cables finally arrived. Haven't tried them yet.
To recap, these are the three devices I'm trying to get HDR working on...
Note I do not have a hardware Blu-ray player. Only one of the ten PCs here has an optical drive, which I use for ripping my discs.
#1. X299 PC (no iGPU), 32GB 3600 16-16-16 RAM, 10940X CPU @ 4.4GHz, RTX 2080 Ti with an HDMI 2.0b cable and a DP 1.4 cable. Monitors are both 32" Philips 328P6AUBREB. HDR movies look washed out, per the website linked below. With HDR turned on in Windows, the movies are all but black and white.
#2. LG 55" 3D 4K TV 55UH950T with PC: GTX 1660 Ti, HDMI 2.0b cable, 16GB 3200 C16-18-18 RAM, X299 i7-7820X CPU @ 4.2GHz, no iGPU. Exactly the same symptoms as #1.
#3. Samsung Series 7 UA65MU7000 4K TV with PC: GTX 1660 Ti with HDMI 2.0b cable, i5-8600K CPU on a Z370 motherboard, 16GB 3200 C16-18-18 RAM. Same symptoms as #1.
Note on conversion to SDR: While I was waiting on cables, I did a few tests converting to SDR. FFmpeg, as discussed on the linked website, took 3 hours of re-encode time per 1 hour of content on PC #1. DVDFab took 45 minutes per 1 hour of content on the same files on the same PC.
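For anyone wanting to script the FFmpeg route mentioned above, here is a minimal sketch of the commonly cited zscale/tonemap approach to HDR10-to-SDR conversion. The filter chain, filenames, and encoder settings are illustrative assumptions, not taken from this thread, and the command requires an FFmpeg build with the zimg (zscale) library.

```python
# Sketch: build an ffmpeg command line that tone-maps HDR10 (PQ / BT.2020)
# down to SDR BT.709. Filter chain is a widely used zscale/tonemap recipe;
# filenames and CRF value are placeholders.
import shlex

def hdr10_to_sdr_cmd(src, dst, crf=18):
    """Return an ffmpeg argv list that converts an HDR10 input to 8-bit SDR."""
    vf = ",".join([
        "zscale=t=linear:npl=100",      # undo the PQ transfer, normalize to ~100 nits
        "format=gbrpf32le",             # work in float RGB for the tonemapper
        "zscale=p=bt709",               # convert primaries BT.2020 -> BT.709
        "tonemap=hable:desat=0",        # compress highlights into SDR range
        "zscale=t=bt709:m=bt709:r=tv",  # BT.709 transfer/matrix, limited range
        "format=yuv420p",               # back to 8-bit 4:2:0 for the encoder
    ])
    return ["ffmpeg", "-i", src, "-vf", vf,
            "-c:v", "libx264", "-crf", str(crf), "-c:a", "copy", dst]

print(shlex.join(hdr10_to_sdr_cmd("in_hdr10.mkv", "out_sdr.mkv")))
```

The tone-mapping operator (hable here) is a matter of taste; mobius and reinhard are common alternatives and change how aggressively highlights are compressed.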
As I have the only optical drives, I rip all my content and store it on my server, where the rest of the family can view it. All LAN infrastructure is 10G (cables, switches, and most NICs; a couple of PCs only have 2.5G). All X299 systems either have native 10G onboard or a 10G NIC installed. So I don't think bandwidth is an issue.
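The bandwidth claim is easy to sanity-check with arithmetic. The ~128 Mbit/s figure below is an assumption for illustration (roughly the maximum UHD Blu-ray stream rate); even the slowest 2.5G links have huge headroom over it.

```python
# Quick sanity check: LAN link speed vs. a worst-case UHD Blu-ray bitrate.
# 128 Mbit/s is an assumed UHD-BD maximum, used only for illustration.
UHD_BD_MAX_MBITS = 128

for name, link_gbits in [("10G", 10.0), ("2.5G", 2.5)]:
    headroom = link_gbits * 1000 / UHD_BD_MAX_MBITS
    print(f"{name} link: {headroom:.0f}x the assumed max UHD Blu-ray bitrate")
```

So bandwidth indeed cannot explain washed-out HDR; the problem is in the display path, not the network.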
With PC #1, is my monitor the issue, given it's 8 bits+FRC? Do I need native 10-bit?
Do I need something like the 328P6VUBREB, which has 600 nits brightness? https://www.philips.com.sg/c-p/328P6VUBREB_69/brilliance-lcd-monitor-with-usb-c-dock. I'd rather not have to buy one as they are A$1000.
Is PotPlayer the cause of my woes? I've enabled 10-bit surface format and 10-bit output, but that didn't make the slightest difference. (I can't download the sample file, sorry; it's been blocked, "too many downloads".) I dumped VLC because of quality and stuttering problems about 5 years ago.
I also had a read of https://wiki.mikejung.biz/PotPlayer. I don't have any CUDA options in the Video Decoder section.
However, according to a different review that I found you may even be able to use the DisplayPort connection if you have a newer version of Windows 10. That other review says "The Brilliance 328P6AUBREB supports HDR operation on the HDMI and the DisplayPort inputs, and is compatible with input signals in the HDR-10 format. There is an option in the menu to turn HDR on or off. Note that, for versions of Windows 10 earlier than V1013, HDR is only supported for the HDMI input and then only for a resolution of 3,840 by 2,160."
I don't have a 4K HDR TV yet, so I don't know how PC settings for those differ from PC settings for UHD HDR monitors, or what settings to use on the TV itself.
mpv and MPC-BE are alternatives to VLC; they can also do SDR tonemapping.
Part of the issue with HDR pass-through is that some of the devs for open-source video players don't have HDR monitors yet.
There also seem to be some issues with Nvidia drivers for madVR users (I would suggest checking the doom9 thread; they tend to care about HDR).
HDR10 to SDR can of course be done at encoding time. Do make sure your FFmpeg build supports 10-bit x265, otherwise you will get 8-bit output, which is undesirable. Your encode-time differences are presumably due to codec/encoding-setting differences; check the output with MediaInfo.
An 8-bit+FRC monitor should be fine. At least 600 nits brightness would be recommended for real HDR. However, this is likely unrelated to your issue.
I found a post here (was it one of yours?) that talked about setting options in Video>Pixel shader.
Enabling SMPTE 2084 gives me what looks like normal color on HDR content. At least I can't tell the difference between an HDR file with SMPTE 2084 enabled and a non-HDR version.
The second option, SMPTE 2086, adds more color but the red is way too intense.
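For context on what that pixel-shader option is doing: SMPTE ST 2084 is the PQ transfer function HDR10 uses to map signal values to absolute luminance, and the shader applies (or undoes) exactly this curve. A minimal sketch using the constants published in the standard:

```python
# Sketch: the SMPTE ST 2084 (PQ) transfer function referenced by the
# pixel-shader option above. Constants come straight from the standard.
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_eotf(signal):
    """PQ-coded signal in [0,1] -> absolute luminance in cd/m^2 (0..10000 nits)."""
    p = signal ** (1 / m2)
    return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

def pq_oetf(nits):
    """Absolute luminance in cd/m^2 -> PQ-coded signal in [0,1] (inverse)."""
    y = (nits / 10000) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

print(round(pq_eotf(1.0)))  # full-scale PQ signal = 10000 nits
```

SMPTE ST 2086, by contrast, is not a transfer curve at all; it defines the static mastering-display metadata (primaries, white point, min/max luminance), which is why toggling it changes how saturated the result looks rather than its overall tone curve.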
My TV's don't have Display port connections, so are the GTX 1660 Ti's good enough using HDMI?
Did I just find a workaround? Or have I done something I shouldn't have? I gather all I'm doing here is "fudging it" rather than really solving the problem, but the upside is that even though everyone here has dual 32" 2K monitors, setting this option "restores" the color for everyone.
@butterw I'd rather not spend months converting content to SDR. If I can rip it to disk and have it look natural and not washed out, I'm happy. I don't have madVR installed. I did play around with it trying to get 3D working (original source, not SBS or TAB), but gave up. Which is a shame; I have about 80 3D movies.
Last edited by Yanta; 21st Sep 2020 at 04:14.
Last edited by usually_quiet; 21st Sep 2020 at 22:43.