VideoHelp Forum
  1. Member Yanta (Join Date: Sep 2011, Location: Australia)
    I have two HDR400 monitors - Philips 328P6AUBREB. I thought I'd connected them both with DisplayPort 1.4 cables, but I have a PS3 connected which I rarely use, and somehow I'd mixed up the cables, so one of the monitors is on DP and the other on HDMI.

    I tried to enable HDR in Windows and it would only enable on one monitor, which is how I found out that monitor was connected via HDMI. So when both are connected via DP, I can't enable HDR. I thought DP cables supported HDR. I guess not, unless I'm doing something wrong.

    The content looks the same on both monitors with HDR on or off: faded, washed out, dull bluish grey.

    I'm trying some HDR10 MKV files for testing purposes.

    The graphics card is an MSI RTX 2080 Ti Gaming X (latest driver). Both monitors are 2K HDR400 32" IPS panels. Is my hardware the problem, or is it Windows configuration, or something else?

    Thanks
  2. Member usually_quiet (Join Date: Aug 2006, Location: United States)
    My HDR-capable monitor has no DP connections and my PC has no HDMI 2.0 ports, so the monitor is connected to my PC's DisplayPort 1.2a output using a Monoprice DisplayPort 1.2a to 4K at 60Hz HDMI Active HDR Adapter #33125. If DisplayPort 1.2a supports HDR, then DisplayPort 1.4 does too.

    However, the review of the Philips 328P6AUBREB that I read says that particular monitor only supports HDR10 when using its HDMI connection.

    [Edit] After further investigation, it appears that DisplayPort 1.4 is the first DP version able to carry the HDR10 metadata defined in CTA-861.3 (including static and dynamic metadata and the Rec. 2020 color space) for HDMI interoperability.

    However, my computer's DP 1.2a output can supply HDR10 metadata because Intel added that capability to its implementation of DP 1.2a for Kaby Lake and Coffee Lake Core i processors. This change allowed the HDMI 2.0a connections available on some motherboards for these processor families to work correctly, because those HDMI 2.0a connections are provided via an LSPCON on the DP path.
    Last edited by usually_quiet; 15th Aug 2020 at 16:29. Reason: removed unintentional link to software
  3. Member Yanta
    Thanks for the reply.

    I have an MSI Creator X299 motherboard with a 10940X CPU. The Graphics card is the MSI RTX 2080 Ti Gaming X.

    Ok, based on the review that would explain why HDR will only enable when I select the monitor that is connected via HDMI. The fact that DP 1.4 might support it is irrelevant to me because of limitations built into the monitor. Have I got that right?

    So I have to get rid of DisplayPort and only use HDMI? I can live with that. I'm getting sick of DP moving my open windows to one screen after the monitors go to sleep, and I don't use the PS3 that much anyway.

    But back to image quality: even on the HDMI-connected monitor, with HDR turned on or off in Windows, the image looks the same - dull, washed out, faded bluish-grey.

    A lot of information, some of which I must admit went over my head. For example, I don't know what an LSPCON is, but I shall now go and research it.
  4. Member usually_quiet
    Originally Posted by Yanta
    Ok, based on the review that would explain why HDR will only enable when I select the monitor that is connected via HDMI. The fact that DP 1.4 might support it is irrelevant to me because of limitations built into the monitor. Have I got that right?

    So I have to get rid of DisplayPort and only use HDMI?...
    Yes, that's right. The limitation is built into your monitors. Using their HDMI 2.0 connection is the only way to get HDR10. If you only had DP 1.4 available from your graphics card you could buy an active DisplayPort 1.4 to HDMI 2.0b HDR adapter to get HDMI. An example: https://www.amazon.com.au/CAC-1080-DisplayPort-Adapter-displays-4096x2160/dp/B077JB28KM/

    Originally Posted by Yanta
    But back to image quality: even on the HDMI-connected monitor, with HDR turned on or off in Windows, the image looks the same - dull, washed out, faded bluish-grey.
    I can't recall the exact procedure I used to get HDR working. I will have to try to figure that out in the morning, USA time not Oz time.

    Originally Posted by Yanta
    For example, I don't know what an LSPCON is, but I shall now go and research it.
    LSPCON = level shifter and protocol converter chip. An active DP to HDMI adapter acts like an external LSPCON.
  5. Member Yanta
    Thanks again.

    I tracked down a whitepaper on HDR from Intel. It's a bit dated and it refers to the iGPU, but it explains LSPCON...
    https://www.intel.com/content/dam/support/us/en/documents/graphics/HDR_Intel_Graphics_...WhitePaper.pdf

    My graphics card only has one HDMI 2.0b output. Looks like I'll have to buy an adapter.
    https://www.msi.com/Graphics-card/GeForce-RTX-2080-Ti-GAMING-X-TRIO/Specification

    Just placed an order for 2 adapters. Sheesh, Amazon must not want to sell to people. I had to enter my password, then a captcha, then a one-time password. Oh well, all done. Amazingly, Amazon were the cheapest for this item. Normally Amazon AU is much, much more expensive than everyone else. I should receive the adapters in a month or so.

    In the meantime, if you can recall how you got it working, I have one monitor I can play around with, so I'd be really appreciative of any guidance.
  6. With regard to software config for HDR pass-through:
    the video player used (try VLC?), the Windows HDR setting, and GPU driver issues all come into play.
  7. Member Yanta
    Originally Posted by butterw
    With regard to software config for HDR pass-through:
    the video player used (try VLC?), the Windows HDR setting, and GPU driver issues all come into play.
    I don't use VLC. It's pretty bad. I've had more problems with VLC than I've had hot dinners (codecs, freezing, stuttering, Blu-ray playback, insanely long periods between updates and fixes, audio lag, etc.). I've had no issues since I switched to PotPlayer, except with 3D. But VLC couldn't do 3D either (the last time I tried, which was a few years ago).

    Anyway, I get that all of those "come into play". Do you have some tips on how to configure things so it works?

    GPU drivers are the latest. Other than the HDR settings in the Settings app, are there any other tweaks you can suggest that might help?
  8. VLC is not my preferred player either, but recent versions do work a lot better than you seem to believe.
    You can install multiple video players without issues, in particular if you go for portable versions.

    Maybe this is relevant?
    https://forum.videohelp.com/threads/397040-PotPlayer-Playing-HDR-on-TV-from-PC-via-HDMI


    If there is a good guide on PC HDR pass-through, someone will post a link to it.
  9. Member usually_quiet
    First of all, my monitor and Intel 630 graphics can only work with HDR10. Other kinds of HDR don't work at all with my system. Also, my Dell monitor isn't a real 4K 10-bit display or even IPS. It is a 1920x1080 8-bit panel (actually 6 bits + FRC dithering) which can accept 2160p video at 59.94 fps, downscale it to 1920x1080, and perform tone mapping internally to approximate real HDR. It isn't an ideal setup, but it does let me watch commercial UHD Blu-ray discs using PowerDVD Ultra 18.

    I tried to figure out what I did to enable HDR and this is all I can come up with.

    My monitor has a Movie HDR profile, which I selected to put the monitor in UHD HDR mode. Windows 10's Settings->Display->Resolution is set to 3840x2160. In Windows 10's Settings->Display->Windows HD Color settings, I have "Play HDR Games and apps" turned on and I have "Stream HDR Video" turned on. I have PotPlayer 200730(1.7.21280) installed, which I set up to allow the use of hardware decoding and encoding.

    I right-clicked "The World In HDR 4K Demo.mkv" (downloaded from https://4kmedia.org/the-world-in-hdr-uhd-4k-demo/) and then selected Play With->PotPlayer. PotPlayer played the video with HDR applied. However, I noticed there is an "HDR" tag near the elapsed time display at the bottom left corner of the screen which was highlighted. There is also a "H/W" tag beside it which turns the Built-in DXVA video decoder on or off when clicked. (When the tags are on, they are highlighted.)

    [Edit] I just tried VLC to see if HDR works and it does. However, HDR color in PotPlayer is eye-searing. HDR color in VLC is more natural, but still quite colorful. I wonder if one or both of these players is doing its own simulated HDR tone-mapping. I will try to find out. FWIW "The World In HDR 4K Demo.mkv" looked the same with PowerDVD Ultra and VLC, so PotPlayer's settings produce the problem I'm seeing.

    I just remembered an old thread that I posted in: https://forum.videohelp.com/threads/393534-Can-someone-guide-how-to-set-video-player-f...x265-HDR-films. There is another setting in PotPlayer that is needed for HDR: Preferences > Video > Surface format > 10-bit; also tick 10-bit output. Even so, color in PotPlayer is still eye-searing (too much red). I guess I need to try adjusting the color controls.
    Last edited by usually_quiet; 16th Aug 2020 at 22:46. Reason: grammar
  10. Member Yanta
    I haven't read all of this yet, just the first thing I saw... Intel 630... HDR can't be viewed off a discrete video card?
    That's going to be problematic. My motherboards are almost entirely HEDT. As in no IGP.

    I'll go through the rest shortly.
  11. Member usually_quiet
    Originally Posted by Yanta
    I haven't read all of this yet, just the first thing I saw... Intel 630... HDR can't be viewed off a discrete video card?
    That's going to be problematic. My motherboards are almost entirely HEDT. As in no IGP.
    As far as I know, you can use a discrete video card for viewing UHD/HDR video files or UHD/HDR video from streaming services. You don't need a Kaby Lake (or later) Intel Core i processor's iGPU unless, like me, you want to watch a commercially produced UHD Blu-ray disc using your computer.
    Last edited by usually_quiet; 16th Aug 2020 at 22:36.
  12. Member Yanta
    Yay, the cables finally arrived. Haven't tried them yet.

    To recap, these are the 3 devices I'm trying to get HDR working on...
    Note I do not have a hardware Blu-ray player. Only 1 PC of the 10 here has an optical drive, which I use for ripping my discs.

    #1. X299 PC (no iGPU), 32GB 3600 16-16-16 RAM, 10940X CPU @ 4.4GHz, RTX 2080 Ti with an HDMI 2.0b cable and a DP 1.4 cable. Monitors are both 32" Philips 328P6AUBREB. HDR movies look washed out, per the website linked below. With HDR turned on in Windows the movies are all but black and white.
    https://www.maxvergelli.com/how-to-convert-hdr10-videos-to-sdr-for-non-hdr-devices/

    #2. LG 55" 3D 4K TV 55UH950T with PC, GTX 1660 Ti, HDMI 2.0b cable, 16GB 3200 C16-18-18 RAM, X299 i7-7820K CPU @ 4.2GHz, no iGPU. Exactly the same symptoms as #1.

    #3. Samsung Series 7 UA65MU7000 4K TV with PC, GTX 1660 Ti with HDMI 2.0b cable, i5-8600K CPU on a Z370 motherboard, 16GB 3200 C16-18-18 RAM. Same symptoms as #1.

    Note on conversion to SDR: while I was waiting on cables I did a few tests with converting to SDR. FFmpeg, as discussed on the linked website, took 3 hours of re-encode time per 1 hour of content on PC #1. DVDFab took 45 minutes per 1 hour of content for the same files on the same PC.
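
    For anyone following along, the sort of zscale/tonemap filter chain that page describes looks roughly like this (just a sketch, assuming an ffmpeg build with the zimg library and libx265; the CRF/preset values and file names are placeholders rather than exactly what I used):

    Code:
    # HDR10 (BT.2020/PQ) -> SDR (BT.709), tone-mapped with the "hable" curve
    ffmpeg -i "input HDR10.mkv" -vf "zscale=t=linear:npl=100,format=gbrpf32le,zscale=p=bt709,tonemap=tonemap=hable:desat=0,zscale=t=bt709:m=bt709:r=tv,format=yuv420p" -c:v libx265 -crf 20 -preset medium -c:a copy "output SDR.mkv"
    # format=yuv420p gives 8-bit SDR output; use format=yuv420p10le instead to keep 10-bit, if the build supports it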

    As I have the only optical drives, I rip all my content and store it on my server, where the rest of the family can view it. All LAN infrastructure is 10G (cables, switches and most NICs; a couple of PCs only have 2.5G). All X299 systems have either native 10G onboard or a 10G NIC installed. So I don't think bandwidth is an issue.

    With PC #1 - is my monitor the issue? With 8 bits+FRC? Do I need native 10 bit?
    Do I need something like the 328P6VUBREB which has 600 nits brightness? https://www.philips.com.sg/c-p/328P6VUBREB_69/brilliance-lcd-monitor-with-usb-c-dock. Would rather not have to buy one as they are A$1000

    Is PotPlayer the cause of my woes? I've enabled 10-bit and 10-bit output, but that didn't make the slightest difference. (I can't download the sample file, sorry, it's been blocked: "too many downloads".) I dumped VLC because of quality and stuttering problems about 5 years ago.

    Also had a read of https://wiki.mikejung.biz/PotPlayer. I don't have any CUDA options in the Video Decoder section.
  13. Member usually_quiet
    Originally Posted by Yanta
    To recap, these are the 3 devices I'm trying to get HDR working on...
    Note I do not have a hardware Blu-ray player. Only 1 PC of the 10 here has an optical drive, which I use for ripping my discs.

    #1. X299 PC (no iGPU), 32GB 3600 16-16-16 RAM, 10940X CPU @ 4.4GHz, RTX 2080 Ti with an HDMI 2.0b cable and a DP 1.4 cable. Monitors are both 32" Philips 328P6AUBREB. HDR movies look washed out, per the website linked below. With HDR turned on in Windows the movies are all but black and white.
    https://www.maxvergelli.com/how-to-convert-hdr10-videos-to-sdr-for-non-hdr-devices/

    #2. LG 55" 3D 4K TV 55UH950T with PC, GTX 1660 Ti, HDMI 2.0b cable, 16GB 3200 C16-18-18 RAM, X299 i7-7820K CPU @ 4.2GHz, no iGPU. Exactly the same symptoms as #1.

    #3. Samsung Series 7 UA65MU7000 4K TV with PC, GTX 1660 Ti with HDMI 2.0b cable, i5-8600K CPU on a Z370 motherboard, 16GB 3200 C16-18-18 RAM. Same symptoms as #1.
    Originally Posted by Yanta
    With PC #1 - is my monitor the issue? With 8 bits+FRC? Do I need native 10 bit?
    Do I need something like the 328P6VUBREB which has 600 nits brightness? https://www.philips.com.sg/c-p/328P6VUBREB_69/brilliance-lcd-monitor-with-usb-c-dock. Would rather not have to buy one as they are A$1000
    The 328P6AUBREB only supports HDR10, but other than that, as long as you are using the monitor's HDMI 2.0 connection, the resolution is set to 3,840 by 2,160, and the monitor is set to its HDR mode, the issue should not be with the monitor itself. Your monitor may only have an 8-bit+FRC panel, but it uses an internal 12-bit LUT to take care of tone-mapping HDR10.

    However, according to a different review that I found, you may even be able to use the DisplayPort connection if you have a newer version of Windows 10. That other review says "The Brilliance 328P6AUBREB supports HDR operation on the HDMI and the DisplayPort inputs, and is compatible with input signals in the HDR-10 format. There is an option in the menu to turn HDR on or off. Note that, for versions of Windows 10 earlier than V1013, HDR is only supported for the HDMI input and then only for a resolution of 3,840 by 2,160."

    I don't have a 4K HDR TV yet, so I don't know how PC settings for those differ from PC settings for UHD HDR monitors, or what settings to use on the TV itself.

    Originally Posted by Yanta
    Is PotPlayer the cause of my woes? I've enabled 10-bit and 10-bit output, but that didn't make the slightest difference. (I can't download the sample file, sorry, it's been blocked: "too many downloads".) I dumped VLC because of quality and stuttering problems about 5 years ago.

    Also had a read of https://wiki.mikejung.biz/PotPlayer. I don't have any CUDA options in the Video Decoder section.
    I've dumped PotPlayer because of the frustrations I had working with it for HDR and it has been several years since I owned a PC with a discrete video card installed. I'm afraid that I'm not going to be much help to you with this.

    Originally Posted by Yanta
    Note on conversion to SDR: while I was waiting on cables I did a few tests with converting to SDR. FFmpeg, as discussed on the linked website, took 3 hours of re-encode time per 1 hour of content on PC #1. DVDFab took 45 minutes per 1 hour of content for the same files on the same PC.
    I haven't done any SDR conversions so I'm not going to be any help there either.
  14. mpv and MPC-BE are alternatives to VLC; they can also do HDR-to-SDR tonemapping.
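
    For example, something like this with mpv (just a sketch, assuming a reasonably recent mpv build; the file is the 4kmedia.org sample mentioned earlier in the thread, and hable is only one of the available tone-mapping curves):

    Code:
    # play HDR10 content on an SDR display, tone-mapped on the GPU
    mpv --vo=gpu --tone-mapping=hable "The World In HDR 4K Demo.mkv"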

    Part of the issue with HDR pass-through is that some of the devs of open-source video players don't have HDR monitors yet.
    There also seem to be some issues with Nvidia drivers for madVR users (I would suggest checking the doom9 thread; they tend to care about HDR).

    HDR10 to SDR can of course be done at encoding time. Do make sure your ffmpeg build supports 10-bit x265, otherwise you will get 8-bit output, which is undesirable. Your encode time differences are presumably due to codec/encoding setting differences; check the output with MediaInfo.
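
    A quick way to check both, assuming ffmpeg and ffprobe are on your PATH (MediaInfo shows the same fields; the output file name is just a placeholder):

    Code:
    # does this ffmpeg build have 10-bit x265? look for yuv420p10le among the supported pixel formats
    ffmpeg -hide_banner -h encoder=libx265

    # inspect what an encode actually produced: codec, pixel format (bit depth) and transfer characteristics
    ffprobe -v error -select_streams v:0 -show_entries stream=codec_name,pix_fmt,color_transfer -of default=noprint_wrappers=1 "output SDR.mkv"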

    An 8-bit+FRC monitor should be fine. At least 600 nits of brightness would be recommended for real HDR, but this is likely unrelated to your issue.
  15. Member Yanta
    Originally Posted by usually_quiet
    Note that, for versions of Windows 10 earlier than V1013, HDR is only supported for the HDMI input and then only for a resolution of 3,840 by 2,160."
    What is Windows version 1013? I'm on 1809 LTSC 17763.1397

    I found a post here (was it one of yours?) that talked about setting options in Video > Pixel shader.

    Enabling SMPTE 2084 gives me what looks like normal color on HDR content. At least I can't tell the difference between an HDR version with SMPTE 2084 enabled and a non-HDR version.

    The second option, SMPTE 2086, adds more color but the red is way too intense.

    My TVs don't have DisplayPort connections, so are the GTX 1660 Tis good enough using HDMI?

    Did I just find a workaround? Or have I done something I shouldn't have? I gather all I'm doing here is "fudging it" rather than really solving the problem, but the upside is that even though everyone here has dual 32" 2K monitors, setting this option "restores" the color for everyone.

    @butterw I'd rather not spend months converting content to SDR. If I can rip it to disk and have it look natural and not washed out, I'm happy. I don't have madVR installed. I did play around with it trying to get 3D working (original source, not SBS or TAB), but gave up. Which is a shame; I have about 80 3D movies.
    Last edited by Yanta; 21st Sep 2020 at 04:14.
  16. Member usually_quiet
    Originally Posted by Yanta
    Originally Posted by usually_quiet
    Note that, for versions of Windows 10 earlier than V1013, HDR is only supported for the HDMI input and then only for a resolution of 3,840 by 2,160."
    What is Windows version 1013? I'm on 1809 LTSC 17763.1397
    I cut that quote from the article I linked to and pasted it into my post without looking up V1013. I trusted that the author of the article was referencing a real build, but the search engine I just used to find out about it doesn't know what it is. The closest thing it found was a Windows Insider Windows 10 build, 10130. Maybe it is a typo.

    Originally Posted by Yanta
    I found a post here (was it one of yours?) that talked about setting options in Video > Pixel shader.

    Enabling SMPTE 2084 gives me what looks like normal color on HDR content. At least I can't tell the difference between an HDR version with SMPTE 2084 enabled and a non-HDR version.

    The second option, SMPTE 2086, adds more color but the red is way too intense.
    Yes. I did a search and found I did suggest that in this post in another thread, but I got that advice from someone else's post that I haven't found yet. Now I know why I might have had too much red in HDR video when I used PotPlayer. Maybe I can fix it now.

    Originally Posted by Yanta
    My TVs don't have DisplayPort connections, so are the GTX 1660 Tis good enough using HDMI?
    The GTX 1660 Ti has HDMI 2.0b in addition to DisplayPort 1.4a so it should be OK.

    Originally Posted by Yanta
    Did I just find a workaround? Or have I done something I shouldn't have? I gather all I'm doing here is "fudging it" rather than really solving the problem, but the upside is that even though everyone here has dual 32" 2K monitors, setting this option "restores" the color for everyone.
    So what if you are fudging it for now? Perfection can be achieved later, when the dust has settled.
    Last edited by usually_quiet; 21st Sep 2020 at 22:43.


