I have an LG 38GL950G-B monitor which has 2 modes:
120Hz + HDR + 10-bit (8-bit+FRC)
160Hz + HDR + 8-bit
I mainly use the second one and I don't like changing it every time in the Nvidia Control Panel.
So here is my question:
If I use madVR and set in its settings that my monitor is 10-bit, but I actually run it in 8-bit mode - will that degrade image quality on 8-bit video material?
For 10-bit material I will set everything to 10-bit, but I wonder if this setting can be left alone in madVR, or whether it will degrade the quality of 8-bit SDR material when my monitor is in 8-bit mode.