VideoHelp Forum
  1. Hi everyone, I don't know a huge amount about video tech and playback, so please explain like I'm 5.
    I can see there are a lot of posts regarding issues with 4k/UHD playback already, so I've had a good read and spent a few nights Googling to find an answer, but nothing has helped so far.

    My issue is: every 2160p Blu-ray rip, whether an 80GB HEVC remux or a 10GB compressed encode, looks the same, and that quality isn't great.
    I've played two different copies of the same movie side by side: one copy 1080p in .mkv (5GB size, 5,000kbps bitrate), the other a 2160p HEVC remux in .mkv (80GB+ size, 65,000kbps), yet both look pretty much the same. A 1080p YouTube video looks just as sharp and clear. (Quick sanity check on the numbers below.)
    I've had this issue for a while now, so over recent months I've tried a few suggestions: different settings to tweak, VLC and MPC-HC, Windows 7 and Windows 10 (both x64), .mkv and other formats. They still don't look that good, and they certainly don't look as good as an actual 1080p Blu-ray disc playing.
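
    Back-of-envelope on those file sizes (a rough sketch only: it ignores audio tracks and container overhead, and the ~135-minute runtime is a guess):

    Code:
    # Approximate video-only file size from bitrate and runtime.
    def size_gb(bitrate_kbps: float, runtime_min: float) -> float:
        return bitrate_kbps * 1000 / 8 * runtime_min * 60 / 1e9

    print(f"{size_gb(5_000, 135):.1f} GB")   # ~5.1 GB  -> matches the 1080p copy
    print(f"{size_gb(65_000, 135):.1f} GB")  # ~65.8 GB -> roughly the remux
    # The remux really carries ~13x the video data, so identical on-screen
    # quality points at playback, not at the files themselves.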

    This is a screenshot of a movie (2160p HEVC remux BD rip, 75GB file size, 65,000kbps bitrate): https://forum.videohelp.com/images/imgfiles/Tsrxdcd.jpg
    I had another copy of the same movie in much lower quality, as my son's laptop is a bit old now and struggles with high bitrates. His copy was around 8GB, 1080p, 7,000kbps, and I could barely tell the difference between them. I don't have a screenshot as he watched it and deleted it.
    During playback I've noticed that in both VLC and MPC-HC my CPU is pretty much pinned at 100% usage, while my GPUs are unused: literally 0-3% usage, sitting at 200MHz idle clock speeds.

    System specs:
    i7 3770k @5GHz
    32GB RAM
    GTX 980 Tis (2-way SLI) - I've tried disabling one in case it was an SLI issue, but the GPUs still show 0% load
    Windows 10 (also tried Win7) x64
    Everything on SSDs, no internal HDDs
    VLC and MPC-HC both at default settings.
    All drivers and players up to date, and no 3rd-party codec packs installed

    As I said, my knowledge isn't great, but shouldn't the players be using my GPUs mostly, instead of the CPU? Could this be the reason, or am I way off?
    Any help or advice would be greatly appreciated.
    Thank you
    Last edited by Deep Euphoria; 25th Jan 2018 at 11:04.
  2. Bernix (Member since Apr 2016, Europe)
    Just a quick question: are you using CUVID, or which video decoder are you using?
    https://wiki.mikejung.biz/MPC-HC_Video_Decoder_Comparison
    Here is a comparison of decoders for MPC-HC; if you want to utilize your GPU, I suggest you try the CUVID decoder.

    Bernix
    Last edited by Bernix; 25th Jan 2018 at 11:07. Reason: typo
  3. usually_quiet (Member since Aug 2006, United States)
    4K HEVC is very common. To set up MPC-HC to use CUVID for HEVC decoding:
    Go to Options > Internal Filters > Internal LAV Filters Settings > Video Decoder.
    For "Hardware Decoder to Use", select NVIDIA CUVID.
    Under "Codecs for Hardware", make sure the HEVC box is checked.

    It appears VLC can't decode HEVC unless libde265 is installed, but I haven't done that. I don't think libde265 uses the GPU for decoding.
    Last edited by usually_quiet; 25th Jan 2018 at 12:23.
  4. Thanks for the fast responses.

    @Bernix and @usually_quiet - I wasn't using CUVID; I'd previously reinstalled MPC-HC, so the decoder was set to None. I've now changed it to CUVID and it looks a little better, but my CPU still maxes out at 100% (or close to it) with 0% GPU usage.

    Should the GPUs be seeing some utilisation now with this change? Also, sorry if this is a newbie question: should a UHD remux, completely untouched, just ripped, look just as good as playing it directly from a UHD Blu-ray disc? Or is there always some quality loss?
  5. davexnet (Member since Mar 2008, United States)
    See this thread; it talks about the GPU revision being a factor:
    https://hardforum.com/threads/4k-hevc-playback.1924849/
    Also
    https://forums.adobe.com/thread/2018024
  6. Bernix (Member since Apr 2016, Europe)
    I have also read that only a dedicated part of your GPU is used for decoding, not the whole GPU, so the usage figure can be misleading.
    If you notice some improvement, then some change did take effect. You can also try PotPlayer and set it up there, but also check Task Manager; it should show you how the "GPU" is being utilized. (See the monitoring sketch below.)
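
    If you want to watch that dedicated decoder separately, NVML reports it alongside the normal GPU load. A rough sketch using the Python NVML bindings (pip install nvidia-ml-py), to be run while the video is playing; the exact numbers matter less than whether the decoder moves at all:

    Code:
    import time
    import pynvml

    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
    try:
        for _ in range(10):  # sample once a second for ~10 seconds
            core = pynvml.nvmlDeviceGetUtilizationRates(handle)
            dec, _period = pynvml.nvmlDeviceGetDecoderUtilization(handle)
            print(f"3D/core: {core.gpu:3d}%   video decoder: {dec:3d}%")
            time.sleep(1)
    finally:
        pynvml.nvmlShutdown()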
    Bernix
  7. usually_quiet (Member since Aug 2006, United States)
    Originally Posted by Deep Euphoria View Post
    I've now changed it to CUVID and it looks a little better, but my CPU still maxes out at 100% (or close to it) with 0% GPU usage. ... Should the GPUs be seeing some utilisation now with this change?
    CPU usage points to the GPU not doing full hardware decoding. I have an i5 4570S with Intel HD Graphics 4600, which provides hybrid hardware/software decoding for 8-bit HEVC. 10-bit HEVC is decoded in software, and this CPU also struggles at 100% with 10-bit HEVC at UHD resolutions. On the other hand, CPU usage for UHD 8-bit H.264 video, which the iGPU can fully decode, never goes above 36%.

    UHD Blu-ray typically uses 10-bit HEVC. After I saw davexnet's post, I looked up the GTX 980 Ti, and it isn't able to fully decode 10-bit HEVC using only hardware because it's a GM200 card. GM206 is the revision that supports full fixed-function HEVC hardware decoding. So, mystery solved: you need a different NVIDIA card for full HEVC hardware decoding.
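
    To put what I found in one place (illustrative only, pieced together from Wikipedia's PureVideo table and the threads linked above; double-check against your exact card before buying anything):

    Code:
    # Rough map of Maxwell/Pascal chips to HEVC decode support (assumed from
    # the sources above, not authoritative).
    HEVC_DECODE = {
        "GM200 (GTX 980 Ti, Titan X)": "hybrid only - shaders assist, no fixed-function HEVC",
        "GM204 (GTX 970/980)":         "hybrid only - no fixed-function HEVC",
        "GM206 (GTX 950/960)":         "full fixed-function HEVC, 8-bit and 10-bit",
        "GP10x (GTX 10-series)":       "full fixed-function HEVC, 10-bit included",
    }
    for chip, support in HEVC_DECODE.items():
        print(f"{chip}: {support}")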
    Last edited by usually_quiet; 25th Jan 2018 at 14:06.
  8. Bernix (Member since Apr 2016, Europe)
    Hi usually_quiet,
    just one thing that I don't understand: https://en.wikipedia.org/wiki/GeForce_900_series
    The 980 Ti was released half a year later, yet it lacks features that the GM206 cards, released half a year earlier, already have. The older cards are named GM206 and the newer 980 Ti is just GM200, so the naming goes an unusual way: from 206, a weak card with HEVC support, down to 200, powerful cards with zero support. But apparently some change does happen even on the 980 Ti, as the OP reported an improvement after using CUVID with H.265, which according to the tables on the wiki should be nonsense - the video shouldn't play at all with that decoder.

    Bernix
  9. Haha, OK, now I'm lost. So my cards don't support 10-bit HEVC... but the video plays, so they must support it?

    Maybe I'm confusing things, but from reading the past few posts: is my CPU, with its onboard iGPU, doing the work instead of my GPUs, since they don't support 10-bit HEVC decoding? Hence the lower-than-normal image quality and a maxed-out CPU with 0% GPU usage. Is that what you're saying?
  10. usually_quiet (Member since Aug 2006, United States)
    Originally Posted by Deep Euphoria View Post
    Haha, OK, now I'm lost. So my cards don't support 10-bit HEVC... but the video plays, so they must support it?
    HEVC decoding is done by the CPU, in software, when the video card's GPU isn't able to decode it. The video still plays with a software decoder, but it stutters or pixelates because the i7 3770K can't decode quickly enough to keep up.
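
    One way to test this directly: have FFmpeg software-decode the file and watch the "speed=" figure it prints; anything below 1.0x means the CPU can't keep up in real time. A minimal sketch (assumes ffmpeg is on your PATH; "movie.mkv" is a placeholder):

    Code:
    import subprocess

    # Decode video only and discard the frames; ffmpeg's progress line
    # reports the achieved decode speed relative to real time.
    subprocess.run([
        "ffmpeg", "-hide_banner",
        "-i", "movie.mkv",
        "-an",             # skip audio
        "-f", "null", "-"  # no output file, decode only
    ])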

    Originally Posted by Deep Euphoria View Post
    Maybe I'm confusing things, but from reading the past few posts: is my CPU, with its onboard iGPU, doing the work instead of my GPUs, since they don't support 10-bit HEVC decoding? Hence the lower-than-normal image quality and a maxed-out CPU with 0% GPU usage. Is that what you're saying?
    No, I'm definitely not saying the iGPU is doing the decoding in your case. The i7 3770K's iGPU can't decode HEVC at all. If I remember correctly, the GTX 980 Ti can do hybrid decoding for 8-bit HEVC, like my Haswell processor's iGPU does, but it can't decode 10-bit HEVC. 10-bit HEVC decoding is done entirely by the CPU using a software decoder, and that's why the i7 3770K's CPU is at 100%.

    [Edit] My memory played a trick on me. I looked at Wikipedia's Nvidia PureVideo entry, and it says the GTX 980 Ti uses a combination of the PureVideo hardware and software running on the shader array (not the CPU) to decode HEVC (H.265), as partial/hybrid hardware video decoding. The entry doesn't list restrictions on the HEVC encoding parameters the hybrid decoding supports, or the maximum resolution allowed. This hybrid approach is not as good as full hardware decoding, so I still think that anyone who wants to decode 10-bit HEVC video from UHD Blu-ray rips needs a GPU that does full hardware-accelerated HEVC decoding.

    I found this thread on VLC's forum discussing HEVC hardware decoding and settings. There is no mention of libde265: https://forum.videolan.org/viewtopic.php?t=137743
    Last edited by usually_quiet; 25th Jan 2018 at 22:51.