I'm a bit puzzled by the different GPU temperature readouts I'm seeing in SpeedFan while playing SD H.264/AAC MP4s in PotPlayer.
Setup: Win 7 x64, i5 4440, Gigabyte H97-D3H, Zotac GT 240 - all at stock speeds;
NVIDIA driver 340.52, PotPlayer 32-bit 1.6.52515, auto-selected renderer, so Custom EVR (D3D9 Ex Copy).
Scenario 1: Video decoder is built-in FFmpeg, DXVA on, VLD for H264, copy-back mode is Auto. Window size is Video size 1x.
Observed: CPU usage for PotPlayer hovers between 3 and 4%. GPU usage hovers between 13 and 14%. GPU temp is a steady 43C.
Scenario 2: Video decoder is CUDA. Everything else is the same, except the VLD parameter is greyed out.
Observed: CPU usage for PotPlayer hovers between 2 and 5%. GPU usage hovers between 6 and 7%. GPU temp climbs and steadies at 52C!!
Isn't the CUDA hardware part of the GPU? If so, and the overall GPU load is roughly half, why is the temp in Scenario 2 nearly 10C higher?
-
There could be many reasons.
The fact is that different instructions generate different amounts of heat.
Take, for example, the notorious AVX2 instructions in Intel CPUs, which run hot enough that the CPU lowers its clock speed while executing them.
-
How are you measuring GPU usage? Hardware decoding is handled by a separate part of the GPU (the video engine) as far as I know.
I've tried various combinations of MPC-HC, MPC-BE, PotPlayer, DXVA and CUVID, and so far none of it has made much difference. GPU-Z usually reports the same thing for the same video, although for some reason MPC-HC with its internal CUVID decoder shows about 5% more GPU load than MPC-HC with ffdshow's DXVA decoding (for the 720p video I was playing with), while the "video engine load" and temperature stay about the same. I don't really know why the "GPU load" increases, but for me that happened with MPC-HC and not with PotPlayer when comparing DXVA and CUVID.
If the video is running at a lower resolution, the GPU load will drop compared to running fullscreen, but other than that I can't really explain your result. My old GPU seems to sit between 55 and 60 degrees regardless of what it's doing.
Give GPU-Z a spin to see if it reveals anything new.
Last edited by hello_hello; 21st Feb 2015 at 19:40. Reason: spelling
-
It does. Although the PotPlayer GPU usage and SpeedFan GPU temp readings match GPU-Z, the latter shows three other attributes which, I believe, explain the behaviour here. During CPU decoding, the GPU Core, Memory and Shader clocks are 405, 324 and 810 MHz respectively, whereas during CUDA decoding, they are 549, 789 and 1339 MHz. I think that about covers it.
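To put rough numbers on it: dynamic power scales approximately as P ≈ C·V²·f, so forcing the GPU into a higher clock state (which usually comes with a voltage bump) raises heat output even while the utilization percentage is lower. A quick sketch using the clocks GPU-Z reported; the voltage figures and the capacitance constant are invented for illustration, not real GT 240 specs:

```python
# Back-of-the-envelope check: dynamic power ~ C * V^2 * f.
# The clocks are the GPU-Z core readings from above; the voltages
# (0.9 V and 1.05 V) are assumed illustrative values, not measured.

def dynamic_power(freq_mhz, volts, cap=1.0):
    """Relative dynamic power, C * V^2 * f, in arbitrary units."""
    return cap * volts ** 2 * freq_mhz

# Scenario 1: DXVA decode, core stays in its low 405 MHz state.
p_low_state = dynamic_power(405, 0.9)

# Scenario 2: CUDA decode kicks the core up to 549 MHz.
p_high_state = dynamic_power(549, 1.05)

print(f"low state:  {p_low_state:.1f}")
print(f"high state: {p_high_state:.1f}")
print(f"ratio:      {p_high_state / p_low_state:.2f}x")
```

Under these assumptions, the higher clock state nearly doubles the baseline dynamic power of the chip regardless of how busy it is, which is consistent with a steady-state temperature several degrees higher even at half the reported utilization.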
Thanks.

Last edited by Gyan; 16th Feb 2015 at 04:58.