I am running Windows 7 Media Centre on a 32" LG HD TV.
I have been trying for a while now to get smooth HD playback, but to no avail.
I am using all the right codecs and drivers and all that jazz, and the playback does look good, but you can still tell it's not quite right; if you look very closely you can still see it doing little jumps.
CPU usage during playback is only 20-30%. I recently purchased an Nvidia GT240 graphics card, and the Nvidia system monitoring tools say it is only at 18-20% GPU usage.
The PC has a 2.8GHz E6300 dual-core CPU and 4GB of DDR3 memory.
Am I right in saying that this should have no problem playing back smooth full HD?
I'm convinced my problem still lies with the graphics card; the GT240 only has a core speed of 550MHz, which is low compared to other cards. Could this be my problem?
Should I be looking for a card with a very high GPU core speed, or should 550MHz be enough?
Any thoughts would be great.
Thanks for your quick reply.
It seems the same for Blu-ray discs and Blu-ray rips.
For the rips I'm using Handbrake to encode H.264. File sizes come out around 8-10GB.
I've got the coreAVC codec installed.
Don't get me wrong, the playback is good, it's just not perfect. Is it possible to get perfect playback from a PC?
Do higher quality GFX cards actually make any difference in this situation?
CoreAVC Pro or the regular version? You have to have the Pro version to get smooth playback of 1080p h.264.
Some things to keep in mind: 24 fps isn't very smooth under the best of circumstances. 24 fps to a 60 Hz monitor will display 3:2 frame repeat judder. Download the small 60 fps video in the following post and note the difference between 60, 30 and 24 fps:
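To illustrate the 3:2 judder (a sketch of my own, not from the thread): at 60 Hz, each 24 fps source frame is held for either 3 or 2 refreshes, so on-screen time alternates unevenly from frame to frame.

```python
# Which source frame a 60 Hz display shows on each refresh when the
# video is 24 fps. The repeat counts alternate 3,2,3,2,... -- that
# uneven cadence is the "3:2 frame repeat judder".

def pulldown_pattern(fps, hz, refreshes):
    """Source frame index shown at each display refresh."""
    return [int(i * fps / hz) for i in range(refreshes)]

print(pulldown_pattern(24, 60, 10))
# -> [0, 0, 0, 1, 1, 2, 2, 2, 3, 3]
# frame 0 held 3 refreshes, frame 1 held 2, frame 2 held 3, ...
```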
Last edited by jagabo; 14th Nov 2010 at 19:00.
What kind of connection are you using: HDMI, component or wireless?
Another thing to note: WMV codecs reduce the frame rate if the action/noise gets too high for the selected bitrate. For example, a 24 fps video will drop down to 12 fps during high action/noise sequences. This causes obvious jerkiness during those sequences.
Thanks for the info. Yes, that makes a lot of sense regarding the frame rates.
In answer to your questions:
Yes, I am using the Pro version of CoreAVC.
I am connecting via an HDMI cable.
It's interesting what you say about the frame rate dropping in high-action scenes; this is exactly the problem I am experiencing. It is at its worst when there is a lot of camera movement, or a lot of stuff moving on the screen.
But what is causing the frame rate to drop? Not enough hardware, or is it the encoding?
I'm going to try ripping a Blu-ray at 60 fps to see if that makes any improvement.
As far as I know, only WMV uses the reduced frame rate technique while encoding. If you are seeing it with h.264 I think you have a playback problem. It is the high action/noise scenes that require the highest bitrate and the most CPU/GPU for decoding. Have you tried CoreAVC set to both CPU and GPU decoding?
Last edited by jagabo; 15th Nov 2010 at 18:30.
Just an update on my progress.
Just to ease my curiosity, I borrowed a GTS 250 from a friend. Straight away I noticed that the image quality was a lot better, with more colour depth and a sharper image (which I was not expecting).
But true enough, as you guys have been saying, the jerkiness was still there.
I then tried re-encoding a video to 30fps, which seemed to make things worse.
I set my graphics card refresh rate to 50Hz instead of 60, which didn't make any difference.
But then I set the graphics card refresh rate to 24Hz (the same frame rate as the video), and as if by magic the problem was gone.
It's a bit annoying, as I'll now have to check the frame rate of every video before playing, but at least it's now nice and smooth.
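The fix makes sense arithmetically: judder shows up whenever the refresh rate isn't a whole-number multiple of the frame rate. A quick sketch (my own illustration, not from the thread):

```python
# Refreshes per video frame: a whole number means every frame is shown
# for the same length of time, so motion looks smooth.
def refreshes_per_frame(hz, fps):
    return hz / fps

print(refreshes_per_frame(60, 24))  # 2.5 -> uneven 3:2 repeats, judder
print(refreshes_per_frame(50, 24))  # ~2.08 -> still uneven
print(refreshes_per_frame(24, 24))  # 1.0 -> every frame shown once, smooth
```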
My TV is 100Hz and I'm in the UK. I've just been reading up on 5:5 pulldown, very interesting stuff; I will have a play with some of the TV settings to see what happens.
For example, 24 fps material output at 60 fps would use 3:2 frame repeats. The TV would then have to recognize the repeat pattern to understand that the original material was 24 fps, throw out the repeats, then repeat the remaining 24 fps frames with 4:4 frame repeats or motion interpolation. Rather than doing that it was probably throwing away 1 frame out of every 6 to reduce the frame rate from 60 fps to 50 fps, then displaying each of the 50 fps frames twice.
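The conversion chain described above can be sketched in a few lines (an illustration of my own, with hypothetical function names; it assumes the TV simply discards every 6th frame rather than detecting the pulldown cadence):

```python
# 24 fps material -> 60 fps output with 3:2 repeats -> 50 fps by
# dropping 1 frame in 6 -> 100 Hz by showing each frame twice.

def three_two_pulldown(n_frames):
    """Expand n_frames of 24 fps video to 60 fps with 3:2 repeats."""
    out = []
    for i in range(n_frames):
        out += [i] * (3 if i % 2 == 0 else 2)
    return out

def drop_one_in_six(frames):
    """Crude 60 -> 50 fps conversion: discard every 6th frame."""
    return [f for i, f in enumerate(frames) if i % 6 != 5]

sixty = three_two_pulldown(4)     # [0, 0, 0, 1, 1, 2, 2, 2, 3, 3]
fifty = drop_one_in_six(sixty)    # one repeat of frame 2 is lost
hundred = [f for f in fifty for _ in range(2)]  # each shown twice at 100 Hz

print(fifty)
# -> [0, 0, 0, 1, 1, 2, 2, 3, 3]
# repeat counts 3,2,2,2: still uneven, so the judder survives the chain
```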
Last edited by jagabo; 21st Nov 2010 at 06:54.