Hey guys,
After a long time I decided to fork out and pay for a lovely new HD monitor.
I saw the results of HD movies on a friend's computer and wanted the same. After receiving the monitor I only used VGA and had crappy results. But today my new DVI-HDMI cable came in the post and I hooked up my monitor via HDMI for the first time.
I'm not too impressed, however - I played a BD rip of Lord of the Rings that I have:
Video: MPEG4 Video (H264) 1920x800 23.976fps 2102kbps
using Media Player Classic with the Mega K-Lite codec pack installed. I went on forums and decided to download the DivX H.264 codecs, but it looks the same.
What I see: the film looks blotchy - it's hard to describe, but during some scenes of motion the image can be a bit blocky. I was expecting crystal clear video from a movie with such a high bitrate!!!! The movie is totally watchable, but not as impressive as I would like.
What can I do? Any settings I can change?
Help!!!!!
-
A "Rip" is an exact copy, no matter how many people misuse the term.
What you have is a dreadful re-encode. No one who halfway knows what he's doing would use such a low bitrate.
What can you do? Play something decent to test your monitor.
Just telling you the truth. Good luck and welcome to the forum. -
OK, cheers, thanks.
Where can I get "something decent" to test my monitor?
If I do a screenshot of the "decent" video, will that help?
Cheers -
OK, I just went on to DEMO WORLD where you can get full HD 1080p demos. I downloaded a BBC one; here's a screenshot.
It MUST be some setting in my Media Player Classic - any ideas? -
OK, I get what you're saying!!
I just don't understand why I have seen better quality on another computer and worse quality on mine for the same vid!!!
Apart from bitrate - is there a way to maximise the visual quality of a monitor/graphics card??????
Thanks -
I assume your graphics card IS set for 1920x1080 output (you mentioned VGA before).
-
I only asked since it is so easy to overlook the obvious. Your old monitor probably had a different resolution, and when you plugged in the new one you might have expected a miracle. You certainly did by plugging a VGA cable into it.
Just because you have an HD card, an HDMI cable and an HD monitor doesn't mean they are all talking the same language.
So double-check the output settings on the card. Does the monitor require any drivers that you have not installed/set up? -
For full HD, 2102 kb/s is NOT "such a high bitrate". "Such a high bitrate" would be 15x to 20x that (30 Mb/s to 40 Mb/s).
I fully expect ANY H.264 video at 1920x1080 and 2 Mb/s to look crappy, blotchy, pixelated, etc. on ANY monitor connected to ANY VGA card with ANY HDMI or component cable. -
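To put some numbers on that, here is a rough bits-per-pixel comparison. The clip's figures come from the thread; the 30 Mb/s Blu-ray rate is a typical value used for illustration, not something measured from this disc:

```python
# Rough bits-per-pixel comparison between the re-encode in the thread
# and a typical Blu-ray video rate (the 30 Mb/s figure is an assumption).
def bits_per_pixel(bitrate_bps, width, height, fps):
    """Average bits available per pixel per frame."""
    return bitrate_bps / (width * height * fps)

# The re-encode: 1920x800 at 23.976 fps, 2102 kb/s (from the thread)
clip = bits_per_pixel(2_102_000, 1920, 800, 23.976)

# A typical Blu-ray: 1920x1080 at 23.976 fps, ~30 Mb/s
bluray = bits_per_pixel(30_000_000, 1920, 1080, 23.976)

print(f"re-encode: {clip:.3f} bpp, Blu-ray: {bluray:.3f} bpp")
# The Blu-ray has roughly ten times more bits per pixel to spend,
# which is why the re-encode falls apart in high-motion scenes.
```

With around 0.06 bits per pixel, the encoder has almost nothing to spend on complex motion, which matches the blockiness described above.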
Have you calibrated your new monitor at all or are you still using the default settings?
-