Hello folks! Just registered to this board because I went searching for "DVI to HDMI quality loss" and I found an old post from 5 years ago here. Since technology is about four times better than back in 2004, I thought I'd re-ask this question.
I just bought a brand new computer with a 24" Samsung monitor that is glorious. Driving up to San Francisco to see a friend, I couldn't tell if I was still in the Grand Theft Auto universe or not. But when I plugged my box into my friend's 42" Philips TV through the HDMI adapter, the quality freaking BLOWS!!!
It says that it's displaying 1920x1080, but it's of such atrocious quality that we're not going to have any fun watching Blu-Rays or playing Grand Theft this weekend. I should have brought the trusty Samsung.
What do I do? I don't suppose this is a driver issue, but every old post I found regarding this topic said that there would be NO quality loss. The HDMI standard is several years newer and is capable of much greater throughputs, but I honestly thought the picture would still look sweet with the adapter that came with the Radeon 4890.
Any feedback, or is this just par for the course? I am VERY glad I purchased a computer monitor and NOT a TV for this computer.
Bonus question: WHY do modern video cards still utilize DVI?! I'm scratching my head why they include two DVIs and no HDMI. Oh well, I'll upgrade again in 2012, just in time for the end of the world, baby!
Thank you very much in advance. Happy Layba Day.
--Bilbob42b / yahoo
-
Well bilbob..
I'm fairly certain your "quality loss" is not the "Latest Video News" 8) -
Oh blah! Go ahead and delete the thread then. I think this IS video news because I just bought the latest & greatest and I'm telling you my experience with it. I can also tell you the hell I went through with PowerDVD, WinDVD and Total Theatre (hint, get the ArcSoft program), but whatever.
Bottom line: I was recommended the Samsung monitor many times over, and I'm glad I went with it. I tweaked some settings and increased the picture quality on the Philips, but the plain truth is that the Samsung Syncmasters are superior monitors. The difference IS noticeable!
As I said on another board devoted to Windows 7, I sincerely wish that people would post plain-English guides that get rid of all the BS. When I searched for information on the HDMI adapter, I found posts that were all from 2004. That's ancient history in the world of computers. My new computer has a Passmark score four times higher than my computer from 2004.
Anyway, when it comes to HD video, there are a LOT of options. You're best off going with ArcSoft and Samsung on the computer. Amazon has a BD-ROM for a whopping $80 these days. And that's all the news fit to print.
MODERATORS: DELETE THREAD IMMEDIATELY, PLEASE!!! -
I'm gonna venture an educated guess. I believe it all comes down to resolution and scaling/conversion.
First, the cable would make NO difference, assuming it's a digital connection--which it is, DVI or HDMI. They're just pipes that should deliver the exact same bits at the output as went in (and even if the odd bit got flipped along the way, you'd never notice it).
Second, what's the NATIVE resolution(s) of your Samsung, and what's the native rez of the Philips? Because if they DON'T MATCH the outputted resolution of the PC's video card, the monitor is going to have to up/down-scale it to the resolution of the screen. And scaling often loses quality (UP much more than DOWN). And there are many degrees of quality to scaling algorithms.
Maybe the Samsung's native rez matches the output rez of the card--NO scaling at all! But maybe that's not the native rez of the Philips--and it has to upscale. Well, there you go. Big diff in quality.
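If you want to see what a pass through the scaler actually costs, here's a rough Python sketch (Pillow is assumed to be installed, and "screenshot.png" is just a placeholder for any 1080p grab you have lying around):

```python
# Shrink a 1080p screenshot to a different panel size and blow it back up,
# then measure how far the result drifts from the original.
from PIL import Image, ImageChops

original = Image.open("screenshot.png").convert("RGB")   # any 1920x1080 image

# Pretend the display has to rescale our signal to some other resolution.
rescaled = original.resize((1366, 768), Image.BILINEAR)

# Bring it back to 1080p so we can compare pixel-for-pixel.
round_trip = rescaled.resize(original.size, Image.BILINEAR)

diff = ImageChops.difference(original, round_trip)
print("worst per-channel error:", max(hi for lo, hi in diff.getextrema()))
```

Detail the scaler throws away on the way down never comes back on the way up--which is exactly what happens to small text and sharp edges.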
Also, since you didn't give full model #s it's not easy to compare, but it sounds like your Samsung (being a "Sync"-master) is a multi-rez CRT, so it's much more likely to match the vid card. And I can guess that the Philips TV is NOT a CRT (most aren't anymore), so it likely doesn't have multi-sync, and its chance of matching is much less.
Plus, you're comparing 24" to 42". That's almost 2x the size! That means that even at the same rez and all native, the pixels are going to be nearly twice as big (and hence, TWICE AS BLURRY up close, unless you move much further away and negate the relative size increase).
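To put rough numbers on that, here's a back-of-the-envelope pixel-density check (plain Python, both screens assumed to be running 1920x1080):

```python
# Pixels per inch for each screen, assuming both show a full 1920x1080 image.
from math import hypot

def ppi(diagonal_inches, width_px=1920, height_px=1080):
    """Pixel density along the diagonal."""
    return hypot(width_px, height_px) / diagonal_inches

print(f'24" Samsung: {ppi(24):.0f} ppi')   # roughly 92 ppi
print(f'42" Philips: {ppi(42):.0f} ppi')   # roughly 52 ppi
```

Same number of pixels spread over almost twice the width--sit at the same distance and of course it looks softer.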
Plus there are other factors: refresh, black level/contrast ratio, color purity, etc., that may be affecting the difference.
Less likely, but also possible, there might be a problem if there is some mismatch in the colorspace and a colorspace conversion algorithm in use may be substandard.
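As a tiny illustration of the kind of thing that can go wrong there: just squeezing full-range PC levels (0-255) into video levels (16-235) and back loses shades to rounding (pure Python sketch; the actual RGB/YCbCr matrix math is left out to keep it short):

```python
# Map full-range 0-255 values to limited video range 16-235 and back,
# and count how many distinct shades survive the round trip.
full = list(range(256))
limited = [round(16 + v * 219 / 255) for v in full]        # PC -> video levels
restored = [round((v - 16) * 255 / 219) for v in limited]  # video -> PC levels

print("distinct shades after conversion:", len(set(limited)))  # 220, not 256
print("values changed by the round trip:", sum(a != b for a, b in zip(full, restored)))
```

If a box in the chain does that conversion badly (or twice), you get crushed blacks and banding on top of any scaling problems.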
Then, there's just your subjective impression about what makes for a good image...
Scott
But yeah, this ain't news... -
News is when something important such as a new product or piece of software is released. Not being able to get a decent picture on an HDTV from a computer is called having something not set up right and not knowing how to fix it.
I get great picture quality when I stream a 1080p video from my computer to my 37-inch HDTV, and newer video cards have HDMI.
I think, therefore I am a hamster. -
johns0
Originally Posted by bilbob42
I just installed a new PCI-e 1GB video card in my son's new PC tonight and I should write a whole article on it and post it in "Latest Video News"
Maybe you should read the VERY FIRST THREAD in this forum....
Latest News forum: How to use correctly
https://forum.videohelp.com/topic290836.html
Members are asked to use common sense when posting.
Post any video related news here such as new or updated tools, hardware, guides, sites and similar.
ONLY directly Video related news please -
HDTVs do not make good computer monitors unless they have a pixel-for-pixel mode.
-
Gotta be a scaling problem.
1) I have a Philips 47" LCD 1080p 120Hz HDTV. Fairly new set, nice picture, but the scaling is poor. Since mine is one of Philips' better sets, the scaling probably stinks on your friend's set as well. :P Feed it something it likes.
2) Got a Radeon card? Go into Catalyst Control Center -> Graphics Properties -> DTV (DVI) -> HDTV Support. My guess is you should add an HDTV optimized 1080p60 format to the display manager and use it (a quick sketch of what that timing actually is follows below). And set the overscan in Scaling Options while you're there, if necessary.
Pull! Bang! Darn! -
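For what it's worth, that "HDTV optimized" 1080p60 format lines up with the standard CEA-861 timing most HDTVs expect over HDMI. A rough Python sanity check of where its pixel clock comes from, using the CEA-861 blanking figures (a reference sketch only, nothing you need to type in anywhere):

```python
# 1080p60 per CEA-861: 2200 total pixels per line, 1125 total lines per frame.
h_active, h_blank = 1920, 280
v_active, v_blank = 1080, 45
refresh_hz = 60

pixel_clock_hz = (h_active + h_blank) * (v_active + v_blank) * refresh_hz
print(f"pixel clock: {pixel_clock_hz / 1e6:.1f} MHz")   # 148.5 MHz
```

If the card sends the TV that exact timing at the panel's native resolution, the set has no excuse to scale anything (overscan aside).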
My Samsung 4665 has a pixel-for-pixel mode (Samsung calls it "just scan"). I feed it 1920x1080p60 from my HTPC and everything is perfectly sharp, just like a big computer monitor.
Without a pixel-for-pixel mode most HDTVs will scale the incoming signal -- even if that signal is at the panel's native resolution. This is because they simulate overscan (because there is often crap at the edges of the frame on TV broadcasts and people would complain otherwise -- we see it all the time here). For example, a 1920x1080 signal being fed to a 1920x1080 panel will have the center ~1820x1024 portion of the image digitally scaled up to 1920x1080 before it's put on the screen. Small text will look like crap.
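Putting rough numbers on it, assuming the usual ~5% simulated overscan (a quick Python sketch):

```python
# What a ~5% simulated overscan does to a 1920x1080 signal on a 1080p panel:
# the TV keeps only the center of the frame, then stretches it back out.
overscan = 0.05                 # typical simulated overscan, give or take
panel_w, panel_h = 1920, 1080

kept_w = round(panel_w * (1 - overscan))   # about 1824 of 1920 columns survive
kept_h = round(panel_h * (1 - overscan))   # about 1026 of 1080 rows survive

print(f"visible region: {kept_w}x{kept_h}, "
      f"stretched back up by {panel_w / kept_w:.3f}x -- no more 1:1 pixels")
```

That fraction-of-a-pixel stretch is enough to smear every one-pixel line in a desktop UI, which is why a pixel-for-pixel / "just scan" mode matters so much for HTPC use.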