Hey guys,
I've had a 1680 x 1050 monitor for some time now, and I've always been happy with it over DVI-I: perfectly clear picture. Recently I decided to use my TV in the meantime, so I kept my old DVI cable and plugged it into the TV (my TV doesn't have an HDMI input, but my monitor does) and got a new HDMI cable for the monitor. However, for some reason, when I connect the monitor with the HDMI cable, the native resolution is now 1280x720. When I try to set it back to 1680x1050, everything becomes really blurry and out of place because of the 720p resolution. I've looked everywhere in the NVIDIA Control Panel and the Windows display settings but came up with nothing. Thanks for all the help.
-
HDMI likely has nothing to do with it. It sounds like your HDTV's native resolution is 1280x720. Whether HDMI or DVI, your card is auto-adjusting to match. When you force the card to 1680x1050, that is what gets sent to the display, and the display resizes it down to 1280x720, because that's all it can end up being. Evidently your TV's resizer is not so good (no surprise).
Main choices: Leave it in auto/720, or buy a new display.
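To put rough numbers on why the forced 1680x1050 looks so soft, here is a quick illustration (nothing specific to your card or TV, just the ratios involved):

# Squeezing 1680x1050 onto a 1280x720 panel means non-integer scale factors,
# so source pixels get blended across neighbours and text turns blurry.
src_w, src_h = 1680, 1050
panel_w, panel_h = 1280, 720
print(src_w / panel_w, src_h / panel_h)   # 1.3125 and roughly 1.458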
Scott -
There is something called EDID, which the monitor sends to your video card to tell it what resolutions / refresh rates it supports; the card uses this information to negotiate a compatible output signal for the display. The display is apparently telling your GPU that it supports 720p and not 1680x1050.
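If you want to see exactly what the display is reporting, you can dump its EDID and look at the preferred timing block. Here is a minimal sketch, not a full parser; it assumes you have saved the raw 128-byte EDID block to a file ("edid.bin" is just a placeholder name; tools like Monitor Asset Manager on Windows, or /sys/class/drm/*/edid on Linux, can produce such a dump):

def preferred_timing(edid: bytes):
    # First detailed timing descriptor lives at bytes 54-71 of block 0.
    d = edid[54:72]
    if d[0] == 0 and d[1] == 0:
        return None                       # descriptor holds text, not a timing
    pixel_clock_hz = (d[0] | (d[1] << 8)) * 10_000
    h_active = d[2] | ((d[4] & 0xF0) << 4)
    h_blank  = d[3] | ((d[4] & 0x0F) << 8)
    v_active = d[5] | ((d[7] & 0xF0) << 4)
    v_blank  = d[6] | ((d[7] & 0x0F) << 8)
    refresh = pixel_clock_hz / ((h_active + h_blank) * (v_active + v_blank))
    return h_active, v_active, round(refresh, 2)

with open("edid.bin", "rb") as f:
    edid = f.read(128)

assert edid[:8] == bytes.fromhex("00FFFFFFFFFFFF00"), "not an EDID block"
print(preferred_timing(edid))             # e.g. (1280, 720, 60.0) on a 720p set

If that prints 1280x720 rather than 1680x1050, the display really is advertising 720p as its preferred mode.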
-
I understand. But when I use the DVI cord, the display is perfectly clear and up near 1080p, whereas with the HDMI cord the resolution is downscaled to 720p. I'm confused because I thought HDMI was the same thing as DVI, just with sound added. I guess I'll just have to buy another DVI cord.
-
TVs often only support industry-standard TV resolutions via HDMI: 1280x720 at 60 Hz or 1920x1080 interlaced at 30 Hz.
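For reference, these are the sort of standard timings a TV typically advertises over HDMI (illustrative only; the exact list depends on the set, so check your model's manual or EDID):

# Common TV timings carried over HDMI: (width, height, frames per second).
# Note that 1680x1050 is not among them -- it's a PC monitor resolution.
standard_tv_modes = {
    "480p":  (720, 480, 60),
    "720p":  (1280, 720, 60),
    "1080i": (1920, 1080, 30),    # 30 interlaced frames = 60 fields per second
    "1080p": (1920, 1080, 60),    # only on sets that accept it
}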
-
Oh no, it is quite different!
One is innovative and open; the other is royalty-generating and loaded with "standards".
It could very well be that your TV can easily handle 1680 x 1050, but the HDMI standards people don't like that, so they prefer to cripple the signal to 720p.
Some people on this forum will be able to explain to you why this is actually a brilliant feature; sorry, I am not one of them!
-
HDMI is a TV standard that has been adapted to computer use. TVs have standard resolutions, unlike computers, where you can use pretty much anything you want. Just because the computer is forced to adhere to the spec when used with a TV does not make the whole format rubbish. Standards are set for a reason: so that hardware can interoperate properly. Where in the TV's specs/manual do you see "supports 1680x1050"? I bet you don't find it, but it will list all of the standard TV SPEC RESOLUTIONS that it supports.
List the TV brand/model and someone can confirm what is supported. Or check the user manual; it should also note the native resolution of the screen, which will confirm what others have noted above.
Google is your Friend