Hi all,
I've got an older Dell Latitude D600 laptop with a Radeon Mobility 9000 video card. The card supports TV and FPD out modes. When I use the S-Video output, everything looks great; I can run the video out to my 1080i projection TV. However, when I use my DVI-HDMI cable (DVI on the PC side, HDMI on the TV side), I have problems. I can see that both the TV (S-Video) and FPD (DVI) outputs are illuminated, so the PC detects connections on both. In other words, I guess it senses that both the analog S-Video and digital DVI lines are working.
If I boot the PC while watching the DVI-HDMI input on the TV, I can see the Windows XP boot sequence fine, but just before Windows automatically logs in and starts the desktop, the screen loses its video signal. So I'm wondering a few things here:
1. Do I need to disable my analog connection to the same TV, even though it's on a different input channel, in order for the DVI to work?
2. Is it possible that my TV doesn't "understand" what the PC is sending once the Windows ATI drivers actually start to load after the initial Windows XP boot screen that uses plain VGA drivers?
Thanks in advance for any help, guys.
Best,
B
1. Unlikely, but turn off the analog output first to get DVI working.
2. Probably. What make/model is the TV, and what resolution are you sending to it?
1. So what's weird is that I can see the PC start up over the DVI connection, but like I said, it drops once Windows loads. If I go back to the Mobility property sheets, I can see that both the analog S-Video and digital DVI connections are lit and active. I've even tried switching each to be the primary and then checked the DVI-HDMI input channel, but nothing ever seems to get through.
2. The TV is a Sony Grand Wega 3LCD 1080i rear projection (Model #: KDFE60A20). I've tried several different resolutions for the DVI connection: 640x480, 800x600, 1024x768. All three of those resolutions work on the S-Video connection.
Thanks again,
B
Originally Posted by shdowflare
2. That TV is unlikely to accept other than 1080i or 720p on the DVI-D (HDMI) port. Check your manual.
According to the Amazon tech specs for the KDFE60A20, you have a D-Sub 15-pin VGA port. That is your best bet for a computer interface. Check your manual for the resolutions supported, or call Sony. The LCD's native resolution is 1366x768. You can probably drive your VGA output at that resolution and have the TV respond; if not, try 1024x768.
DVI-I connectors include pins for DVI-D and VGA. To get VGA out from your DVI-I connector, you should have a DVI-I to VGA adapter similar to this one.
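For what it's worth, a quick back-of-the-envelope check makes the resolution mismatch concrete. This is just a sketch in plain Python (pure arithmetic, no libraries); the candidate modes are the ones mentioned in this thread, the 1366x768 native-panel figure is from the Amazon specs above, and the 0.02 tolerance is an arbitrary choice for "close enough to 16:9":

```python
# Compare common PC output modes against the set's 16:9 panel.
# The HDMI input is likely restricted to HD timings (720p/1080i),
# which are 16:9; the modes the laptop was trying are all 4:3.

candidates = {
    "640x480":   (640, 480),
    "800x600":   (800, 600),
    "1024x768":  (1024, 768),
    "1280x720":  (1280, 720),    # 720p
    "1920x1080": (1920, 1080),   # 1080i frame size
}

TARGET_AR = 16 / 9  # aspect ratio of the 1366x768 panel (approximately)

for name, (w, h) in candidates.items():
    ar = w / h
    shape = "16:9" if abs(ar - TARGET_AR) < 0.02 else "4:3"
    print(f"{name}: aspect {ar:.2f} -> {shape}")
```

The point being that none of the modes tried over DVI (640x480, 800x600, 1024x768) are among the 16:9 HD timings the HDMI input expects, which is consistent with the set accepting only 720p/1080i on that port.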
Thanks for the help. I didn't see a VGA port anywhere on my set, but I'll double-check. In any case, I appreciate all the info.
-B