VideoHelp Forum

  1. Hey guys, have a look at this thread here:

    forums.nvidia.com/index.php?showtopic=231113

    Ignore the initial problem as I now have the second TV hooked up HDMI to HDMI

    There was only one guy willing to help out on the nVidia forum so I thought I might check in here for a solution.

    Towards the bottom of the thread you'll see my question about the splitter cable. Would that be possible?


    -Have any of you had experience with 'DVI-D to component' or 'DVI-A to component' adapters?
    -What about HDMI to VGA? (if it exists)

    or.... would I definitely need to resort to buying a video card with two VGA outputs?
  2. I'm not sure what you mean when you say you're using a VGA to HDMI conversion cable, as I would have assumed (maybe incorrectly) the card can output two VGA signals with just a DVI to VGA adaptor on the DVI output. Something like this: http://en.wikipedia.org/wiki/File:DVI-VGA-Adapter.jpg

    Which DVI connector does the card have? http://en.wikipedia.org/wiki/Digital_Visual_Interface#Connector
    The type with the 4 pins in a square configuration.... well those pins carry the usual analogue signal.
    I own two types of Nvidia card: a 7600 and an 8600. The first has dual DVI outs, but I ran two VGA monitors off it using the above adaptor(s) for years. The second is similar to yours. It has a VGA out and a DVI out. The card can run dual VGA monitors with an adaptor on the DVI out; the one in the second PC here is doing just that at the moment. The 8600 in this PC is currently running my CRT monitor while the DVI out is hooked up to my TV using a DVI to HDMI cable. Anything's possible, but I'd be surprised if your card can't run two VGA monitors simultaneously.

    While I can't speak for every TV, logically there's no reason why HDMI to HDMI or DVI to HDMI shouldn't look just as good as VGA. If the TV overscans on HDMI then text probably won't look as good as VGA (assuming VGA isn't overscanned), because you're not getting it "pixel for pixel" at the native resolution, but when watching video you probably won't see any difference. Fortunately my TV lets me disable overscanning, so I can't really tell the difference between the HDMI and VGA inputs. In fact video played on the PC while connected via HDMI tends to look better than when using the Bluray player, especially low resolution AVIs etc. The PC seems to be better at upscaling. Using the 8600 at least, the Nvidia control panel gives me an option to resize the desktop, so while it might take a bit of fiddling you should be able to negate the overscanning that way. I can't say I've used it myself though.
    I don't know why the colors are different using HDMI. You may find the TV stores individual picture options for each input. Mine does. If the TV's picture options are the same then maybe it's a levels issue. My TV has a dedicated PC/HDMI input and it's the one which lets me change the black level (TV or PC levels). I'm pretty sure it defaults to TV levels but while video looks fine, Windows itself looks too dark. I change the input to expect PC levels which fixes the Windows colors but depending on the player/renderer you use that may make video look washed out. The Nvidia control panel also has an option to expand all video to PC levels so with the TV set to expect PC levels everything looks fine. There should also be an option to change the output color space to YCbCr444 instead of RGB when connected via HDMI. Try the other option to see if it looks any better.
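
    To make the levels part more concrete, here's a rough sketch in plain Python of what expanding "limited range" (TV levels, 16-235) to "full range" (PC levels, 0-255) actually does. It's not anything the driver exposes, just the arithmetic behind the setting, and the function name is only there for illustration.

    Code:
    def limited_to_full(value):
        # Expand an 8-bit limited-range (16-235) value to full range (0-255).
        expanded = round((value - 16) * 255 / 219)
        return max(0, min(255, expanded))  # clip anything outside the nominal range

    print(limited_to_full(16))   # 0, black stays black
    print(limited_to_full(235))  # 255, white stays white
    print(limited_to_full(128))  # 130, mid grey shifts slightly

    If nothing does that expansion but the TV is set to expect PC levels, black sits at 16 instead of 0 and everything looks washed out; if it gets expanded twice you end up with crushed blacks instead.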

    Does the video card recognize the TV when connected via HDMI as it does when connected via VGA?

    PS: The levels for video output in the Nvidia control panel are labelled full range (PC levels) and limited range (TV levels). They won't affect the way Windows displays, just possibly video.
    Last edited by hello_hello; 13th Jun 2012 at 01:44.
  3. Thanks for the big reply mate

    The first picture you linked to isn't there, but my video card, the GeForce 8400 GS, has only this female DVI connection, with no colour coding:

    http://upload.wikimedia.org/wikipedia/commons/6/65/DVI_pinout.svg


    I'm not sure what you mean when you ask:

    "Does the video card recognize the TV when connected via HDMI as it does when connected via VGA?"

    After the 'DVI to VGA' didn't work, I bought an HDMI cable and at first I had no picture, but when I went to the normal display properties (not the Nvidia control panel) I got it working by right-clicking on the second display then clicking "attached". This caused it to recognise the second display.

    I had a TV last year that I hooked up via HDMI and I was able to switch overscan off to fit the picture on the screen. However, the picture quality itself was just as bad as with this new setup.

    I have tried 3 completely different TVs via HDMI over the last 3 years and they all looked shocking, so I switched to VGA and haven't looked back (until now).

    Since you asked me what the DVI port looks like, I have also looked at the 'DVI to VGA' converter cable. According to the diagrams, it is a DVI-I dual link.


    Extra notes:

    -The HDMI cable that I bought only cost $20 AUD.

    -The DVI to VGA converter cable was fairly cheap, but I could only find it that cheap on eBay.

    -Every other DVI to VGA cable that I came across on various sites was fairly pricey. I should also point out that the cable came from Hong Kong. It doesn't look dodgy though; it has a red and black striped mesh coating which is fairly tough.

    I hope that all of this info can help shed some more light on the issue.

    Thanks again mate.
  4. Ok so here's what's happening now.

    I updated my video card driver (which I didn't want to have to do) and after a reboot it decided to make the second TV the primary display by itself, and amazingly the entire picture is on screen and the colours look a fair bit better too. The text is still hard to read but everything else has improved.

    BUT!!!!!! Now the first TV (VGA to VGA) can't handle 1920x1080 anymore. At that res it's fairly stretched and everything hangs off the left-hand side. So I put it on its native resolution, 1440x900 (which it was never able to do before the driver update), and that resolution works, but I need it to be able to handle 1920x1080 like it used to.

    This is turning out to be a nightmare. You wouldn't think that updating a driver could make things worse.


    Edit: I wanted to roll back to the old driver but I don't have it any more, so unfortunately I'm stuck with this large display (1440x900) on both screens and won't be able to buy new TVs any time soon. I think it's safe to say there are no real solutions here....

    Next time I buy TVs they will be a more common brand and I will look into the specs properly.


    What I don't understand... and this is what I want somebody to explain... is why the old nVidia driver (190.xx) acted like 1920x1080 was the native resolution of these TVs and couldn't handle their actual native resolution, which is 1440x900.

    After updating the driver, it couldn't deal with 1920x1080, but it was happy with the real native resolution (1440x900).

    Very very strange.
    Last edited by meneedit; 13th Jun 2012 at 07:06.
  5. "The first picture you linked to isn't there, but my video card, the Geforce 8400-GS, has this female DVI connection only without colour coding"

    Just so we're on the same page, it also has a VGA connector? In other words it looks something like this?
    http://di1-4.shoppingshadow.com/images/pi/c5/1d/28/106657635-260x260-0-0_evga+evga+gef...ddr3+pci+e.jpg

    If so....
    Here's another pic of an adapter http://www.websistems.com/tienda/BBDD/imagenes/adaptadorDVI_HDMb15.jpg
    As the DVI connector also carries the analogue signal (assuming the four pins on the end aren't empty on your card) it should just be a matter of using an adapter to add a second VGA connector to the card instead of the DVI connector. My Nvidia cards all came with those same adapters but maybe manufacturers don't supply them any more.
    I use the word "adapter" rather than "converter" because they don't actually convert the signal in any way, they just use the existing analogue signal while changing the connection type to VGA. Unless for some reason Nvidia no longer make cards which can output two analogue signals, which I kind of doubt, your card should be capable of running two VGA monitors as mine is (my newest video card is several years old). My cards don't have a HDMI output, but a DVI to HDMI adapter works fine as the DVI and HDMI signals are the same.

    "Does the video card recognize the TV when connected via HDMI as it does when connected via VGA?"

    I meant does it still see the TV as a "Mitac display" or just as a generic monitor? Not that it matters now as it seems to be working.

    Do you have a manual for the TV? With any luck it'll have a list of supported input resolutions and refresh rates. VGA and HDMI may be different. Whatever it supports though, I'd imagine it should accept a standard 720p or 1080p/i signal (are the TVs high definition or full high definition?) and the TV would scale that to its native resolution internally. Being TVs, I'd imagine you'd always want to set the refresh rate to 60hz (or 50hz for PAL). The TV may accept other refresh rates (at least via VGA) but they may affect the input resolution it'll handle and/or whether it'll display properly. For example, using VGA at 1920x1080 my TV only accepts a 60hz signal. If I up the refresh rate the maximum resolution drops, and some combinations work well while others, not so much.

    As my TV has a dedicated HDMI input for PCs I think that input is less fussy than the rest, but I just send a standard 1080p signal from the video card at 50hz (or 60hz) and it works no matter which HDMI input I use to connect the PC. To be honest, if you're feeding the TV a "standard" signal, the same as any Bluray player would send it, there's no reason why HDMI, DVI or VGA shouldn't all look fine (aside from not being able to disable overscanning), but if your TV has a manual which lists supported refresh rates and resolutions then that'd be a good place to start. If the TV lists a resolution/refresh rate combination which doesn't appear in Display Properties, the Nvidia control panel should let you create and save a custom resolution. Or if you can't get HDMI to play nice, hopefully one of those adapters will let you run two VGA outs.
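
    If you want to double-check which resolution/refresh rate combinations the driver is actually offering Windows for a display (handy to compare against whatever the TV manual lists before creating a custom resolution), you can enumerate the display modes. This is just a rough sketch assuming the pywin32 package is installed; treat the exact calls as illustrative rather than gospel.

    Code:
    import win32api

    modes = set()
    i = 0
    while True:
        try:
            # None = the primary display; pass a device name for a specific one
            dm = win32api.EnumDisplaySettings(None, i)
        except win32api.error:
            break  # no more modes to enumerate
        modes.add((dm.PelsWidth, dm.PelsHeight, dm.DisplayFrequency))
        i += 1

    for width, height, hz in sorted(modes):
        print("%dx%d @ %dHz" % (width, height, hz))

    Anything the TV manual lists that doesn't show up there is a candidate for the custom resolution option in the Nvidia control panel.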
  6. Thanks again man,

    Yeah, that's the exact same video card. Unfortunately these TVs are not full HD... I found out last night.

    Ever since updating the driver, one of the TVs is recognised as 'MTC26T42' and the other is 'Mitac Analog Display' whereas before the update they were both recognised as 'Mitac Analog Display'.

    I don't have the older driver (ForceWare 190), so now I'm stuck with 1440x900.
    Last edited by meneedit; 13th Jun 2012 at 23:35.
  7. I found the old driver and reinstalled it, so the first display is back to how it was.

    ....But I'm just gonna have to get myself a couple of Full HD TVs at some stage. I'll be taking my computer into the shop to test them out though, to make sure I'm not wasting my money.

    I just have to hope that a 'DVI-A to VGA' or 'DVI to HDMI' adapter will work and, as you say, they should.

    I hope this thread and the linked thread at the top help out anybody else struggling with setting up two TVs with their PC. Thank you for all the info mate, you've been extremely helpful.
  8. Actually asking if the TVs are full HD was probably a dumb question because with a native resolution of 1440x900 they're obviously not, although they do seem to accept a full HD signal. My video card also lets me send the TV a 24hz or 30hz interlaced signal over HDMI so it might be worth looking at the refresh rate. I'm not sure I'd fuss about them not being full HD if you can get them working properly. My TV is full HD but 99% of what I watch isn't. It's usually 720p or less and even when I compare 720p and 1080p Bluray encodes I struggle to see a difference most of the time.

    I'd have thought ideally you'd want to send the TV a 1440x900 signal, at least over VGA, if that's the TV's native resolution and it looks right. Anything else and the PC would be rescaling video to a different resolution (say 1920x1080) and then the TV would be rescaling it again to 1440x900. In theory it sounds like it'd be better to send it a 1440x900 signal in the first place and let the PC rescale any video to that resolution. I'm just not sure how it works in practice though, because TVs are usually 16:9 but 1440x900 isn't.
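
    For what it's worth, the 16:9 versus 1440x900 part is easy to check with a bit of arithmetic. A quick Python sketch, nothing more:

    Code:
    from math import gcd

    def aspect(width, height):
        # Reduce a resolution to its simplest width:height ratio.
        d = gcd(width, height)
        return "%d:%d" % (width // d, height // d)

    print(aspect(1920, 1080))  # 16:9
    print(aspect(1440, 900))   # 8:5, which is 16:10

    So a 1080p (16:9) signal sent to a 1440x900 (16:10) panel is either squeezed slightly or letterboxed on top of being rescaled twice, which would go some way to explaining why the text looks soft.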

    "Ever since updating the driver, one of the TVs is recognised as 'MTC26T42' and the other is 'Mitac Analog Display' whereas before the update they were both recognised as 'Mitac Analog Display'."

    Well I guess you'd hope it wouldn't be identifying itself as "Mitac Analog Display" using HDMI, but at least it seems to be conversing with the video card correctly and not being seen just as a generic monitor.

    Sorry I can't really be of more help. Hopefully a DVI-VGA adapter will get you running two VGA connections correctly. Maybe I've been lucky, but I've only connected a PC to a TV a couple of times and both times it was fairly easy over HDMI..... I just set the video card to 1920x1080 and it was done. I was actually kind of surprised HDMI didn't look a little better than VGA, but it certainly didn't look any worse. Anyway.... good luck.
    Last edited by hello_hello; 14th Jun 2012 at 14:40.


