VideoHelp Forum

  1. Member (joined Sep 2003, University of Ottawa)
    Well, today I decided to finally test how good my TV's picture quality is. Unfortunately, when I hooked it (a Westinghouse W4207) up to my computer, I noticed the signal was going through, but there were a lot of weird colour artifacts. I've included a picture so you can see what I mean. Specs are as follows:

    Computer:
    -Video Card: PNY 6800GT with the latest NVIDIA drivers. Two DVI-I out ports; port one is connected to my CRT computer monitor via an adaptor, and port two is connected to my TV via a dual-link DVI-D cable. I've tried all the settings in the NVIDIA control panel wizard that should make the signal work properly (setting the resolution to 1366x768, forcing a 720p signal, etc.).

    -TV: Westinghouse W4207

    Does anyone have any ideas what could be causing the problem or at least what type of problem it is?

    UPDATE: I tried disconnecting the CRT monitor and moving the LCD TV to the first DVI port. While there were no distortions/artifacts, I noticed that the screen had a distinct red tone; for example, screens that are normally black and white (such as the BIOS startup screen) were red and white.

    UPDATE: Turned out to be a bad DVI cable. I replaced the cable and everything's working fine.
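
    A rough illustration of why a failing cable shows up as a colour cast rather than a dead screen: DVI carries red, green and blue on three separate TMDS pairs (data2 = red, data1 = green, data0 = blue plus sync), so one damaged pair corrupts a single colour channel while the others get through. A minimal sketch of that per-channel behaviour, assuming a simplified "dead pair" fault model (a marginal pair can just as easily decode noise instead of zeros):

    Code:
    # Toy model: DVI sends R, G and B on separate TMDS pairs, so a cable
    # fault can take out individual colour channels while the rest pass.
    def through_bad_cable(rgb, dead_channels=("g", "b")):
        """Hypothetical fault: the green and blue pairs are dead."""
        r, g, b = rgb
        if "r" in dead_channels: r = 0
        if "g" in dead_channels: g = 0
        if "b" in dead_channels: b = 0
        return (r, g, b)

    print(through_bad_cable((255, 255, 255)))  # white arrives as (255, 0, 0)
    print(through_bad_cable((128, 128, 128)))  # grey arrives as dark red
    print(through_bad_cable((0, 0, 0)))        # black stays black

    Under this model every neutral grey lands on the red axis, which is the kind of overall red tint described above; the exact symptom depends on which pair is marginal and how it fails.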



    [Attachment: monitorproblem.jpg]
  2. Make sure the refresh rate is set to 60 Hz.
  3. Member (joined Sep 2003, University of Ottawa)
    Originally Posted by jagabo
    Make sure the refresh rate is set to 60 Hz.
    Yup, it's set to 60 Hz and still not working properly...
  4. I have an 8600GT hooked up to my Samsung HDTV through a DVI->HDMI cable at 1920x1080p60. I don't have any problems. But in the nVidia Control Panel, under Video and Television, there's a section called Change Signal or HD Format. You might try changing some of the settings there.
  5. Member (joined Sep 2003, University of Ottawa)
    Originally Posted by jagabo
    I have an 8600GT hooked up to my Samsung HDTV through a DVI->HDMI cable at 1920x1080p60. I don't have any problems. But in the nVidia Control Panel, under Video and Television, there's a section called Change Signal or HD Format. You might try changing some of the settings there.
    Tried that too, with no luck. I'm beginning to think it's a hardware issue that's causing the colour problems. I should mention, though, that the S-Video from my Xbox works fine. Could the problem be that I'm using a DVI-D cable while my TV's input is DVI-D but my video card's output is DVI-I?
  6. The DVI-I port on your graphics card has a few extra pins for carrying analog video. A DVI-D cable simply doesn't have those pins, so it doesn't carry the analog signals to your DVI-D TV, which wouldn't use them anyway.

    Can you get a clean picture using any resolution? Many HDTVs will only accept certain resolutions via DVI or VGA connectors -- like 1024x768 60 Hz. Have you tried another cable?
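
    The cable/link question can also be sanity-checked with arithmetic: single-link DVI tops out at a 165 MHz pixel clock, and every mode mentioned in this thread fits under that, so a dual-link cable is never the limiting factor here. A quick sketch (the 1080p and 1024x768 totals are the standard CEA/VESA figures; the 1366x768 blanking is an approximate GTF timing, an assumption):

    Code:
    # Does a mode fit in single-link DVI (165 MHz TMDS pixel clock)?
    SINGLE_LINK_MAX_MHZ = 165.0

    # (name, htotal, vtotal, refresh_hz) -- totals include blanking
    modes = [
        ("1024x768@60 (VESA)", 1344,  806, 60.0),
        ("1366x768@60 (~GTF)", 1800,  795, 60.0),  # approximate blanking
        ("1920x1080@60 (CEA)", 2200, 1125, 60.0),
    ]

    for name, htotal, vtotal, hz in modes:
        pclk_mhz = htotal * vtotal * hz / 1e6  # pixel clock in MHz
        verdict = "fits single-link" if pclk_mhz <= SINGLE_LINK_MAX_MHZ else "needs dual-link"
        print(f"{name}: {pclk_mhz:6.1f} MHz -> {verdict}")

    All three come in well under 165 MHz, so artifacts at these resolutions point to signal integrity (a marginal cable or connector) rather than link capacity.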
  7. Member (joined Sep 2003, University of Ottawa)
    Originally Posted by jagabo
    The DVI-I port on your graphics card has a few extra pins for carrying analog video. A DVI-D cable simply doesn't have those pins, so it doesn't carry the analog signals to your DVI-D TV, which wouldn't use them anyway.

    Can you get a clean picture using any resolution? Many HDTVs will only accept certain resolutions via DVI or VGA connectors -- like 1024x768 60 Hz. Have you tried another cable?
    Nope, no resolution seems to get rid of the severe red tint/distortion. The user manual recommends 1366x768 @ 60 Hz, which I tried...

    I haven't tried another cable, so I'll probably order another one on the internet.
  8. You may be able to find specific information on your HDTV at avsforum.com:

    http://www.avsforum.com/avs-vb/showthread.php?t=741609&highlight=4207

    If you figure out what's wrong please post back.
  9. Member edDV (joined Mar 2004, Northern California, USA)
    Two ways to connect

    DVI-I to DVI-D (doesn't always work with some TV sets at anything other than 1080i/720p)

    DVI-I -> VGA adapter -> VGA (this input expects a 1366x768 60 Hz VGA signal)

    NVidia cards "autodetect", which means the TV path must be connected and powered on when you reboot. ATI has manual overrides that I wish NVidia would add.
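
    For context on the autodetect step: the card identifies the display by reading its EDID block over the DVI connector's DDC pins, which only works if the set is attached and powered when the driver probes. A minimal sketch of decoding the first EDID fields, with an illustrative sample block (the vendor bytes are a made-up example, not read from the W4207):

    Code:
    # EDID blocks start with a fixed 8-byte header; bytes 8-9 pack the
    # 3-letter PnP vendor code as three 5-bit values in a big-endian word.
    EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

    def manufacturer_id(edid):
        """Decode the vendor code from bytes 8-9 of an EDID block."""
        if edid[:8] != EDID_HEADER:
            raise ValueError("not a valid EDID block")
        word = (edid[8] << 8) | edid[9]
        letters = [(word >> shift) & 0x1F for shift in (10, 5, 0)]
        return "".join(chr(ord("A") + v - 1) for v in letters)

    sample = EDID_HEADER + bytes([0x4C, 0x2D])  # 0x4C2D decodes to "SAM"
    print(manufacturer_id(sample))

    If the display is off or unplugged at boot there is no EDID to read, so the driver never sets up that output, which matches the behaviour described above.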
  10. Originally Posted by edDV
    NVidia cards "autodetect", which means the TV path must be connected and powered on when you reboot. ATI has manual overrides that I wish NVidia would add.
    Hmmm. I don't seem to have that problem using a DVI->HDMI cable from a nVidia 8600GT to my Samsung HDTV. Maybe it's because the TV is never really off unless the power goes out or the TV is unplugged.

    By the way -- I notice the OP has resolved his problem and posted the solution in the first post. It was a bad DVI cable.
  11. Member edDV (joined Mar 2004, Northern California, USA)
    True, the Samsung's ports remain active. I tested an LG DVD player that shut off its ports; it needed a reinstall each time.