VideoHelp Forum
  1. Any advantage to using one over the other, DVI or RGB, on a monitor? Particularly an LCD monitor? I'm not certain I see any real difference.
    I am using an ATI AIW 9000 Pro card, which has DVI out, or RGB with the adapter. The LCD monitor has both inputs.
    One thing that is annoying is that the ATI card does not send a signal to the LCD during computer boot-up (POST) when connected to the DVI input, but it does with RGB. I can live with that if DVI is better.
    ATI has no useful information on solving this problem on their website, but they do acknowledge it.

    Thank you knowledgeable readers.

    --dES
    "You can observe a lot by watching." - Yogi Bera
    http://www.areturningadultstudent.com
  2. Member
    Join Date: Dec 2005
    Location: United States
    Originally Posted by Des
    Any advantage to using one over the other, DVI or RGB, on a monitor? Particularly an LCD monitor? [...]
    Hi,
    With your setup, going straight from the card's DVI output to the monitor's DVI input has two advantages:
    1. You get more bandwidth, which helps the LCD produce a better display.
    2. Since you're already sending a digital signal directly to the monitor's DVI input, the monitor doesn't have to waste time converting an analog signal back to digital, so things are a little faster.

    On top of that, having two different inputs gives you a backup: if the digital input on the monitor ever goes bad, you can easily switch over to analog.
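    The conversion point above can be sketched numerically. This is only a toy model, not real hardware behavior: it treats one VGA pixel as an 8-bit value pushed through a hypothetical 0.7 V DAC-to-ADC round trip with a little cable noise, whereas DVI carries the 8-bit value digitally and untouched.

    ```python
    import random

    def vga_roundtrip(pixel, noise_mv=0.0, rng=None):
        """Toy model of the VGA path: 8-bit code -> 0-0.7 V analog -> back to 8 bits."""
        volts = pixel / 255 * 0.7                                # RAMDAC: digital to analog
        if noise_mv and rng is not None:
            volts += rng.uniform(-noise_mv, noise_mv) / 1000.0   # cable/connector noise
        volts = max(0.0, min(0.7, volts))                        # clamp to the video voltage range
        return round(volts / 0.7 * 255)                          # monitor's ADC: analog back to digital

    rng = random.Random(0)
    changed = sum(vga_roundtrip(p, noise_mv=5.0, rng=rng) != p for p in range(256))
    print(changed)  # with noise in the analog leg, some pixel values come back altered
    ```

    With zero noise the round trip is lossless, but a few millivolts of noise on the analog leg is enough to shift some codes; the digital DVI path has no such leg to corrupt.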
  3. Member
    Join Date: Oct 2004
    Location: United States
    Simple: DVI is a digital interface, so the quality of the image from the output of your video card to the input of your monitor stays constant.

    The 15-pin VGA cable is analog. The signal degrades as it travels along the cable, which is why you see blurry video and ghosting on long runs of this type of cable.
  4. Member edDV's Avatar
    Join Date: Mar 2004
    Location: Northern California, USA
    I'm assuming this is a computer LCD monitor. It should have a native panel resolution. Normally you will set the DVI-D or VGA to that same native resolution for best quality.

    DVI-D is a direct digital connection, VGA uses the RAMDAC to convert digital to analog and the monitor A/D converts back. DVI-I connectors include pins for both DVI-D and VGA. For most monitors you won't notice a great difference in quality for short connections (up to 15-20 feet). The higher the LCD "native resolution" the more likely DVI-D will look better.

    For longer cables, they both degrade but in different ways. Better quality cables are needed to go longer than 20 feet for both.
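    The point about native resolution and DVI can be put in rough numbers. A minimal sketch, assuming a flat ~25% blanking overhead (the real figure depends on the timing standard in use) and the usual single-link DVI ceiling of a 165 MHz pixel clock:

    ```python
    SINGLE_LINK_LIMIT_MHZ = 165.0  # single-link DVI TMDS pixel clock ceiling

    def approx_pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.25):
        """Rough pixel clock estimate: active pixels per second plus blanking overhead."""
        return width * height * refresh_hz * blanking_overhead / 1e6

    for w, h in [(1280, 1024), (1440, 900), (2560, 1600)]:
        clk = approx_pixel_clock_mhz(w, h, 60)
        fits = "fits" if clk <= SINGLE_LINK_LIMIT_MHZ else "needs dual link"
        print(f"{w}x{h}@60: ~{clk:.1f} MHz ({fits})")
    ```

    By this estimate a 1440x900 panel at 60 Hz needs only on the order of 100 MHz, well inside single-link DVI, so at that resolution any visible difference versus VGA comes from the analog conversion steps rather than from a bandwidth limit.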
  5. Thanks all for the replies. The monitor is a 19" widescreen at 1440 x 900, and the ATI card is set to match that (the limit of the card). So as I understand it, DVI gives me a faster response from the monitor and a more accurate signal and representation; that makes sense. I have noticed that fine text is a little clearer over DVI than over the RGB input.

    Now, any suggestions on how to get the ATI card to display on the monitor during POST over DVI, rather than only once the OS boots?

    Cheers!

    --dES
    "You can observe a lot by watching." - Yogi Bera
    http://www.areturningadultstudent.com


