VideoHelp Forum




  1. Member brassplyer
     Join Date: Apr 2008
     Location: United States
    On a 22" Acer HD monitor, comparing the VGA input vs. the DVI-D input coming from an Nvidia GeForce GTX 460 Cyclone running Battlefield Bad Company 2, I don't see any difference. Should I?
  2. I'm a Super Moderator johns0
     Join Date: Jun 2002
     Location: Canada
    Nope, but at a higher resolution on a bigger monitor you will see a bit of a difference.
    I think, therefore I am a hamster.
  3. Banned
     Join Date: Oct 2004
     Location: New York, US
    Originally Posted by johns0
    Nope, but at a higher resolution on a bigger monitor you will see a bit of a difference.
    Generally the difference favors VGA, which is faster, with fewer artifacts and less rounding of data (and even that depends on the specific circuitry in specific cards). There is nothing wrong with 22" HD monitors; overall performance isn't dependent on size. But buying the cheapest monitors you can get your hands on won't make much difference.
    Last edited by sanlyn; 25th Mar 2014 at 01:52.
  4. Member brassplyer
     Join Date: Apr 2008
     Location: United States
    Originally Posted by johns0
    Nope, but at a higher resolution on a bigger monitor you will see a bit of a difference.
    Not sure what you mean by higher resolution. This particular monitor's native resolution is 1080p, and the game is at max settings.
  5. The difference is that VGA can pick up noise through the cable, which shows up as thin diagonal lines across the screen. DVI data starts as digital and stays digital; VGA has to be converted to analog first, then converted back to digital by the monitor circuitry. If you see no difference, great; don't go nitpicking, you'll only end up disappointed. It's like back in the days of $1000 Trinitron monitors: if you wanted to be mean to someone, you only had to point out the two faint black horizontal lines across the screen; that would drive them mad.
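    For what it's worth, here is a minimal numpy sketch of that idea (purely illustrative; the voltage range and noise level are made-up numbers, not measurements of any real card or monitor): a digital link passes the pixel values through untouched, while the analog round trip (card DAC, a bit of assumed cable noise, then the monitor's ADC) can nudge some of them.

        # Toy model, not the actual hardware path: a digital link vs. a
        # digital -> analog -> digital round trip with a little cable noise.
        import numpy as np

        rng = np.random.default_rng(0)
        pixels = rng.integers(0, 256, size=1920, dtype=np.uint8)  # one 8-bit scanline

        dvi_out = pixels.copy()  # "DVI": the digital values arrive unchanged

        # "VGA": DAC to a 0..0.7 V level, assumed 2 mV RMS cable noise, then
        # the monitor's ADC re-quantizes back to 8 bits.
        volts = pixels.astype(np.float64) / 255.0 * 0.7
        volts += rng.normal(0.0, 0.002, size=volts.shape)
        vga_out = np.clip(np.round(volts / 0.7 * 255.0), 0, 255).astype(np.uint8)

        print("pixels changed by the analog round trip:",
              int(np.count_nonzero(vga_out != dvi_out)), "of", pixels.size)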
  6. I had grounding issues which gave hum bars on a VGA input; digital was fine. Now that I have fixed the grounding, I don't notice any difference between the analog and digital images.
  7. Banned
     Join Date: Oct 2004
     Location: New York, US
    Originally Posted by nic2k4
    The difference is that VGA can pick up noise through the cable, which shows up as thin diagonal lines across the screen. DVI data starts as digital and stays digital; VGA has to be converted to analog first, then converted back to digital by the monitor circuitry. If you see no difference, great; don't go nitpicking, you'll only end up disappointed. It's like back in the days of $1000 Trinitron monitors: if you wanted to be mean to someone, you only had to point out the two faint black horizontal lines across the screen; that would drive them mad.
    Nonsense.
    Last edited by sanlyn; 20th Jan 2013 at 16:02.
  8. Member brassplyer
     Join Date: Apr 2008
     Location: United States
    Originally Posted by sanlyn
    None of that explains why gamers and graphics pros themselves don't use DVI-I.
    I've never taken a survey, but you don't think gamers like DVI?

    What makes me wonder is that this particular video card - GTX 460 Cyclone - is certainly a gamer's card and has a DVI port.


