VideoHelp Forum
  1. Member (joined Dec 2008, Sweden)
    Hi!

    My LCD TV has a VGA input connector and I was wondering if it's possible to connect to it from the VGA output on my graphics card. I tried it quickly but didn't get any picture.
    Are they compatible? What do I need to think about when doing this?

    Thanks!
  2. Member yoda313 (joined Jun 2004, The Animus)
    They should work just fine. What resolution is your TV capable of? If it's a 720p set, your maximum is probably 1366x768 (that is the max on my 720p Westinghouse).

    I think the safe resolutions to try would be 800x600 up to 1366x768 (these are NTSC-region resolutions, mind you; please adjust for PAL as needed).
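    To make the trial and error concrete, here is a quick Python sketch (purely illustrative; the candidate list is just the common modes discussed in this thread) that filters modes against a panel's native size:

    # Common PC resolutions worth trying over VGA on a 720p-class panel.
    CANDIDATE_MODES = [
        (800, 600),
        (1024, 768),
        (1280, 720),
        (1280, 768),
        (1360, 768),
        (1366, 768),
    ]

    def modes_for_panel(native_w, native_h):
        """Keep only the modes that do not exceed the panel's native size."""
        return [(w, h) for (w, h) in CANDIDATE_MODES
                if w <= native_w and h <= native_h]

    for w, h in modes_for_panel(1366, 768):
        print(f"{w}x{h}  (aspect {w / h:.2f})")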
    Donatello - The Shredder? Michelangelo - Maybe all that hardware is for making coleslaw?
  3. Member edDV (joined Mar 2004, Northern California, USA)
    What is your display card?
    Recommends: Kiva.org - Loans that change lives.
    http://www.kiva.org/about
  4. Member (joined Mar 2008, Near the Beach)
    Just make sure that you have the secondary monitor enabled:
    Display Properties -> Settings
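    If you'd rather check this programmatically than through the dialog, here is a minimal sketch (Windows only, using ctypes and the Win32 EnumDisplayDevices call; illustrative, not a polished tool) that lists the outputs the driver exposes and whether each is attached to the desktop:

    import ctypes
    from ctypes import wintypes

    class DISPLAY_DEVICEW(ctypes.Structure):
        _fields_ = [
            ("cb", wintypes.DWORD),
            ("DeviceName", wintypes.WCHAR * 32),
            ("DeviceString", wintypes.WCHAR * 128),
            ("StateFlags", wintypes.DWORD),
            ("DeviceID", wintypes.WCHAR * 128),
            ("DeviceKey", wintypes.WCHAR * 128),
        ]

    DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x1

    dev = DISPLAY_DEVICEW()
    dev.cb = ctypes.sizeof(dev)
    i = 0
    while ctypes.windll.user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
        attached = bool(dev.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP)
        print(f"{dev.DeviceName}: {dev.DeviceString} (attached: {attached})")
        i += 1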
  5. Member edDV (joined Mar 2004, Northern California, USA)
    If your card is a recent NVIDIA, I went through S-Video, YPbPr, and VGA connections in this thread. Reply here, not there.
    https://forum.videohelp.com/topic344750.html
    Recommends: Kiva.org - Loans that change lives.
    http://www.kiva.org/about
  6. Member (joined Dec 2008, Sweden)
    I have a GeForce FX 5200 card.
    How do I enable a secondary monitor? I'm not gonna have two monitors connected at the same time. I'm gonna switch from my computer monitor to the TV. Do I still need to enable a secondary monitor?
  7. Super Moderator johns0 (joined Jun 2002, Canada)
    Originally Posted by yoda313
    They should work just fine. What resolution is your TV capable of? If it's a 720p set, your maximum is probably 1366x768 (that is the max on my 720p Westinghouse).
    You might need to switch to the VGA input on your LCD TV through its setup menu.

    My 37-inch Sanyo LCD says it's 1366x768 at 720p, but when I go from the computer's DVI to HDMI it gets set to 1920x1080. Any reason for this? I just read some other info that said it was due to scaling.
    I think,therefore i am a hamster.
  8. Member edDV (joined Mar 2004, Northern California, USA)
    Originally Posted by badtant
    I have a GeForce FX 5200 card.
    How do I enable a secondary monitor? I'm not gonna have two monitors connected at the same time. I'm gonna switch from my computer monitor to the TV. Do I still need to enable a secondary monitor?
    Unless your computer monitor's native resolution matches your TV's (unlikely), you will need to reset the settings each time you switch, and you will need the computer monitor connected to make the change. What is the native resolution of your TV? Or, if you don't know, what is the TV's model number? Same for the computer monitor.

    The 5200 is fairly weak but can handle dual monitors. Does yours have one or two DVI connectors?
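    A quick way to confirm what mode Windows is actually driving after a switch is to read it back. A rough ctypes sketch (Windows only, illustrative; the DEVMODEW below is deliberately truncated, with raw padding standing in for trailing fields this sketch doesn't use):

    import ctypes
    from ctypes import wintypes

    ENUM_CURRENT_SETTINGS = -1

    class DEVMODEW(ctypes.Structure):
        # Named fields up to dmDisplayFrequency, then raw padding out to
        # the structure's full 220 bytes.
        _fields_ = [
            ("dmDeviceName", wintypes.WCHAR * 32),
            ("dmSpecVersion", wintypes.WORD),
            ("dmDriverVersion", wintypes.WORD),
            ("dmSize", wintypes.WORD),
            ("dmDriverExtra", wintypes.WORD),
            ("dmFields", wintypes.DWORD),
            ("dmPositionX", ctypes.c_long),
            ("dmPositionY", ctypes.c_long),
            ("dmDisplayOrientation", wintypes.DWORD),
            ("dmDisplayFixedOutput", wintypes.DWORD),
            ("dmColor", ctypes.c_short),
            ("dmDuplex", ctypes.c_short),
            ("dmYResolution", ctypes.c_short),
            ("dmTTOption", ctypes.c_short),
            ("dmCollate", ctypes.c_short),
            ("dmFormName", wintypes.WCHAR * 32),
            ("dmLogPixels", wintypes.WORD),
            ("dmBitsPerPel", wintypes.DWORD),
            ("dmPelsWidth", wintypes.DWORD),
            ("dmPelsHeight", wintypes.DWORD),
            ("dmDisplayFlags", wintypes.DWORD),
            ("dmDisplayFrequency", wintypes.DWORD),
            ("_tail", ctypes.c_byte * 32),
        ]

    dm = DEVMODEW()
    dm.dmSize = ctypes.sizeof(DEVMODEW)
    if ctypes.windll.user32.EnumDisplaySettingsW(None, ENUM_CURRENT_SETTINGS,
                                                 ctypes.byref(dm)):
        print(f"Current mode: {dm.dmPelsWidth}x{dm.dmPelsHeight} "
              f"@ {dm.dmDisplayFrequency} Hz")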
    Recommends: Kiva.org - Loans that change lives.
    http://www.kiva.org/about
  9. Member edDV (joined Mar 2004, Northern California, USA)
    Originally Posted by johns0
    Originally Posted by yoda313
    They should work just fine. What resolution is your TV capable of? If it's a 720p set, your maximum is probably 1366x768 (that is the max on my 720p Westinghouse).
    You might need to switch to the VGA input on your LCD TV through its setup menu.

    My 37-inch Sanyo LCD says it's 1366x768 at 720p, but when I go from the computer's DVI to HDMI it gets set to 1920x1080. Any reason for this? I just read some other info that said it was due to scaling.
    Your display card has HDMI? Reset it to 1366x768.
    Recommends: Kiva.org - Loans that change lives.
    http://www.kiva.org/about
  10. Member yoda313 (joined Jun 2004, The Animus)
    Originally Posted by johns0
    My 37-inch Sanyo LCD says it's 1366x768 at 720p, but when I go from the computer's DVI to HDMI it gets set to 1920x1080. Any reason for this? I just read some other info that said it was due to scaling.
    Originally Posted by edDV
    Your display card has HDMI? Reset it to 1366x768.
    Actually, it sounds like johns0 is using a single-cable DVI-to-HDMI adapter. I used one for a while and was able to get 1366x768.

    I do remember that when I had NVIDIA I could only set it to 1360x768 until I got updated drivers that allowed the full 1366x768.

    @johns0: if you are only getting 1080i and not the native 1366x768, it might be something to do with the graphics card. Make sure Display Properties is set to 1366x768, or as close to that as you can get.

    The only catch is that your TV might not have pixel-for-pixel capability, or you might have to select it manually. I remember reading in other threads here that it goes by different names on different brands, so you may have to dig through your TV manual to find the right setting for 1:1 pixel mapping, or turn off autoscaling if that is possible.


    If all else fails, make sure you have the latest graphics drivers for your card. If it's not outputting 1366x768, something may be misconfigured at a level you can't reach from the standard dialogs.
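    One way to see why a card jumps to 1920x1080 over HDMI is to look at what the TV's EDID advertises. A rough Python sketch (illustrative; it assumes you have already dumped the 128-byte EDID block to a file named edid.bin with a tool of your choice, and it follows the standard EDID detailed-timing-descriptor layout):

    # Parse the preferred (first) detailed timing descriptor from a
    # 128-byte EDID dump saved to edid.bin (hypothetical file name).
    def preferred_mode(edid: bytes):
        dtd = edid[54:72]                             # first timing descriptor
        pixel_clock_khz = int.from_bytes(dtd[0:2], "little") * 10
        h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)    # low byte + high nibble
        v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)
        return h_active, v_active, pixel_clock_khz

    with open("edid.bin", "rb") as f:
        w, h, clk = preferred_mode(f.read(128))
    print(f"Panel's preferred mode: {w}x{h} (pixel clock {clk} kHz)")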

    Good luck.
    Donatello - The Shredder? Michelangelo - Maybe all that hardware is for making coleslaw?
  11. Super Moderator johns0 (joined Jun 2002, Canada)
    If I'm getting 1920x1080i on my LCD TV from my video card, then it's 540p, I guess.
    I think,therefore i am a hamster.
  12. Member edDV (joined Mar 2004, Northern California, USA)
    Originally Posted by johns0
    If I'm getting 1920x1080i on my LCD TV from my video card, then it's 540p, I guess.
    Maybe*. It depends on your HDTV's native resolution and how the TV's processor handles 1080i input. Typically 1080i goes through deinterlacing (potentially destructive) or "cinema" inverse telecine, then gets scaled to the native resolution (typically 1366x768). Even if the HDTV is 1920x1080p, the image still gets upscaled for overscan unless the pixel-for-pixel option is selected.

    Pixel for pixel gives the clearest computer picture but allows video edge defects to become visible.

    * 960x540p (quarter size) is typical for CRT and some projection HDTV systems. Actual resolution is often less due to overscan and display limitations.
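    To put rough numbers on that scaling (a quick Python sketch; the 5% overscan figure is a typical assumption, not a measured value):

    src_w, src_h = 1920, 1080
    field_h = src_h // 2                  # each interlaced field is 1920x540
    panel_w, panel_h = 1366, 768          # common "720p" LCD panel

    print(f"One 1080i field: {src_w}x{field_h}")
    print(f"Scale to panel: {panel_w / src_w:.3f} horizontal, "
          f"{panel_h / src_h:.3f} vertical")

    overscan = 1.05                       # assumed ~5% overscan blow-up
    print(f"With overscan the picture is scaled to roughly "
          f"{int(panel_w * overscan)}x{int(panel_h * overscan)} and cropped")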
    Recommends: Kiva.org - Loans that change lives.
    http://www.kiva.org/about
  13. Member (joined Dec 2008, Sweden)
    The TV is a Sharp LC-32SA1E. The port says "Analogue RGB (PC/Component)". I'll try lowering the resolution and starting it up.

    Should work, right?
  14. Member edDV (joined Mar 2004, Northern California, USA)
    Your manual should list accepted PC-Port (VGA) resolutions.

    Follow the instructions in the manual.
    Recommends: Kiva.org - Loans that change lives.
    http://www.kiva.org/about
  15. Member (joined Jul 2011, New Jersey)
    NVIDIA cards seem to have more comprehensive management software bundled with the driver itself. However, if you have an older card, such as the four-year-old ATI card I am working with, you can download a free program called PowerStrip. It lets you set custom resolutions and refresh rates, essentially overriding settings that Windows would not allow. You can also tweak the aspect ratio with the advanced timing options, so when you find the resolution that works best for your particular LCD TV, you can adjust it vertically and horizontally until nothing is cut off at any edge of the screen.

    I am connecting via VGA. I don't know whether DVI would be a better choice for getting the resolution, refresh rate, and aspect ratio as crisp and as close to native as possible, but PowerStrip has done wonders for me. The free version is more than enough to accomplish the customization most users are looking for when optimizing an LCD monitor or HDTV connected to a PC.

    I should also note that I use a splitter, so the image appears on both my HDTV and my monitor, and each display carries its own resolution even though I am not extending the desktop, just cloning the same image.

    I am no expert, but after months of straining to get my TV to display my PC at its native resolution, PowerStrip finally let me get it almost perfect. If my graphics card weren't so old, I wouldn't have had to resort to a third-party solution, but anyone still having issues should give it a shot. Note: if you ever make a change in PowerStrip and the image never reappears (the screen stays black), just hit Escape and it will undo whatever change was incompatible with your card/TV. Hope this helps someone.
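    For the curious, the heart of what custom-resolution tools like PowerStrip juggle is timing arithmetic: total pixels per frame (active plus blanking) times refresh rate gives the pixel clock the card must drive. A rough Python sketch, with blanking fractions that are illustrative assumptions rather than values from any timing standard:

    def pixel_clock_mhz(h_active, v_active, refresh_hz,
                        h_blank_frac=0.20, v_blank_frac=0.05):
        # Total size = active pixels plus assumed blanking overhead.
        h_total = h_active * (1 + h_blank_frac)
        v_total = v_active * (1 + v_blank_frac)
        return h_total * v_total * refresh_hz / 1e6

    # e.g. a 1366x768 @ 60 Hz custom mode needs roughly:
    print(f"{pixel_clock_mhz(1366, 768, 60):.1f} MHz")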


