VideoHelp Forum

  1. Member (Join Date: Mar 2007, Location: United States)
    Hi everyone,

    I'm new to the forum as well as the HTPC scene, so please excuse me if I sound like a newb. I recently got myself a Samsung 26" LCD HDTV and have it temporarily connected through the VGA port. I'd like some opinions on the best way to play high-def content as well as DVDs.
    The problem is that my room is somewhat space constrained, so I can't have too many separate units near my desk. My goal is to play everything through my PC. What kind of video card do you guys recommend I get? I currently have a Radeon 9200 with DVI and VGA out. I also have a DVI to HDMI cable, but that doesn't look too good (and I haven't had time to calibrate it through PowerStrip). I keep hearing that there are several graphics cards out now with HDMI out... what do you guys suggest I do? Before I spend any more $$$ on my setup, I'd like to do some research.
  2. Figure out why the DVI-to-HDMI connection doesn't look good. In terms of picture quality it should be the same as native HDMI. You probably just need to set your card to the right resolution -- the native resolution of your HDTV.
  3. Member (Join Date: Mar 2007, Location: United States)
    jagabo, excuse me if this sounds like a redundant question, but is PowerStrip the only way to get the right resolution/refresh rate from a DVI-to-HDMI connection? The native resolution of my LCD TV is 1360x768. Also, does the cable make a big impact on quality? I purchased a cheap cable off Newegg rather than the expensive one from Be$t Buy.
  4. The cable should make no difference unless it's defective.

    Normally with DVI the monitor tells the computer what resolution it is. Then you can set it using the regular Control Panel Display applet. An HDMI monitor hooked up to a DVI port probably doesn't do this. Have you tried the "List All Modes" option in the Display applet? Control Panel -> Display -> Settings tab -> Advanced button -> Adapter button -> List All Modes button.
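
    For anyone who would rather script that check than click through the applet, here is a minimal Win32 C sketch (error handling omitted; the 1360x768 target is just this TV's native resolution). It prints the same list as "List All Modes" and then asks the driver, without actually switching, whether 1360x768 is available:

    Code:
    /* List display modes and test for 1360x768 via the Win32 display API. */
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        DEVMODE dm;
        DWORD i = 0;

        ZeroMemory(&dm, sizeof(dm));
        dm.dmSize = sizeof(dm);

        /* Same list as Display -> Advanced -> Adapter -> List All Modes */
        while (EnumDisplaySettings(NULL, i++, &dm))
            printf("%lux%lu  %lu-bit  %lu Hz\n", dm.dmPelsWidth,
                   dm.dmPelsHeight, dm.dmBitsPerPel, dm.dmDisplayFrequency);

        /* CDS_TEST only asks whether the mode would work; it does not switch. */
        ZeroMemory(&dm, sizeof(dm));
        dm.dmSize       = sizeof(dm);
        dm.dmPelsWidth  = 1360;
        dm.dmPelsHeight = 768;
        dm.dmFields     = DM_PELSWIDTH | DM_PELSHEIGHT;

        if (ChangeDisplaySettings(&dm, CDS_TEST) == DISP_CHANGE_SUCCESSFUL)
            printf("Driver exposes 1360x768.\n");
        else
            printf("No 1360x768 mode -- a custom resolution (e.g. PowerStrip) would be needed.\n");
        return 0;
    }
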
  5. Member (Join Date: Mar 2007, Location: United States)
    Yes, I have tried different resolutions. My monitor supports 720p and 1080i. However, the screen appears rather grainy and text is all blurred in 720p and 1080i. And under 1080i, my screen is so large that my start button disappears into oblivion.
  6. Member (Join Date: Mar 2005, Location: United States)
    You definitely should not be losing any picture quality going DVI to HDMI. Most likely the card you bought is not a very good one. You may want to make sure you have the latest drivers loaded for that card; depending on the drivers, you can run into problems at certain stages, etc. I do not even own a high-definition card. I use an onboard Nvidia 6150 and it seems to work well.

    I just bought the DVICO Fusion RT Gold. I bought it with an antenna which works for crap. I actually just hooked it up to my old Philips and get decent reception with it. I bought a pair of bunny ears from Radio Shack two years ago which work "almost flawlessly" with this card. Depending on which station I am watching, there may be a loss of signal or a stutter. I could go out and buy myself a better antenna, but since I am in Toledo, Ohio, I don't think it would make much difference.

    Also, just to add a little more feedback: I do have an HD card in my 42-inch Samsung 720p rear-projection set which picks up these stations too. I actually seem to get a black screen more commonly with the TV, while on the capture card I get more stutters during action.
  7. Originally Posted by rvk2
    My monitor supports 720p and 1080i. However, the screen appears rather grainy and text is all blurred in 720p and 1080i.
    I think your HDTV just isn't designed to be used as a computer monitor. It resizes any input to its native resolution, leading to the blurring you're seeing. Some HDTVs have a one-to-one pixel mapping option. That would allow a 1366x768 native resolution HDTV to display a 1280x720 image centered in the display (with small black borders) with no resizing or blurring.

    Originally Posted by rvk2
    And under 1080i, my screen is so large that my start button disappears into oblivion.
    This may simply be an overscan issue. Televisions, even HDTVs, normally display the video frame slightly larger than the visible screen. This isn't a problem for video, but a computer desktop has important things at the edges -- the taskbar, title bars, etc.
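
    The rough numbers behind the two points above, as a small sketch (the 5% overscan figure is just an illustrative assumption, not a spec for this set):

    Code:
    /* Borders under 1:1 pixel mapping, and what a ~5% overscan hides. */
    #include <stdio.h>

    int main(void)
    {
        int panel_w = 1366, panel_h = 768;   /* typical WXGA panel */
        int src_w   = 1280, src_h   = 720;   /* 720p input */

        /* 1:1 mapping: the image is centered, unscaled, with black borders. */
        printf("borders: %d px left/right, %d px top/bottom\n",
               (panel_w - src_w) / 2, (panel_h - src_h) / 2);

        /* Overscan: an assumed 5% overscan shows only the middle of each axis,
           which is where the taskbar and title bars end up being cropped. */
        double overscan = 0.05;
        printf("visible desktop: about %.0f x %.0f of %d x %d\n",
               panel_w * (1.0 - overscan), panel_h * (1.0 - overscan),
               panel_w, panel_h);
        return 0;
    }
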
  8. Member edDV (Join Date: Mar 2004, Location: Northern California, USA)
    Originally Posted by rvk2
    Yes, I have tried different resolutions. My monitor supports 720p and 1080i. However, the screen appears rather grainy and text is all blurred in 720p and 1080i. And under 1080i, my screen is so large that my start button disappears into oblivion.
    OK: Samsung 26" LCD HDTV (native 1360x768) and ATI Radeon 9200.

    First, the ATI card needs a recent Catalyst driver that supports square-pixel widescreen VESA resolutions. Among them you will find XGA 1024x768 (4:3) and WXGA 1360x768 and 1366x768 (~16:9). These are intended to work over the VGA connection. The TV also expects a square-pixel VESA computer display adapter on its VGA port and usually will not overscan.

    http://en.wikipedia.org/wiki/Wide_XGA
    http://en.wikipedia.org/wiki/VGA

    If you get that working, your HDTV will behave like a 16:9 progressive computer monitor. Normal software video players can be used (e.g. PowerDVD, WinDVD, VLC). Since the display is progressive, you will have the normal issues with interlaced inputs.

    The 9200 will not support 1080i over DVI. The line in the sand is the 9550, which supports the ATI HD analog component adapter that enables 480i, 480p, 720p and 1080i output via the DVI-I port to YPbPr. The 9550 and up also have better deinterlacers for 480i tuner input, but the X1000 series are better still.

    The HDMI and analog component TV ports are problematic for computer display. The main problem is overscan, which crops the desktop edges and menus. In order to see the picture edges, the desktop needs to be downsized in the Catalyst menus, and that blurs the image. IMO the VGA port will give you a sharper picture at 1360x768, and most HDTV sets will not overscan the VGA input.
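
    A quick check of why those particular modes count as "square pixel" at their stated shapes: dividing the display aspect ratio by width/height gives a pixel aspect ratio of roughly 1.0 for each (a small sketch, nothing ATI-specific):

    Code:
    /* Pixel aspect ratio = display aspect ratio / (width / height). */
    #include <stdio.h>

    int main(void)
    {
        struct { int w, h; double dar; const char *label; } modes[] = {
            { 1024, 768,  4.0 / 3.0, "XGA  4:3"  },
            { 1360, 768, 16.0 / 9.0, "WXGA 16:9" },
            { 1366, 768, 16.0 / 9.0, "WXGA 16:9" },
        };

        for (int i = 0; i < 3; i++) {
            double par = modes[i].dar / ((double)modes[i].w / modes[i].h);
            printf("%-10s %4dx%d  pixel aspect ratio = %.3f\n",
                   modes[i].label, modes[i].w, modes[i].h, par);
        }
        return 0;   /* all three come out within ~0.4% of square */
    }
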
  9. Member (Join Date: Mar 2007, Location: United States)
    Thank you guys, you've been a great help. The VGA resolution looks much better on my TV now that I changed it to 120 DPI. I'm still trying to find a decent card that will let me play all my media through the PC.
  10. Member edDV (Join Date: Mar 2004, Location: Northern California, USA)
    Originally Posted by rvk2
    Thank you guys, you've been a great help. The VGA resolution looks much better on my TV now that I changed it to 120 DPI. I'm still trying to find a decent card that will let me play all my media through the PC.
    Is this media interlaced or progressive? If interlaced, you are best off buying an Nvidia card with "PureVideo" or an ATI card with "AVIVO". VGA vs. HDMI won't make much difference here. Since your display is progressive, deinterlacing must happen either in the display card or in the TV.
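
    Just to make "deinterlacing must happen somewhere" concrete, here is a bare-bones sketch of the simplest approach, a "bob" that keeps one field and interpolates the other. PureVideo and AVIVO do far more sophisticated motion-adaptive processing in hardware, so treat this only as an illustration of the basic idea:

    Code:
    /* Simplest bob deinterlace on one 8-bit grayscale frame:
       keep the top-field lines, rebuild the bottom-field lines by
       averaging the lines above and below. */
    #include <string.h>

    void bob_deinterlace_top(const unsigned char *in, unsigned char *out,
                             int width, int height)
    {
        for (int y = 0; y < height; y++) {
            if (y % 2 == 0) {
                /* top-field line: copy through unchanged */
                memcpy(out + y * width, in + y * width, width);
            } else {
                /* bottom-field line: interpolate from the neighbours */
                const unsigned char *above = in + (y - 1) * width;
                const unsigned char *below =
                    (y + 1 < height) ? in + (y + 1) * width : above;
                for (int x = 0; x < width; x++)
                    out[y * width + x] =
                        (unsigned char)((above[x] + below[x] + 1) / 2);
            }
        }
    }
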
  11. Member (Join Date: Mar 2007, Location: United States)
    edDV, I did a bit of research and an Nvidia card with PureVideo sounds like the best option. The media in question is mostly DVDs at the moment, but I also play some 720p-encoded .mkv files. I'm still not sure of the best way to set things up. I currently have a DVD player plugged into the YPbPr input and the PC connected to VGA. For watching DVD movies, I don't mind having the standalone DVD player; the video quality is decent, but for audio I have to use the TV speakers. I'd prefer to get rid of the standalone DVD player and play everything through my PC, primarily because my computer speakers are pretty good.
    I was thinking of getting a card that would let me connect my PC to the YPbPr input, unless anyone can suggest a better way of connecting things.
  12. Member edDV (Join Date: Mar 2004, Location: Northern California, USA)
    Originally Posted by rvk2
    edDV, I did a bit of research and an Nvidia card with PureVideo sounds like the best option. The media in question is mostly DVDs at the moment, but I also play some 720p-encoded .mkv files. I'm still not sure of the best way to set things up. I currently have a DVD player plugged into the YPbPr input and the PC connected to VGA. For watching DVD movies, I don't mind having the standalone DVD player; the video quality is decent, but for audio I have to use the TV speakers. I'd prefer to get rid of the standalone DVD player and play everything through my PC, primarily because my computer speakers are pretty good.
    I was thinking of getting a card that would let me connect my PC to the YPbPr input, unless anyone can suggest a better way of connecting things.
    Movie DVDs should play OK from the computer, but I find the quality is superior from standalone progressive-scan DVD players. I think you will find interlaced DVDs play much better from a quality progressive-scan DVD player.

    If your only problem is audio, you can route stereo audio from the DVD player to the computer speakers.

    On that TV, I don't think YPbPr or HDMI offers any advantage over VGA. Both will have overscan issues (cropped picture edges).