VideoHelp Forum
  1. I recently built myself a media center PC to use with my 52" HDTV/monitor. I used a GeForce4 MX 440 video card with three outputs: VGA, S-Video, and DVI-D. The quality of the S-Video output on the TV is questionable, and I didn't want to use a separate MPEG decoder card because of the limitation of playing only supported MPEG files.
    I ran the DVI-D output from the video card to the DVI input on my HDTV. When the display is set to 640x480, the picture on the TV is crystal clear and as good as a computer monitor, but at any higher resolution the quality is no better than S-Video: text is unreadable and icons are blurry. Any ideas?
  2. Member yoda313's Avatar
    Join Date
    Jun 2004
    Location
    The Animus
    Hello,

    I don't have an HDTV, but what is the native resolution of your set? I know DVDs are 720x480 (NTSC), which is just above 640x480, so going higher than that may be part of your problem.

    Check your TV manual and your video card for HDTV parameters. I'm sure there must be a rule of thumb for displaying PC images on an HDTV. See if the HDTV has an input option for monitor resolutions. Good luck.

    Kevin

    --Come to think of it, I remember that 1080i/p is the max for HDTV signals, so it SHOULD be able to go higher. Keep hunting, and good luck---
    Donatello - The Shredder? Michelangelo - Maybe all that hardware is for making coleslaw?
  3. Yes, HDTV resolution is normally 1080i; the video card detects the max supported resolution of my HDTV as somewhere around 1908, I think. What is the difference between i and p?
  4. Member
    Join Date
    Apr 2002
    Location
    Houston, Texas
    There are two types of component video displays: interlaced and progressive. Your HDTV can be either i or p, or both. Most new DVD players output progressive component video. You need three high-quality component cables to connect your source to your display. If you have an HDTV, you should make good use of component video, which is superior to S-Video and composite video.
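    To make the i/p distinction concrete, here is a minimal Python sketch (my own illustration, not from any poster's setup): an interlaced signal carries each frame as two half-height fields, which the display weaves back into a full frame.

```python
# Sketch: how an interlaced frame is rebuilt from two fields.
# Illustrative only -- real deinterlacers are far more elaborate.

def weave_fields(top_field, bottom_field):
    """Interleave a top field (even scanlines) and a bottom field
    (odd scanlines) into one full frame."""
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.append(top_line)     # lines 0, 2, 4, ...
        frame.append(bottom_line)  # lines 1, 3, 5, ...
    return frame

# A 480i signal carries two 240-line fields per frame:
top = [f"even line {i}" for i in range(240)]
bottom = [f"odd line {i}" for i in range(240)]
assert len(weave_fields(top, bottom)) == 480
```

    A progressive (p) signal simply sends all the lines of each frame in order, so no weaving step is needed.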
  5. Yes, most everyone knows that. I was talking about DVI, not component, composite, or S-Video, and about output from a video card, not a standalone DVD player.
  6. I encountered a similar situation with my Moxi by Digeo DVR (http://www.digeo.com) and a Sony 50" LCD RPTV HDTV connected through a component video (analog) cable. The DVR can be configured to output 480i, 480p, 720p, or 1080i, while the HDTV auto-senses and switches among these resolutions. If I set the DVR to output 480i, SDTV programs look crystal clear.

    If I set the DVR to output 1080i, 1080i HDTV programs look amazingly clear, but SDTV programs become significantly blurred. Similar to your case, isn't it?

    Here's my guess.

    I guess this is due to the different interpolation algorithms the DVR and the HDTV use to "promote" 480i signals to 1080i. Inside the HDTV, every input program eventually has to be displayed at 1080i regardless of its original resolution, unless the HDTV internally adjusts its optics, which I don't believe is the case. This is one way LCD/DLP/plasma TVs, which have a fixed number of pixels, differ from CRT monitors, which can actually vary their number of scan lines.

    Now, converting a 480i input to 1080i requires making 9 vertically adjacent pixels (i.e., horizontal lines) out of every 4 original ones. I would call this "interpolation." One of the simplest techniques would be to duplicate each of the first 3 lines in such a 4-line group and triple the last one, giving 9 lines in total. This is simple, but probably not great in quality (I don't know how poor). To achieve better quality, there has to be much more complicated digital signal processing, called "filtering," and there are a variety of algorithms giving different output quality. I guess the HDTV knows the best interpolation for itself, because it knows the overall characteristics of its own screen and the resulting effects, while my DVR and your video card do only generic, simple interpolation, resulting in poor picture quality.
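    The naive line-duplication scheme described above can be sketched in a few lines of Python (an illustration of the idea only, not how any particular TV or card actually scales):

```python
# Sketch of the naive "duplicate lines" upscale: every group of
# 4 source scanlines becomes 9 output scanlines (480 * 9/4 = 1080).
# The first 3 lines of each group appear twice and the last line
# three times (2 + 2 + 2 + 3 = 9).

def duplicate_upscale(lines):
    """Upscale a list of 480 scanlines to 1080 by repetition."""
    assert len(lines) % 4 == 0
    out = []
    for i in range(0, len(lines), 4):
        a, b, c, d = lines[i:i + 4]
        out += [a, a, b, b, c, c, d, d, d]
    return out

src = list(range(480))          # stand-in for 480 scanlines
assert len(duplicate_upscale(src)) == 1080
```

    The uneven repetition (some lines doubled, others tripled) is exactly why this kind of scaling looks blocky; a filtering approach would instead blend neighboring lines with weighted averages.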

    So, my conclusion is to feed the HDTV the program's original resolution: if a recorded TV show is 1080i HD, use 1080i; otherwise use 480i for NTSC recordings, and leave the interpolation to the TV. Can your video card do this?

    Well, I am not an expert. This is just my guess and I may be totally wrong.

    hiro
  7. Member edDV's Avatar
    Join Date
    Mar 2004
    Location
    Northern California, USA
    You need to configure your card to output a standard the HDTV will understand. Typically these are
    480i.....720x480 interlaced.....30 fps
    480p.....720x480 progressive...60 fps
    1080i...1920x1080 interlaced
    and sometimes
    720p....1280x720 progressive (expensive sets)
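    The standard modes above can be written as a small lookup table; the helper below is a hypothetical illustration (not part of any real card's software) of picking the highest-resolution mode a given set supports:

```python
# Standard ATSC-style output modes: (width, height, scan type, fps).
# The fps entries follow the table above; 1080i's rate is omitted
# there, so it is marked None here rather than guessed.
MODES = {
    "480i":  (720, 480, "interlaced", 30),
    "480p":  (720, 480, "progressive", 60),
    "720p":  (1280, 720, "progressive", 60),
    "1080i": (1920, 1080, "interlaced", None),
}

def best_mode(supported):
    """Return the supported mode with the most pixels per frame."""
    return max(supported, key=lambda m: MODES[m][0] * MODES[m][1])

assert best_mode(["480i", "480p", "1080i"]) == "1080i"
```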

    It's best to use a program called PowerStrip, but be careful with it: the wrong settings can damage the TV.
    http://www.digitalconnection.com/support/cliffnotes_17.asp


