VideoHelp Forum




  1. Member
    Join Date
    Dec 2006
    Location
    United States
    Hi people.

    Just to let you know I'm new here. This is my first post so I would like to say hello to everyone.

    I have quite a problem which I'm trying to solve. Perhaps someone has had this issue already and the question has been answered.

    So... Here is my problem.

    I have a Dell Dimension with a 2.8GHz dual-core Pentium D, 2GB of RAM and a 256MB Nvidia video card. I'm trying to connect this to a 46" Sony Bravia KDL-46S2000 LCD HDTV. This was a success via the VGA port. Everything looks fine, but here is the problem: when I play DVDs, they are blurry like nothing I've ever seen before. I realized that VGA is an analog transfer. My HDTV has an HDMI input and my PC has a DVI output. So... I ordered a DVI to HDMI Monster cable to see if that was the problem. Well... I connected it and it was even worse. The image was overscanned and it was flickering. Do you think it is the video card?

    All I'm trying to do here is get DVDs playing at 1080i via my PC. As for the DVI to HDMI... I really don't understand why it is not working. Even Microsoft's website states that you can connect a PC to an HDTV via a DVI/HDMI cable. Unfortunately it's not working. If it doesn't work, then why do they make DVI-to-HDMI cables?

    Your help would be appreciated.
  2. Member Krispy Kritter's Avatar
    Join Date
    Jul 2003
    Location
    St Louis, MO USA
    Video output from the PC is usually at your desktop resolution, so your desktop resolution may need to be changed. Most TVs have a limited number of input resolutions and frequencies. You will need to find out what your TV supports and match your PC output to your TV. For best results, you will want to use the native resolution of your LCD.
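    If you want to see exactly which modes your display card driver is willing to output, a small Win32 program can list them. This is only a minimal sketch (it assumes Windows and a compiler with the Platform SDK; the file name and output format are made up for the example):

        /* listmodes.c -- print every display mode the primary adapter reports */
        #include <windows.h>
        #include <stdio.h>

        int main(void)
        {
            DEVMODE dm;
            DWORD i = 0;

            ZeroMemory(&dm, sizeof(dm));
            dm.dmSize = sizeof(dm);

            /* EnumDisplaySettings walks the mode list exposed by the driver */
            while (EnumDisplaySettings(NULL, i, &dm)) {
                printf("%lux%lu, %lu-bit, %lu Hz\n",
                       dm.dmPelsWidth, dm.dmPelsHeight,
                       dm.dmBitsPerPel, dm.dmDisplayFrequency);
                i++;
            }
            return 0;
        }

    A mode that doesn't show up in this list generally can't be selected in Windows (short of adding a custom resolution through the driver's control panel), so it's a quick way to see what you have to work with.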
    Google is your Friend
  3. Member
    Join Date
    Dec 2006
    Location
    United States
    The maximum resolution supported by the TV is 1366x768. The PC input is 1360x768 at 60Hz.

    The TV is 720p/1080i. If I'm connecting via VGA, everything is fine. I do get the resolution I need, but it's the DVD playback that is really bothering me. That's why I bought the DVI to HDMI cable. This connection is not working for some reason. I would like to find out what I can do about it.

    I think this problem goes beyond a resolution issue, as I have already tried that. Could an Nvidia card with PureVideo help?
  4. Always Watching guns1inger's Avatar
    Join Date
    Apr 2004
    Location
    Miskatonic U
    Well, for starters, the TV can only display 1080i after scaling it down to 768 lines, and only through HDMI or component. VGA is 1360x768; 720p/1080i don't even enter into the equation when using VGA.

    Second, most (all) software DVD players that I have seen produce soft output. This includes PowerDVD and WinDVD. If you want good quality DVD playback on your Bravia, get a good quality DVD player with HDMI output and hardware upscaling.
    Read my blog here.
  5. Member edDV's Avatar
    Join Date
    Mar 2004
    Location
    Northern California, USA
    Is the desktop clear with VGA 1360x768 for other than DVD playback? Do documents and stills look similar to your computer monitor set to similar resolution?

    A VGA connection won't handle interlace sources. Everything must be deinterlaced in the software player or in your display card hardware. Try PowerDVD and/or VLC and experiment with deinterlace settings. Progressive movie DVDs should play well. Some display cards default to a somewhat jerky 30 fps rather than 60 fps for movies. For VGA, playback should look similar to what you see on the computer monitor. "Full screen" should scale the playback from 720x480 to 1360x768 (output display resolution).

    Not all display cards can be configured for 1080i. For previous generation Nvidia cards (not "purevideo"), the connection would likely be by analog component connection for 480i or 1080i. The software player would need to support interlace playback with NVidia cards. I haven't set up an Nvidia card recently. Maybe someone can help with the settings.

    DVI out will most likely be progressive only with that card. Check the TV manual and see what resolutions and formats the HDMI input supports. Most likely it will only be 480p, 720p or 1080i. Try 1280x720p @ 59.94 or 60 Hz.
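    If you want to check ahead of time whether the card will even accept 1280x720 @ 60 Hz, the Win32 ChangeDisplaySettings call can test a mode without switching to it. A rough sketch only (CDS_TEST just validates the mode; pass 0 instead to actually switch the desktop):

        /* test720p.c -- ask the driver whether it accepts 1280x720 @ 60 Hz */
        #include <windows.h>
        #include <stdio.h>

        int main(void)
        {
            DEVMODE dm;

            ZeroMemory(&dm, sizeof(dm));
            dm.dmSize = sizeof(dm);
            dm.dmPelsWidth = 1280;
            dm.dmPelsHeight = 720;
            dm.dmDisplayFrequency = 60;
            dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY;

            /* CDS_TEST validates the mode without applying it */
            if (ChangeDisplaySettings(&dm, CDS_TEST) == DISP_CHANGE_SUCCESSFUL)
                printf("1280x720 @ 60 Hz accepted by the driver\n");
            else
                printf("1280x720 @ 60 Hz rejected\n");

            return 0;
        }

    Whether the TV then accepts that signal over DVI/HDMI is a separate question, so check the TV manual as well.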
  6. Member
    Join Date
    Dec 2006
    Location
    United States
    Thanks for all the replies.

    Yes, other documents, including pictures, look great at 1360x768. I also downloaded some WMV HD clips from Microsoft's website and they look amazing on the TV. It's only DVD playback that is not what I would expect. I might try a different video card.

    I also bought an HDMI DVD player with upscaling to 720p/1080i. Movies look much better, but there is still slight pixelation when you look closely at the screen. I wonder if that is normal.

    I bought the Sony DVP-NS75H.

    http://www.sonystyle.com/is-bin/INTERSHOP.enfinity/eCS/Store/en/-/USD/SY_Disp...DVD_DVDPlayers
  7. Member edDV's Avatar
    Join Date
    Mar 2004
    Location
    Northern California, USA
    That should be a good match. My DVD player always looks better than the computer connection. Mileage will vary depending on the display card and the monitor. 720p would probably be your best connection. For interlace DVD, the deinterlace burden is on the player at 720p.
    Recommends: Kiva.org - Loans that change lives.
    http://www.kiva.org/about
  8. Member
    Join Date
    Dec 2006
    Location
    Austria
    I have a P4 3.0GHz with a 256MB Nvidia GeForce 6600 video card hooked up with a DVI to HDMI cable to an LG 50PC1R 50" plasma, and everything played very well (crisp, clear image) after I set nView Display Settings to Clone display mode and changed the resolution of the secondary display (the plasma) to 1024x768 / 60Hz, which I found works best for my plasma, although it can go up to 1360x768 (the PC monitor is 1680x1050 / 60Hz).

    The only problem I encounter is that my plasma has only 4:3 or 16:9 display modes on the HDMI inputs (unlike S-Video, where besides those modes I also have 14:9, Full, Zoom 1, Zoom 2... where, needless to say, image quality is far worse than HDMI). Except for HDTV 720p or 1080i files, which are displayed full screen, all other movie files, even though they are in 16:9 format (DivX, XviD, MPEG, even DVDs played from my PC DVD-RW or mounted in a virtual drive), are displayed either in 4:3 mode with two large stripes on the sides, or in 16:9 mode with two large stripes at the top and bottom, compressing the image and making, for example, thicker heads, which steals the pleasure of fully enjoying such an awesome display.

    I guess it has to do with upscaling from a smaller resolution to 720/1080, but I couldn't find any software that can do this.

    I use only my PC as a video source for the plasma, so if anyone can point me to some software that can solve this problem, I would appreciate it very much.
  9. Member
    Join Date
    Jan 2007
    Location
    Egypt
    Hi there.

    I'm having a similar problem here. I've got a 26" Philips HD Ready widescreen LCD flat TV.
    The last few days I've been using the TV as a PC monitor, and all is good with VGA at 1280x768. Today I got a DVI 400 Monster cable, connected everything, started my PC and got into Windows with no problem, except the screen is all blurry and flickering.
    I changed the resolution to 1600x1200, same thing, and I also ran the Nvidia TV setup wizard, which didn't help at all.

    I've got a GeForce 7800GTX graphics card with DVI output, 2GB of RAM and an AMD Athlon 64 3500+ CPU.

    I'm really not sure what the problem is and would really like to use this TV to its full HD potential. I'm a gamer and like things cranked up to full settings and nice high resolutions.

    Anyone got any ideas?

    Reverting back to VGA for the moment.

    Thanks
  10. Member edDV's Avatar
    Join Date
    Mar 2004
    Location
    Northern California, USA
    Over DVI/HDMI most LCD TV sets will be expecting 480p, 720p or in some cases 1080i/1080p. 1080i must be deinterlaced in the HDTV before progressive display.

    Native display resolutions for most LCD-TV are 1280x768 or 1366x768, so 1280x720p is the closest match.

    The display card needs to be set to 720p in the various TV modes depending on the card (e.g. ATI "Theater or AVIVO", NVidia "PureVideo") for best DVI/HDMI match. You need to read the support info for your display card to get in the correct modes for 720p/1080i out.

    If you are looking to use the TV as a computer monitor, overscan becomes an issue. Most HDTV sets overscan over the DVI/HDMI connector, forcing the need to zoom the desktop back to see the edges. This mode will be in the display card settings. This zoom (downscale) will degrade the display crispness somewhat.

    For the above reasons, and to make the computer/game interface easier for the average Joe, many LCD-TV sets have VGA connections that behave like a typical computer monitor, responding to standard VESA extended XGA resolutions like 1024x768, 1280x768 or sometimes SXGA 1280x1024 (see the TV manual). These usually do not overscan, but some sets do. To get square-pixel 16:9, special WXGA resolutions have been released by the display card vendors (e.g. 1280x768, 1360x768 or 1366x768). Use the one that best matches the HDTV's VGA input specs. That will ideally get you a Windows XP desktop in 16:9 without overscan. Media players will display DVDs in 720x480 or 1280x720 windows on the desktop that can be upscaled with "Full Screen" mode, just like on a standard computer monitor.
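    To put a number on that last point, here is a rough sketch of the arithmetic a player does in "Full Screen" mode. It is only meant to illustrate the scaling, not any particular player's code:

        /* fullscreen.c -- fit a 16:9 (anamorphic) DVD picture into the desktop */
        #include <stdio.h>

        int main(void)
        {
            int desk_w = 1360, desk_h = 768;   /* desktop resolution over VGA */
            double aspect = 16.0 / 9.0;        /* the DVD is flagged 16:9 */

            int out_w = desk_w;
            int out_h = (int)(desk_w / aspect + 0.5);
            if (out_h > desk_h) {              /* too tall: fit to height instead */
                out_h = desk_h;
                out_w = (int)(desk_h * aspect + 0.5);
            }

            printf("720x480 (16:9) is shown as %dx%d on a %dx%d desktop\n",
                   out_w, out_h, desk_w, desk_h);
            return 0;
        }

    On a 1360x768 desktop that works out to roughly 1360x765, so the 720x480 source is being upscaled by the player and display card, just as it would be on a standard computer monitor.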
  11. I have my 42" Samsung setup with a 256 meg agp video card connected over VGA. I was quite certain I would get SOME loss but wasn't sure how much. Especially when I watch TV shows on it.

    One thing that greatly improved the quality of what I watch from the PC is getting higher resolution, or higher bitrate, files of the shows. It made a huge difference. Try testing that out...
  12. Member
    Join Date
    Jan 2007
    Location
    Egypt
    Hi again.

    I talked to one of my tech buddies and found out there are many different types of DVI cables. I got a DVI-D cable when I should have either a DVI-I or DVI-A cable.

    Here's a guide for those who want to find out more. http://www.directron.com/dviguide.html
  13. Member BJ_M's Avatar
    Join Date
    Jul 2002
    Location
    Canada
    DVI-A is just an analog connection, no different than VGA really... you might as well just use the VGA connector.
    "Each problem that I solved became a rule which served afterwards to solve other problems." - Rene Descartes (1596-1650)
  14. I am glad this thread came up. I was reading the previous post about how it converts the video from digital to analog.

    Which would be a better connector: VGA or DVI? (no converting DVI to HDMI or VGA to ...)
  15. Member edDV's Avatar
    Join Date
    Mar 2004
    Location
    Northern California, USA
    Originally Posted by Blue
    Hi again.

    I talked to one of my tech buddies and found out there are many different types of DVI cables. I got a DVI-D cable when I should have either a DVI-I or DVI-A cable.

    Here's a guide for those who want to find out more. http://www.directron.com/dviguide.html
    You probably don't need a different cable.
    The digital input on the TV is DVI-D or HDMI. HDMI is DVI-D + audio and control.

    DVI-I also contains DVI-A which equals the VGA pins. Usually an adapter to VGA comes with the card. That then gets plugged into the VGA connector on the TV.

    So the connection choices come down to:

    Composite (yellow) for NTSC/PAL
    S-Video for NTSC/PAL

    Analog component (YPbPr) for 480i, 576i, 480p, 576p, 720p, 1080i
    VGA* for VESA computer resolutions like 1024x768, 1280x768, 1366x768
    DVI-D/HDMI* for 480i, 576i, 480p, 576p, 720p, 1080i and sometimes 1080p

    * Note that HDTV sets only support a few resolutions and refresh rates as listed in the manual. They are not multi-sync computer monitors. When it says 720p, it means 1280x720p/59.94fps or 50fps

    http://en.wikipedia.org/wiki/DVI
    http://www.bluejeanscable.com/store/dvi/index.htm#dvianalog
  16. Member edDV's Avatar
    Join Date
    Mar 2004
    Location
    Northern California, USA
    According to this site, you should never use a DVI-I cable because they physically won't mate with a DVI-D connector (as found on some HDTV sets).
    http://www.playtool.com/pages/dvicompat/dvi.html

    There is other good info on this site but he is talking about computer monitors, not HDTV sets. Most HDTV sets don't do 1:1 pixel mapping but instead process all DVI inputs for aspect ratio - display modes, etc.

    The issues above show why VGA may be the superior computer connection method for most medium-resolution HDTV sets. It is certainly less of a hassle.
    Recommends: Kiva.org - Loans that change lives.
    http://www.kiva.org/about
  17. Member
    Join Date
    Jan 2007
    Location
    Egypt
    Today I'm gonna try my luck with the TV's composite HD ports, you know, the red, green and blue.
  18. Member BJ_M's Avatar
    Join Date
    Jul 2002
    Location
    Canada
    Those are called component (or RGB or YPbPr), not composite.
    "Each problem that I solved became a rule which served afterwards to solve other problems." - Rene Descartes (1596-1650)


