VideoHelp Forum

  1. Foil (Member, Denver, CO, United States; joined Apr 2008)
    Hi,

    I recently rebuilt one of my machines to be a media center and I am trying to connect it to my Panasonic 1080p HDTV using a DVI-to-HDMI cable, but I'm having a problem getting it to work (black screen on TV, indicator shows no signal).

    Here's the thing, I hooked this same machine up just a couple of weeks ago, using the same cable, and had no problems getting it to run at 1920x1080. The only thing I've changed is the video card; I switched the 8800 I was previously using (which went to another machine) for a working x850XT.

    I know the DVI output on the x850XT is fine, because it hooks up to my monitor upstairs via DVI-to-VGA and works correctly. I have all the correct drivers, the BIOS settings are the same, and the cable is okay, but I don't get a signal at all while it's connected to the TV. It has me stumped.

    ...Any suggestions?
  2. aedipuss (aBigMeanie, 666th portal; joined Oct 2005)
    the card is too old. it doesn't support hdcp so the best it would hook up at is probably 720.
    --
    "a lot of people are better dead" - prisoner KSC2-303
  3. Foil (Member, Denver, CO, United States; joined Apr 2008)
    I'm not sure I understand... why does the fact that the x850XT doesn't support HDCP cause this problem?

    According to its specs, it supports all the needed resolutions (including the 1920x1080 I previously used).
  4. aedipuss (aBigMeanie, 666th portal; joined Oct 2005)
    big brother doesn't want any chance of HD signals being copied.
    http://en.wikipedia.org/wiki/HDCP
    --
    "a lot of people are better dead" - prisoner KSC2-303
  5. Foil (Member, Denver, CO, United States; joined Apr 2008)
    Thanks for the link; I fully understand why it would be a problem for an HDCP signal to be displayed on a non-HDCP display.

    But that isn't the case here, it's the other way around. The display (my new HDTV) supports HDCP if needed, but the source signal from my x850XT doesn't even use it. Why would that be a problem?

    [Edit: The problem is not that I can't play HDCP content, it's that I can't even get a signal at this point.]
  6. aedipuss (aBigMeanie, 666th portal; joined Oct 2005)
    can you try it at 720 60Hz?
    --
    "a lot of people are better dead" - prisoner KSC2-303
  7. Megahurts (Member, Redding, California; joined Jun 2002)
    Originally Posted by Foil
    I know the DVI output on the x850XT is fine, because it hooks up to my monitor upstairs via DVI-to-VGA and works correctly. I have all the correct drivers, the BIOS settings are the same, and the cable is okay, but I don't get a signal at all while it's connected to the TV. It has me stumped.

    ...Any suggestions?
    The DVI output on the x850XT has 4 analog pins that are used for VGA, which is analog.

    Are there any settings to switch the DVI output from analog to digital out?
  8. Foil (Member, Denver, CO, United States; joined Apr 2008)
    Originally Posted by aedipuss
    can you try it at 720 60Hz?
    *Sigh* You're still not understanding.

    If I could get to the graphics settings, of course I'd try all the available resolutions and refresh rates.

    However, as I've said at least twice now, I can't get anything on the HDTV. No boot screen, no BIOS setup, no OS, nothing. So there's no way for me to try to adjust resolutions and refresh rates at all.
  9. Foil (Member, Denver, CO, United States; joined Apr 2008)
    Originally Posted by Megahurts
    The DVI output on the x850XT has 4 analog pins that are used for VGA, which is analog.

    Are there any settings to switch the DVI output from analog to digital out?
    I haven't seen any, but that could certainly be it. If there is such a setting, would it be in the card's BIOS or something?

    I'll take a look and see what I can find.

    [Edit: I just verified that the x850XT's DVI is "DVI-I" (supports the four analog pins). The DVI-to-HDMI cable doesn't have those pins, so it looks like you're probably correct. Heck, maybe it's a jumper on the card or something. Thanks for the tip!]
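    [For anyone who wants to verify what the display end is asking for: the TV's EDID block advertises whether it expects a digital or analog input. A minimal sketch of checking that bit, assuming a Linux host that exposes the EDID under sysfs (the path below is illustrative and varies by card and port):]

        # Minimal sketch: read a connected display's EDID and check whether
        # it reports a digital (DVI-D/HDMI) or analog (VGA) input.
        # Assumes a Linux host; the sysfs path is hypothetical and varies
        # by card and connector.
        EDID_PATH = "/sys/class/drm/card0-HDMI-A-1/edid"

        with open(EDID_PATH, "rb") as f:
            edid = f.read()

        # All valid EDID blocks start with this 8-byte header.
        assert edid[:8] == b"\x00\xff\xff\xff\xff\xff\xff\x00", "not a valid EDID"

        # Byte 20 is the "video input definition"; bit 7 set means digital input.
        if edid[20] & 0x80:
            print("Display advertises a digital (DVI-D/HDMI) input")
        else:
            print("Display advertises an analog (VGA) input")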
  10. Foil (Member, Denver, CO, United States; joined Apr 2008)
    Well, I couldn't find any digital/analog settings, at least not in the motherboard BIOS, and there are no jumpers or anything on the card itself. Any other ideas, guys?

    I did find an S-Video-to-component (YPbPr) adapter that came with the card, but I haven't tried it yet. Any idea if that could carry a full 1080p signal?

    Thanks in advance for your help!
  11. edDV (Member, Northern California, USA; joined Mar 2004)
    Your Panasonic 1080p HDTV should list the resolutions and frame rates accepted in the manual or at the support web site. There will be separate lists for the HDMI and VGA inputs. The VGA inputs are usually a no-brainer and result in an image with no overscan. This is particularly useful for desktop or game display.

    HDMI inputs nearly always overscan and usually accept a more restricted list of resolutions. In all cases 60Hz framerates should be used.

    ATI cards of that vintage generally assumed analog component for HD connection. The card either came with an analog component cable or used an adapter connected to a DVI-I out. The card internally switched from VGA RGB mode to 1080i/720p/480p YPbPr when the adapter was used.

    HD 1080p requires a top-end card with good MPEG-2 and/or MPEG-4 decoding and heavy-duty deinterlace performance for 480i/1080i source.
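    [As an aside, the modes a display will accept can be dumped straight from the EDID handshake rather than dug out of the manual. A minimal sketch, assuming a Linux environment with xrandr installed; on Windows the Catalyst control panel exposes the same list:]

        import re
        import subprocess

        # Minimal sketch: list every mode the attached display advertises
        # and flag the ~60Hz ones, the safe choice for HDTV inputs.
        # Assumes a Linux host with xrandr on the PATH.
        out = subprocess.run(["xrandr", "--query"], capture_output=True,
                             text=True, check=True).stdout

        for line in out.splitlines():
            match = re.match(r"\s+(\d+x\d+[a-z]*)\s+(.*)", line)
            if match:
                resolution, rates = match.groups()
                has_60 = any(59.0 <= float(r) <= 60.5
                             for r in re.findall(r"\d+\.\d+", rates))
                print(resolution, "(~60Hz available)" if has_60 else "")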
    Recommends: Kiva.org - Loans that change lives.
    http://www.kiva.org/about
  12. Foil (Member, Denver, CO, United States; joined Apr 2008)
    Hmm, thanks for the info.

    I've been avoiding using the VGA input, as this model (PT-56LCZ70) happens to be limited to something like 1280x1024 on that input. I may end up doing that, though.

    Otherwise, it looks like I'll need to use the S-Video-to-YPbPr adapter that came with the card (which seems to be the one intended for HDTV connections), although that means I'll have to switch things around because I'm already out of component inputs.

    I'll post tomorrow when I can get back to this project.

    Thanks again, guys!
  13. jagabo
    Originally Posted by edDV
    HD 1080p requires a top-end card with good MPEG-2 and/or MPEG-4 decoding and heavy-duty deinterlace performance for 480i/1080i source.
    A US$75 nvidia 8500GT is sufficient. I use an 8600GT via DVI->HDMI cable, 1080p60.
  14. Foil (Member, Denver, CO, United States; joined Apr 2008)
    Originally Posted by jagabo
    A US$75 nvidia 8500GT is sufficient.
    If I had the extra $75, yeah, I'd do that in a heartbeat. But I don't; I'm just using components that I already own for now.

    (I'm aware I won't be able to play HDCP content with this card, but that's fine for now, since I won't be able to get a Blu-Ray drive for a while, either.)
  15. Video Head (Member, reality; joined May 2007)
    Originally Posted by Foil
    Hmm, thanks for the info.

    I've been avoiding using the VGA input, as this model (PT-56LCZ70) happens to be limited to something like 1280x1024 on that input. I may end up doing that, though.

    Otherwise, it looks like I'll need to use the S-Video-to-YPbPr adapter that came with the card (which seems to be the one intended for HDTV connections), although that means I'll have to switch things around because I'm already out of component inputs.

    I'll post tomorrow when I can get back to this project.

    Thanks again, guys!
    That is not an S-Video to component video adapter. ATI uses a DIN-type connector on their cards for video in and video out to save real estate. If you connect the component dongle you get component HD; if you connect the composite/S-Video dongle you get composite/S-Video. It may look like an S-Video port, but it is not. Also, if the component dongle has an additional orange connector, it is a coaxial S/PDIF out for Dolby Digital or DTS direct to your surround sound decoder/amplifier.
  16. edDV (Member, Northern California, USA; joined Mar 2004)
    Originally Posted by Foil
    Originally Posted by jagabo
    A US$75 nvidia 8500GT is sufficient.
    If I had the extra $75, yeah, I'd do that in a heartbeat. But I don't; I'm just using components that I already own for now.

    (I'm aware I won't be able to play HDCP content with this card, but that's fine for now, since I won't be able to get a Blu-Ray drive for a while, either.)
    I consider the 8500GT to be a high end card although the 8600GT has more video features. The x800 would be a low end card.

    You still may be able to connect on the HDMI port if you use the resolutions specified in the TV manual.
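    [For what it's worth, the x850XT's single-link DVI should have the raw bandwidth for 1080p60; the arithmetic is simple to sanity-check. A minimal sketch, assuming the TV uses the standard CEA-861 timing for 1920x1080@60:]

        # Sanity check: does 1080p60 fit within single-link DVI bandwidth?
        # Uses the standard CEA-861 total timing for 1920x1080@60
        # (2200x1125 including blanking); 165 MHz is the single-link
        # TMDS clock ceiling.
        H_TOTAL, V_TOTAL, REFRESH = 2200, 1125, 60
        SINGLE_LINK_LIMIT_MHZ = 165.0

        pixel_clock_mhz = H_TOTAL * V_TOTAL * REFRESH / 1e6
        print(f"pixel clock: {pixel_clock_mhz:.1f} MHz")  # 148.5 MHz
        print("fits single-link DVI:", pixel_clock_mhz <= SINGLE_LINK_LIMIT_MHZ)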
    Recommends: Kiva.org - Loans that change lives.
    http://www.kiva.org/about
  17. John (Member, Edinburgh; joined Mar 2003)
    What about going back to basics? Bring your monitor down to your PC and TV, keep the HDMI plugged in, and when it boots up on your monitor, make sure the desktop is cloned. It may be as simple as the default output being the DVI.
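    [On a Linux live disc the same "clone it first" idea is a one-liner; a minimal sketch, with the output names DVI-0 and HDMI-0 purely illustrative. On Windows, the Catalyst control panel's Clone mode does the equivalent:]

        import subprocess

        # Minimal sketch of "clone the desktop": drive both outputs with
        # the same image so the working monitor can be used to fix up the
        # TV output. Assumes Linux with xrandr; output names are illustrative.
        subprocess.run(["xrandr",
                        "--output", "DVI-0",  "--mode", "1280x1024",
                        "--output", "HDMI-0", "--same-as", "DVI-0"],
                       check=True)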
  18. jagabo
    Originally Posted by edDV
    I consider the 8500GT to be a high end card
    I think most people consider a high end graphics card to be a ~$500 card. $75 is low end, but not bottom fishing.
  19. Foil (Member, Denver, CO, United States; joined Apr 2008)
    Originally Posted by John
    What about going back to basics, bring your monitior down to your pc, tv. keep the hdmi plugged in and when it boots up on your monitor, make sure the desktop is cloned. it maybe just as simple that the default output is dvi.
    I had that idea too, although I hadn't tried it because I don't know of a way to change the 'default' output. I can set the 'primary monitor' in Windows, but that wouldn't fix the problem that I can't get any DVI output at all before the OS even loads. [I'll try it, though.]

    P.S. You guys have been extremely helpful, more than I expected.
  20. Foil (Member, Denver, CO, United States; joined Apr 2008)
    Originally Posted by edDV
    You still may be able to connect on the HDMI port if you use the resolutions specified in the TV manual.
    That's the thing. I already know I can connect at 1920x1080x60Hz to this TV via the HDMI port I'm using (I did it with my 8800), and I know the x850XT supports that resolution.

    But there's no way for me to play with resolutions and refresh rates when I can't get it to connect in the first place.
  21. Foil (Member, Denver, CO, United States; joined Apr 2008)
    Originally Posted by Video Head
    That is not an S-Video to component video adapter.
    ...It may look like an S-Video port but it is not.
    Ah, thanks for the info!

    It didn't quite make sense to me that an S-video signal could be converted to component by a dongle, so that explains it.

    That brings up a question for me, then: If I can get both the DVI and component outputs working, which one is preferable, from your experience?
  22. edDV (Member, Northern California, USA; joined Mar 2004)
    Originally Posted by jagabo
    Originally Posted by edDV
    I consider the 8500GT to be a high end card
    I think most people consider a high end graphics card to be a ~$500 card. $75 is low end, but not bottom fishing.
    I'm not a serious gamer, so I look at these cards for their video display capability rather than unused 3D rendering capability. In that regard (e.g. PureVideo features) the 8600GT lands somewhere between upper mid-range and low high-end.
    http://www.nvidia.com/page/purevideo_support.html
    Recommends: Kiva.org - Loans that change lives.
    http://www.kiva.org/about
  23. edDV (Member, Northern California, USA; joined Mar 2004)
    [quote="Foil"]
    Originally Posted by edDV

    But there's no way for me to play with resolutions and refresh rates when I can't get it to connect in the first place.
    You can in dual-monitor mode. Use the computer monitor to set the TV modes on the second display.

    Is your goal to use the TV as your only monitor? In that case VGA without overscan makes more sense. HDMI and analog component will overscan.
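    [Once the TV shows up as a second display, a specific mode can be forced; the modeline numbers behind that come from simple arithmetic. A minimal sketch, using CVT reduced-blanking totals for 1920x1080 as illustrative figures; a real modeline would come from a tool like cvt:]

        # Minimal sketch of the arithmetic behind a custom display mode:
        # pixel clock = horizontal total x vertical total x refresh rate.
        # Totals below are CVT reduced-blanking values for 1920x1080 and
        # are illustrative; generate a real modeline with the cvt tool.
        h_active, h_total = 1920, 2080
        v_active, v_total = 1080, 1111
        refresh_hz = 60.0

        pixel_clock_mhz = h_total * v_total * refresh_hz / 1e6
        print(f"{h_active}x{v_active}: {pixel_clock_mhz:.2f} MHz")  # ~138.65 MHz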
    Recommends: Kiva.org - Loans that change lives.
    http://www.kiva.org/about
  24. edDV (Member, Northern California, USA; joined Mar 2004)
    Originally Posted by Foil
    Originally Posted by Video Head
    That is not an S-Video to component video adapter.
    ...It may look like an S-Video port but it is not.
    Ah, thanks for the info!

    It didn't quite make sense to me that an S-video signal could be converted to component by a dongle, so that explains it.

    That brings up a question for me, then: If I can get both the DVI and component outputs working, which one is preferable, from your experience?
    The connector is a 7-to-9-pin DIN "TV port"*. ATI and Nvidia differ in pinout but support composite, S-Video, and sometimes analog component Y,Pb,Pr on those pins. The pinout also differs card to card for both, so refer to the documentation.

    DVI/HDMI generally has the advantage at the low-cost consumer level, since analog requires expensive parts for the highest quality. At the pro level, analog component and SDI are common; DVI/HDMI are not used there because they are limited to short cable lengths.


    * If the connector only has 4 pins, it is an S-Video connector.
    Recommends: Kiva.org - Loans that change lives.
    http://www.kiva.org/about
  25. Foil (Member, Denver, CO, United States; joined Apr 2008)
    Originally Posted by edDV
    Is your goal to use the TV as your only monitor? In that case VGA without overscan makes more sense. HDMI and analog component will overscan.
    Yes, I plan to use the TV as the only monitor for that machine.

    As I said before, the VGA for this TV model is limited to 1280x1024, so I'd prefer to go with the higher resolution from DVI/HDMI if I can. When I did this before with the 8800GTS card, I was able to adjust for the overscan.

    My main question now is: assuming I can get both the DVI and component outputs working, which is best?
  26. edDV (Member, Northern California, USA; joined Mar 2004)
    Originally Posted by Foil
    Originally Posted by edDV
    Is your goal to use the TV as your only monitor? In that case VGA without overscan makes more sense. HDMI and analog component will overscan.
    Yes, I plan to use the TV as the only monitor for that machine.

    As I said before, the VGA for this TV model is limited to 1280x1024, so I'd prefer to go with the higher resolution from DVI/HDMI if I can. When I did this before with the 8800GTS card, I was able to adjust for the overscan.

    My main question now is: assuming I can get both the DVI and component outputs working, which is best?
    Probably DVI. Try both. Your menus will be chopped off on all four sides. The Catalyst control panel may allow scaling the desktop down to compensate, but that scaling probably reduces quality more than using the VGA at 1280x1024 would. Try all three and use what looks best to you.

    Are you certain this TV is native 1080p? Samsungs allow 1920x1080p over VGA. They also have a mode to defeat overscan on HDMI called "Just Scan".
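    [To put numbers on the "chopped on all four sides" point, here is a minimal sketch of the overscan arithmetic. The 5% figure is a common assumption, not a spec for this particular Panasonic:]

        # Minimal sketch of the overscan arithmetic: how much of the
        # desktop survives the TV's edge crop. The 5% per-axis figure is
        # an assumption; the real amount varies by set.
        OVERSCAN = 0.05

        def visible_area(width, height, overscan=OVERSCAN):
            """Pixels still on screen after the TV crops `overscan` per axis."""
            return round(width * (1 - overscan)), round(height * (1 - overscan))

        print(visible_area(1920, 1080))  # -> (1824, 1026)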
    Recommends: Kiva.org - Loans that change lives.
    http://www.kiva.org/about
  27. Foil (Member, Denver, CO, United States; joined Apr 2008)
    Originally Posted by edDV
    Are you certain this TV is native 1080p? Samsungs allow 1920x1080p over VGA.
    Yep, I'm sure. It's native 1080p, but limited to 1280x1024 with the VGA input.

    Originally Posted by edDV
    They also have a mode to defeat overscan on HDMI called "Just Scan".
    I don't remember any features like that on this set, but I'll check. Thanks for the tip.
  28. Originally Posted by Foil
    Originally Posted by edDV
    They also have a mode to defeat overscan on HDMI called "Just Scan".
    I don't remember any features like that on this set, but I'll check. Thanks for the tip.
    It might be called pixel-for-pixel mode, 1:1 pixel mapping, or something similar. I know at least some Panasonic 1080p displays have it.
  29. Video Head (Member, reality; joined May 2007)
    Originally Posted by Foil
    Originally Posted by Video Head
    That is not an S-Video to component video adapter.
    ...It may look like an S-Video port but it is not.
    Ah, thanks for the info!

    It didn't quite make sense to me that an S-video signal could be converted to component by a dongle, so that explains it.

    That brings up a question for me, then: If I can get both the DVI and component outputs working, which one is preferable, from your experience?
    I have used both and cannot see a difference. But try both for yourself.
  30. Video Head (Member, reality; joined May 2007)
    Originally Posted by edDV
    Originally Posted by Foil
    Originally Posted by Video Head
    That is not an S-Video to component video adapter.
    ...It may look like an S-Video port but it is not.
    Ah, thanks for the info!

    It didn't quite make sense to me that an S-video signal could be converted to component by a dongle, so that explains it.

    That brings up a question for me, then: If I can get both the DVI and component outputs working, which one is preferable, from your experience?
    The connector is a 7-to-9-pin DIN "TV port"*. ATI and Nvidia differ in pinout but support composite, S-Video, and sometimes analog component Y,Pb,Pr on those pins. The pinout also differs card to card for both, so refer to the documentation.

    DVI/HDMI generally has the advantage at the low-cost consumer level, since analog requires expensive parts for the highest quality. At the pro level, analog component and SDI are common; DVI/HDMI are not used there because they are limited to short cable lengths.


    * If the connector only has 4 pins, it is an S-Video connector.
    If the component dongle came packaged with the video card, one can only assume it is the correct one...