VideoHelp Forum
  1. Member
    Join Date: Apr 2007
    Location: United Kingdom
    Hi There

    I have scoured this website and found it very useful and informative. The many topics and replies I have read answer most of my questions, but I still have a few outstanding, so I may as well start from the beginning. I would most appreciate any replies.

    I am currently looking to buy a new computer but own a newly purchased Sony Bravia KDL-40W2000 LCD display - its inputs are listed on the Sony website - http://www.sony.co.uk/view/ShowProduct.action?product=KDL-40W2000&site=odw_en_GB&pageT...ory=TVP+LCD+TV

    But just to recap, they are:

    4 Pin (Y/C) In - S-Video

    MiniJack (Head/Earphone) (mm) 3.5

    PCMCIA Card Slot

    RCA Audio Out

    RCA AV Input Composite

    RF In YES

    Scart 1 (RGB)

    Scart 2 (RGB, Smartlink)

    HDMI x 2

    Component Video + Audio

    PC Input (15pin D-Sub) + Audio In

    My question: I am looking to use my LCD display as a second monitor and want to know the best output I can expect from a graphics card. Please note the display does not have a DVI input, although you can get a DVI-HDMI adaptor; because it is an adaptor, does it somehow decrease the quality of the signal?

    The display can handle 1080p at 1920x1080, and I am looking for the best possible quality to output to it. Is HDMI out on the graphics card going to be the best option? As DVI does not carry audio, I am slightly reluctant to use it, since I would be running more cables from the sound card to the LCD display (do the majority of sound cards have the necessary outputs to go into the inputs listed above?). Would buying a graphics card with HDMI out be the best option, as this carries audio and is presumably a direct digital connection? Do certain motherboards do better with this type of thing, i.e. do some run HDMI out through a separate graphics card better than others?

    Please bear in mind that I won't be running a great deal of 1080p through the computer; however, I would like to buy for the future and avoid having to upgrade the graphics card to accommodate future specs.

    I know this is a lot of questions, but I would be very grateful if someone could point me in the right direction.


    thanks

    Andy
  2. Member MJA's Avatar
    Join Date: Jan 2005
    Location: IL
    If you can find the MSI NX7600GT Diamond Plus GeForce 7600GT, it has an HDMI port. But if money isn't an issue, it's better to get the GeForce 8800.

    http://www.newegg.com/Product/ShowImage.aspx?Image=14-127-223-07.jpg%2c14-127-223-02.j...+Card+-+Retail
  3. Member edDV's Avatar
    Join Date: Mar 2004
    Location: Northern California, USA
    I'm sure you realize that a TV will overscan on the HDMI port. You need to read the manual to find what resolutions are supported on the port. Normal is 1280x720p or 1920x1080i. Some will accept a computer generated 1920x1080p but will display an overscanned image (~center 90-95%). This causes GUI desktops to lose their edge menus unless the entire desktop is reduced in size to fit the edges of the overscanned image. Most high end video cards (ATI AVIVO, Nvidia Purevideo) will do a quality scale but regardless, the image will be softened in the downscale.

    This is the reality of using a TV as a computer monitor. The other major issue would be flicker if 1080i was the only interface option. Deflicker filters will soften the picture more.

    For the above reasons, the current trend is to offer a VGA "computer and game" port on many of these HDTV sets that emulates a computer monitor (e.g. square pixels and no overscan). This solution comes with a selected list of allowed resolutions (see manual) in wide WXGA (e.g. 1280x768, 1366x768) or sometimes WSXGA (1440x900) or maybe WUXGA (1920x1200 or a cropped 1920x1080).

    If the above is confusing, it just shows that computer monitors obey one set of rules and HDTV sets behave differently.

    http://en.wikipedia.org/wiki/Image:Video_Standards.svg
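    To put rough numbers on the desktop shrink described above, here is a minimal Python sketch. The 5% and 10% figures are the approximate overscan ranges mentioned in this thread, not exact values for any particular set:

    ```python
    # Estimate the largest desktop area that stays fully visible when a TV
    # overscans. Assumption (from the discussion above): the TV scales the
    # incoming frame up by the overscan factor and shows only the centre crop,
    # so only 1/(1+overscan) of each axis survives.

    def safe_desktop(width, height, overscan=0.05):
        """Largest desktop that remains fully on screen for a given
        overscan fraction (e.g. 0.05 for ~5%)."""
        return int(width / (1 + overscan)), int(height / (1 + overscan))

    print(safe_desktop(1920, 1080, 0.05))  # ~5% overscan, typical LCD/plasma
    print(safe_desktop(1920, 1080, 0.10))  # ~10% overscan, typical CRT
    ```

    So with ~5% overscan a 1920x1080 desktop would need to be scaled down to roughly 1828x1028 to keep its edge menus visible, which is why the downscale softening mentioned above is unavoidable.
    
    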
  4. The Old One SatStorm's Avatar
    Join Date: Aug 2000
    Location: Hellas (Greece), E.U.
    I use the Component Video + Audio for this. IMO it looks excellent. On the other hand, my TV is a Samsung M71 LCD 40" (HD ready, not true HD).
  5. Member
    Join Date: Apr 2007
    Location: United Kingdom
    Thank you very much for your replies. Because I'm a little new to HDTV and its wonders, I will have to break down your passage so I can understand it a little better; sorry to be a pain.

    Originally Posted by edDV
    I'm sure you realize that a TV will overscan on the HDMI port. You need to read the manual to find what resolutions are supported on the port. Normal is 1280x720p or 1920x1080i. Some will accept a computer generated 1920x1080p but will display an overscanned image (~center 90-95%). This causes GUI desktops to lose their edge menus unless the entire desktop is reduced in size to fit the edges of the overscanned image. Most high end video cards (ATI AVIVO, Nvidia Purevideo) will do a quality scale but regardless, the image will be softened in the downscale.
    I'm assuming you are referring to the HDMI port on the TV. If so, the TV can handle 1920x1080p, so surely no downscale will take place? But even if this is the case: if the source material is just the usual video rubbish, nowhere near the resolution the TV can handle (i.e. random downloaded files from Bearshare etc.), surely it will have to upscale the source? Which is irrelevant anyway, as you can't turn an average-quality source into a high-res output?

    Originally Posted by edDV
    This is the reality of using a TV as a computer monitor. The other major issue would be flicker if 1080i was the only interface option. Deflicker filters will soften the picture more.
    Again, the HDMI can handle 1080p. Would this be an issue if the source is a lot lower than this anyway?

    Originally Posted by edDV
    For the above reasons, current trend is to offer a VGA "computer and game" port on many of these HDTV sets that emulates a computer monitor (e.g. square pixels and no overscan).
    The TV has a standard VGA input with a little audio input right next to it; is this what you are referring to?

    Originally Posted by edDV
    This solution is offered with a selected list of allowed resolutions (see manual) in the wide WXGA (e.g. 1280x768, 1366x768) or sometimes WSXGA (1440x900) or maybe WUXGA (1920x1200 or a cropped 1920x1080).
    I am looking at the TV manual, and the highest resolution the VGA port can take is 1360x768 WXGA. It has a big table of supported resolutions; this is the highest, but there are also some comments just below, which I shall type out:

    "This TV's PC input does not support Sync on green or Composite Sync."
    "This TV's PC input does not support interlaced signals"
    "For the best picture quality, it is recommended to use the signals (boldfaced) in the above chart with a 60 Hz vertical frequency from a personal computer. In plug and play, signals with a 60 Hz vertical frequency will be selected automatically"
    Originally Posted by edDV
    If the above is confusing, it just shows that computer monitors obey one set of rules and HDTV sets behave differently.

    What I have just typed about the VGA port may be a bit irrelevant. My decision still lies in whether I buy a graphics card and just connect it to the VGA port; the resolution it handles is not really high, but would you say it's still fairly good quality? Or should I go for a full-blown card that has HDMI out, since the TV can handle 1080p input? One final question: if HDMI out on the graphics card is the way forward, why are there hardly any HDMI cards on the market? Should I just go with DVI out from the card and then through a DVI-HDMI adaptor, and will this affect the quality?

    Sorry about all these questions; I hope I have given you enough to go on. I really appreciate all the help everyone has given me on this.
  6. I run an Nvidia 6800 with 256 MB on a PCI-Express bus. My video card delivers a true 1080i HD signal when you choose the alternate video resolution tab in the Nvidia control panel.

    I am running it on a Philips 50" Plasma HD using an HDMI cable and a Philips brand DVI-HDMI adaptor (13.00).

    There is no, I repeat, NO flicker, tearing or distortion of any kind. It is a true quality experience, and I still can't get over how amazing it is to play Battlefield 2 on a Plasma screen that big.

    Nvidia provides many resolution choices in interlaced and progressive configurations. They also allow you to trim the signal at the screen's edge, and to shrink or expand it as well.

    If your setup ends up giving you overscan or flicker problems, they have adjustments in the panel for that as well. I never had to use them, but they are there.

    If I was forced to find any problem at all, it would be that when the desktop is centered on the screen, you lose a bit of the desktop on both sides. Just enough to notice it and not enough to adjust for.

    It rocks. Just remember: if you have an LCD screen, tearing is possible whenever anything on the screen moves rapidly, like in a first-person shooter.

    Plasma doesn't have that handicap, fortunately, but with plasma screens this big and vivid, it doesn't take very long at all to get image burn if you leave it on and there is nothing moving in the picture. A scary thought, considering how much it cost.

    HDMI is not being marketed on video cards because it is not what HDMI was designed for. HDMI is intended to stop piracy of rented video over cable and satellite disc services as well.

    It was never intended for anything but HDTV, between the HD decoder and the monitor, the HD decoder being a cable box, a DVR or an HD DVD player.

    Besides, HDMI is video and audio all in one. PCs have never followed that concept, and engineers rarely specify standards that cost a lot more to include and must be halfway disabled to boot.

    DVI just makes more sense for PC use: cheaper, easier to upgrade video and audio separately, and even a tougher connection point at the cable ends. Those HDMI cable ends are not very tough and pull out easily.

    Companies that license HDMI, and have to figure out how to recover all the research money involved, would find it insane to use it on equipment that can't justify the extra expense for nil performance gain.

    Use the adapter and a good digital audio cable for surround-sound output from your audio card. It is the best of both worlds, and it won't take a bank loan to get set up. Enjoy!

    Winter
  7. Member edDV's Avatar
    Join Date: Mar 2004
    Location: Northern California, USA
    A quick note about TV programming, games and computer GUI. This may be a source of some confusion.

    1. TV programming (and movies) always assumes some overscan. The edges are almost always ragged and need to be hidden.

    CRT TV sets usually overscan about 10%; LCDs/plasmas are more in the 3-7% range. The picture on the right exaggerates VCR jitter but does show vertical interval data pulses and head switching near the bottom of the picture.
    http://en.wikipedia.org/wiki/Overscan



    2. TV graphics, guide menus and game menus assume heavy CRT overscan. They are never placed near the frame edges.

    3. Wide movie aspect ratios are often side cropped beyond normal overscan to give more vertical height.

    4. Computer desktop GUIs use all the frame space. Menus are placed around the edges. Computer monitors have sizing controls to adjust the display for zero overscan; TV sets usually don't. Overscan is usually a fixed amount for analog and HDMI inputs.
  8. Member edDV's Avatar
    Join Date: Mar 2004
    Location: Northern California, USA
    Originally Posted by arn2153
    ...
    I'm assuming you are referring to the HDMI port on the TV - if so the TV can handle 1920x1080P - so surely no downscale will take place? but even if this is the case.....If the source material is just usual video rubbish and not even anywhere near the resolution the TV can handle, i.e. random downloaded files from bearshare etc. surely it will have to upscale the source?? which is irrelevant anyway as you can't turn an average quality source into a high res output?
    If a computer is used to feed the TV, upscale will occur in the computer display card. If the card is set to output 1920x1080p, that is what is sent to the HDTV HDMI port. The HDTV will receive 1920x1080 but then expand it ~5% to ~2016x1134 and then display the center 1920x1080 of that. This is called overscan. If you were sending a windows desktop as video, the lower menu and other edges will now be off screen. For video or movie source you won't notice much difference.
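    The expand-then-crop behaviour described above can be worked through numerically. This is just an illustration of the ~5% figure quoted in this post, not a claim about any specific set:

    ```python
    # Work through the overscan numbers above: the set scales a 1920x1080
    # frame up by ~5% and then displays only the centre 1920x1080 of the
    # enlarged result, cropping equal margins on each side.

    def overscan_crop(width, height, overscan=0.05):
        scaled_w = round(width * (1 + overscan))   # enlarged width
        scaled_h = round(height * (1 + overscan))  # enlarged height
        lost_x = (scaled_w - width) // 2           # pixels cropped per side
        lost_y = (scaled_h - height) // 2          # pixels cropped top/bottom
        return scaled_w, scaled_h, lost_x, lost_y

    print(overscan_crop(1920, 1080))  # (2016, 1134, 48, 27)
    ```

    So a 1920x1080 frame becomes ~2016x1134 internally and roughly 48 pixels on each side and 27 on top and bottom fall off screen, which is exactly where a desktop's taskbar and edge menus live.
    
    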

    Originally Posted by arn2153
    ...
    again, the HDMI can handle 1080P - would this be an issue if the source is alot lower than this anyway?
    No. If you send 1280x720p to the TV, it will overscan that ~5% as well. Same with 720x576p DVD input.

    Originally Posted by arn2153
    ...
    The TV has a standard VGA input with a little audio input right next to it, is this what you are referring to?
    The TV makers call this the "PC or computer or game port" and list the computer resolutions supported. 1366x768, 1280x768 and 1024x768 are typical. They usually don't overscan on this port.

    Originally Posted by arn2153
    ...
    I am looking at the TV manual and the highest res the VGA port can take is 1360x768 WXGA - it has a big table of supported res's, this is the highest but also there are some comments just below - I shall type them....

    "This TV's PC input does not support Sync on green or Composite Sync."
    "This TV's PC input does not support interlaced signals"
    "For the best picture quality, it is recommended to use the signals (boldfaced) in the above chart with a 60 Hz vertical frequency from a personal computer. In plug and play, signals with a 60 Hz vertical frequency will be selected automatically"
    If the above is confusing, it just shows that computer monitors obey one set of rules and HDTV sets behave differently.
    So they are saying VGA from a PC or XBox360 will work at the suggested resolutions.
  9. Member edDV's Avatar
    Join Date: Mar 2004
    Location: Northern California, USA
    Originally Posted by arn2153
    ...
    May be a bit irrelevant what I have just typed in regarding the VGA port - my decision still lies about whether I buy a graphics card that I just connect to the VGA port, as the resolution it handles is not really high, but would you say its still fairly good quality? or should I go for a full blown card that has HDMI out and go with that as it can handle 1080P input - one final question, if HDMI out on the graphics card is the way forward - why are there hardly any HDMI cards on the market, should I just go down the DVI-out from the card and then through a DVI-HDMI adaptor...will this affect the quality?
    The VGA port is the easy way to go. Most current cards have VGA and DVI-D out.

    1080p over DVI-D is possible on some. A DVI-D to HDMI cable can be used. If you get the newest "AVIVO" or "PureVideo" cards, they will have provision to shrink the desktop back down to counter the overscan at the TV. The issue is whether, after all the downscaling and upscaling, the resulting picture would be any sharper than an unscaled 1366x768.

    HDMI is not likely to replace DVI on computer cards. It looks like that will be the "Display Port". Yet another cable to buy.
    http://forum.videohelp.com/viewtopic.php?t=326971&highlight=


    PS: If my attempted explanation is still unclear, this guy did a good job and included pictures.
    http://www.highdefinitionblog.com/?page_id=127
  10. Member oz_surfer's Avatar
    Join Date: Jul 2006
    Location: Australia
    I have a Sapphire X1600PRO HDMI card (ATI)

    WOW!!!
  11. Hello

    Hopefully someone from this original post is still around. I have a KDL-40W2000 also and am trying to send through a PC signal to the HDMI IN 4 port using a DVI to HDMI adaptor.

    I am getting absolutely no signal through to the TV, not even a flicker.

    Can arn2153 confirm how he got on?
  12. Member
    Join Date: Mar 2003
    Location: Los Angeles
    I too am having similar problems. I just bought this:

    http://www.samsung.com/Products/TV/LCDTV/LNS2651DXXAA.asp

    along with this:

    http://www.newegg.com/Product/Product.aspx?Item=N82E16814130056

    I'm currently using my LCD's VGA connector, which looks alright at its max resolution of 1360x768, except a lot of things such as AIM boxes and Winamp come out pretty big. Then I read about people going DVI from their video card to the LCD's HDMI using a DVI to HDMI adapter, so I picked one up and tested it out. I get a signal, and Windows fills the screen quite well with not too much appearing off screen. However, it looks like crap, sort of like what a video game system looks like plugged into a computer TV tuner using S-Video (if you know what that looks like). Am I doing something wrong? I tried to follow this but had no luck:

    http://hardforum.com/showpost.php?p=1030790446&postcount=15

    Thanks for any responses.
    Contact Email Syphic@gmail.com
  13. Member
    Join Date: Mar 2003
    Location: Los Angeles
    Anyone? Some advice?
  14. Member edDV's Avatar
    Join Date: Mar 2004
    Location: Northern California, USA
    I don't have an NVidia Purevideo card so I'm all eyes and ears. I like the pictures in that link. It helps the explanation.
    Recommends: Kiva.org - Loans that change lives.
    http://www.kiva.org/about
  15. Member
    Join Date: Mar 2003
    Location: Los Angeles
    Yeah, I've checked everywhere for help but can't seem to find any answers. It's all confusing. I'm gonna try the AVS Forums.
  16. Hello

    Thought I would update this. I found an option in the ATI control panel:

    Reduce DVI frequency on high-resolution displays

    As soon as this was deselected, the TV was able to display a full 1080p image. I could then set the TV to the Full Pixel option to remove overscan, and the image is great.

    I had a few issues with lines flickering, but I think that was a loose docking station connection. It was early this morning, so I have additional testing to carry out tonight.

    I would like to know exactly what that option has done, and I am searching the web to find out.
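    For what it's worth, the arithmetic behind DVI frequency limits is easy to check. The timing totals below (2200 x 1125 at 60 Hz) are the standard CEA-861 figures for 1080p60, and 165 MHz is the single-link DVI ceiling, so full-rate 1080p60 does fit down a single link; this is only a sketch of the numbers involved, not an explanation of exactly what that ATI option toggles:

    ```python
    # Pixel-clock sanity check for 1080p60 over single-link DVI.
    # Totals include blanking: 2200 x 1125 at 60 Hz (CEA-861 1080p60).

    def pixel_clock_mhz(h_total, v_total, refresh_hz):
        """Pixel clock in MHz for the given total timing and refresh rate."""
        return h_total * v_total * refresh_hz / 1e6

    clock = pixel_clock_mhz(2200, 1125, 60)
    print(clock)           # 148.5 MHz
    print(clock <= 165.0)  # True: within the single-link DVI limit
    ```

    One reading (an assumption, not confirmed by ATI's documentation) is that the "reduce frequency" option substitutes lower-pixel-clock timings that some HDTVs will not sync to, which would explain why deselecting it restored the picture.
    
    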
  17. Member
    Join Date: Mar 2008
    Location: United Kingdom
    I don't understand why so many people are experiencing problems connecting their PCs to their HDTVs and not getting a perfect picture.

    I even connected my TV (Toshiba 42X3030, 1080p) to my old PC, which has an NVIDIA RIVA TNT2 Model 64/Model 64 Pro graphics card. Now that's an old graphics card, and I still get a perfect picture without trying to resize anything: no overscan problems, nothing. I'm using a simple old VGA cable because my TV has a VGA port, so for someone to say it's not possible to get monitor-quality picture on your HDTV is simply not true.
  18. Member edDV's Avatar
    Join Date: Mar 2004
    Location: Northern California, USA
    Originally Posted by t_def
    i don't understand why so many people are experiencing problems connecting their pc's to their htv's and not getting a perfect picture??

    i even connected my t.v (toshiba 42x3030 -1080p) to my old pc which has a NVIDIA RIVA TNT2 Model 64/Model 64 Pro graphics card. now thats an old graphics card and still get a perfect picture without trying to resize anything, no overscan problems nothing. I'm using a simple ol vga cable because my t.v has a vga port so for someone to say its not possible to receive monitor quality picture on your hdtv is simply not true.
    You see it as simple because you are using the VGA port and not HDMI or analog component which have the overscan issues. If your Toshiba accepts 1920x1080 over the VGA port then you are set. Not all 1080p sets accept 1920x1080p over VGA. That only happened in the latest generation. Note that this thread dates back to April last year.
  19. Member
    Join Date: Feb 2006
    Location: Emsworth, Hampshire, UK
    I very successfully output 1920x1080p through my Radeon HD 2600 Pro using ATI's DVI-HDMI dongle, which gives you sound via HDMI as well, although I usually take sound out of my sound card over S/PDIF.
  20. I connect my Samsung LNT-4665 to my nVidia 8600GT with a DVI-to-HDMI cable. The video card is set to 1920x1080 at 60p and the TV is set to "Just Scan". The display is absolutely perfect: pixel-for-pixel with no overscan.