VideoHelp Forum
  1. Member
    Join Date: Jan 2008
    Location: Brazil
    Hi there guys... been a while since I last posted here. I'm about to purchase a Smart TV (probably 42"), but I'm kinda worried since I've come to realize that probably all of them lack a DVI video input for PCs. I have an LG 23" monitor into which I plug both my PC (via the DVI input) and video game consoles (via the HDMI inputs, obviously).

    Well, after I couldn't find a Smart TV with a DVI input, I thought about plugging the PC into the HDMI input on my 23" monitor just for testing. After solving a problem with overscan (which was messing up the boundaries of the image), I think it's perfectly clear that the image quality of the PC connected to the HDMI port is QUITE INFERIOR to the DVI input, at least in my humble opinion. The colors over HDMI are kinda too bright/strong, and the image on the DVI input is way more crystal clear.

    I thought about testing the RGB PC input, but although I have the VGA adapter, I lack the cable. Is RGB PC image quality the same as DVI's? 'Cause I think that Smart TVs have RGB inputs, not sure though.

    Please forgive my newbieness... I just don't wanna spend money on a Smart TV that will make my PC graphics look like crap. Are there Smart TVs with DVI video inputs out there, and if not, what's the other solution?
  2. Member
    Join Date: Aug 2006
    Location: United States
    You won't find any new TVs with DVI inputs. DVI on TVs was superseded by HDMI several years ago. You may find some with a VGA input, but even that is getting rarer.

    HDMI and DVI are electrically compatible. Many people successfully use a DVI to HDMI cable to connect their PC and TV. If connecting your PC to the TV using HDMI looks bad then most likely the TV and/or PC video card are not configured correctly. Also, many newer TVs have an HDMI port that is designated for connecting a PC. Look at the TVs you are considering and their manuals to find out if such an HDMI port is present. Chances are the TV's manual will also tell you how to set up both the TV and the PC.
    Last edited by usually_quiet; 23rd Dec 2013 at 15:35.
  3. Member
    Join Date: Jan 2008
    Location: Brazil
    Hey, thanks for the input... Well, my video card is brand new, it's a GTX 760. I just plugged its HDMI output into the monitor. I have no idea if it's mandatory to configure anything else. Again, it looks worse than DVI...
  4. Member
    Join Date: Aug 2006
    Location: United States
    Originally Posted by rotchild View Post
    Hey, thanks for the input... Well, my video card is brand new, it's a GTX 760. I just plugged its HDMI output into the monitor. I have no idea if it's mandatory to configure anything else. Again, it looks worse than DVI...
    You need to make sure the video card's output resolution and refresh rate match the monitor's native resolution and recommended frequency (usually 60Hz). There are also usually some color adjustments that can be applied using the video card's user interface. Control Panel -> Appearance and Personalization -> Adjust Screen Resolution -> Advanced Settings will give you access to the necessary controls from Windows and the video card's user interface.

    However, whatever TV you buy will be different from your monitor so perhaps you should wait until you actually connect the PC and TV before jumping to any conclusions about what the TV's picture will look like.
  5. Make sure the graphics card is set to the native resolution of the TV, for example 1920x1080p60 RGB. Don't use overscan compensation on the computer, disable overscan on the TV. That will give you perfect pixel-for-pixel mapping, just like a computer monitor.
  6. Member
    Join Date: Jan 2008
    Location: Brazil
    Well, I found this... http://www.tomshardware.com/answers/id-1781503/hdmi-dvi-monitor.html

    But I can't seem to find the separate settings he talks about. Maybe I did, but the picture quality still looks the same (HDMI inferior: WAY blurrier and more oversaturated than DVI)
  7. DVI and HDMI are the same signal -- the only difference is the shape of the connector. Some TVs treat DVI differently than HDMI, since DVI will usually be a computer where you want pixel-for-pixel mapping with no processing. But HDMI inputs can usually be configured the same way.
  8. Member
    Join Date: Jan 2008
    Location: Brazil
    Well, the monitor I'm testing on isn't brand new. It's an LG 23" (M237WA). I found a VGA cable and a DVI-VGA adapter, just to test out the monitor's RGB PC input. Picture quality is between the DVI and HDMI (worse than DVI, but better than HDMI). The only problem is that the image extends past the screen at the top and on the left. The bottom and right of the screen are fine...

    Could it be because the monitor isn't exactly brand new (released maybe more than a couple of years ago)... ?

    I'm out of ideas, and almost giving up...
  9. Originally Posted by rotchild View Post
    the image quality of the PC connected to HDMI port is QUITE INFERIOR to the DVI input
    So why don't you just use the DVI input?
  10. Member
    Join Date: Jan 2008
    Location: Brazil
    I am using it... I'm afraid that, when I purchase a new Smart TV, since it has no DVI input, only HDMI and (maybe) VGA, the image quality will plain suck
  11. Originally Posted by rotchild View Post
    I am using it... I'm afraid that, when I purchase a new Smart TV, since it has no DVI input, only HDMI and (maybe) VGA, the image quality will plain suck
    http://forum.videohelp.com/threads/361116-DVI-inputs-on-Smart-TV-s-over-40-inches?p=22...=1#post2290149

    Just get a TV that has proper pixel-for-pixel mapping and the ability to set the correct levels. It's nothing new. Our six year old Samsung HDTV has this. Even our 22" Vizio HDTV in the bedroom has it.
  12. Member
    Join Date: Jan 2008
    Location: Brazil
    OK, thanks again!
  13. My Samsung LN-T4061F LCD from 6 years ago seems to introduce 4:2:0 color subsampling as part of its HDMI processing, regardless of the signal type being sent to it. Hopefully no TVs do that now.
  14. Originally Posted by vaporeon800 View Post
    My Samsung LN-T4061F LCD from 6 years ago seems to introduce 4:2:0 color subsampling as part of its HDMI processing, regardless of the signal type being sent to it. Hopefully no TVs do that now.
    Even with 1080p RGB input? Maybe that generation didn't support 1080p60 RGB? Our 4665 doesn't have that problem. If we set the computer to output 1080i it shows chroma subsampling issues -- but that's because the graphics card uses YCbCr with chroma subsampling in that mode. In any case, almost all new HDTVs support 1080p60 RGB.
  15. Member
    Join Date: Jan 2008
    Location: Brazil
    Interesting... color subsampling? So you say that my monitor (I can't precisely say when it was manufactured, but if not 6 years old, it's most probably almost 4) may be doing the same thing? Hopefully that's it...
  16. If you set the graphics card to RGB mode you shouldn't have chroma subsampling issues.
  17. Member
    Join Date: Aug 2006
    Location: United States
    Originally Posted by rotchild View Post
    Interesting... color subsampling? So you say that my monitor (I can't precisely say when it was manufactured, but if not 6 years old, it's most probably almost 4) may be doing the same thing? Hopefully that's it...
    No. He means Samsung TVs, or LCD TVs in general from 6 years ago, might have that fault, not monitors. You keep confusing LCD PC monitors and LCD TVs. The two may look the same but are not. TVs are normally capable of correctly processing a greater variety of video signals than monitors.
  18. Member
    Join Date: Jan 2008
    Location: Brazil
    Originally Posted by jagabo View Post
    If you set the graphics card to RGB mode you shouldn't have chroma subsampling issues.
    You mean this?

    Attachment: RGB.jpg

    Sorry, but it doesn't change anything; HDMI still looks inferior. Could it be the quality of the HDMI cable?
  19. Member
    Join Date: Aug 2006
    Location: United States
    It could be an HDMI handshake problem, but there is only a small chance that the handshake problem is due to a faulty cable. You could try shutting down your computer, turning off the monitor, re-connecting the HDMI cable and turning everything back on again.
  20. Originally Posted by rotchild View Post
    Originally Posted by jagabo View Post
    If you set the graphics card to RGB mode you shouldn't have chroma subsampling issues.
    You mean this?

    Attachment 22326

    Sorry, but it doesn't change anything on how HDMI is still looking inferior. Can it be the quality of the HDMI cable?
    Yes, I meant that. YCbCr 4:4:4 should be fine too. With chroma subsampling, white text on a black background will still be perfectly sharp, but colored text on a colored background will be fuzzy, because color information is sent at a lower resolution than the greyscale picture.

    Did you set the graphics card to the TV's native resolution? Did you disable overscan compensation on the graphics card? Did you disable overscan on the TV?
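    For anyone who wants to see the fuzziness described above, here's a rough simulation. This is a sketch only, assuming Python with NumPy is available; the conversion coefficients are the standard BT.601 ones, and the 2x2 chroma averaging is a crude stand-in for real 4:2:0 filtering:

```python
import numpy as np

def rgb_to_ycbcr(rgb):
    # Standard BT.601 full-range RGB -> YCbCr conversion.
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.168736 * r - 0.331264 * g + 0.5 * b + 128
    cr =  0.5 * r - 0.418688 * g - 0.081312 * b + 128
    return np.stack([y, cb, cr], axis=-1)

def ycbcr_to_rgb(ycc):
    y, cb, cr = ycc[..., 0], ycc[..., 1] - 128, ycc[..., 2] - 128
    r = y + 1.402 * cr
    g = y - 0.344136 * cb - 0.714136 * cr
    b = y + 1.772 * cb
    return np.stack([r, g, b], axis=-1)

def subsample_420(ycc):
    # Average each chroma plane over 2x2 blocks, then blow it back up
    # (nearest-neighbour). Luma is left at full resolution.
    out = ycc.copy()
    for c in (1, 2):
        ch = ycc[..., c]
        small = ch.reshape(ch.shape[0] // 2, 2, ch.shape[1] // 2, 2).mean(axis=(1, 3))
        out[..., c] = np.repeat(np.repeat(small, 2, axis=0), 2, axis=1)
    return out

# A hard vertical edge: pure red on the left, pure blue on the right.
# Almost all of the detail here lives in the chroma channels.
img = np.zeros((8, 8, 3))
img[:, :3] = [255, 0, 0]
img[:, 3:] = [0, 0, 255]

after = np.clip(ycbcr_to_rgb(subsample_420(rgb_to_ycbcr(img))), 0, 255)
print(np.round(after[0]).astype(int))
```

    The two columns straddling the edge come back as a purple-ish mix instead of pure red and pure blue -- the fuzzy colored edges described above -- while a white-on-black edge (all detail in luma) would survive untouched.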
  21. Member
    Join Date: Aug 2006
    Location: United States
    Originally Posted by jagabo View Post
    Originally Posted by rotchild View Post
    Originally Posted by jagabo View Post
    If you set the graphics card to RGB mode you shouldn't have chroma subsampling issues.
    You mean this?

    Attachment 22326

    Sorry, but it doesn't change anything on how HDMI is still looking inferior. Can it be the quality of the HDMI cable?
    Yes, I meant that. YCbCr 4:4:4 should be fine too. With chroma subsampling white text on a black background will still be perfectly sharp. But colored text on a colored background will be fuzzy. Because color information is sent with lower resolution than the greyscale picture.

    Did you set the graphics card to the TV's native resolution? Did you disable overscan compensation on the graphics card? Did you disable overscan on the TV?
    He's trying to fix a problem with his LCD monitor's HDMI connection at present. He hasn't bought the TV yet.
  22. Member
    Join Date: Jan 2008
    Location: Brazil
    Originally Posted by usually_quiet View Post
    He's trying to fix a problem with his LCD monitor's HDMI connection at present. He hasn't bought the TV yet.
    Precisely.
  23. Member turk690's Avatar
    Join Date: Jul 2003
    Location: ON, Canada
    I have experienced this before. I have an LG M237WA TV that has 2 HDMI inputs, 1 DVI input, and a PC D-sub input. Using any of the TV's HDMI inputs to connect a VGA card with a DVI output through a DVI-to-HDMI cable always produces inferior-looking images compared with just plugging the card's DVI output straight into the TV's DVI input. There seemed to be a built-in association: DVI and analog D-sub = VGA cards, HDMI = Blu-ray players, etc. This was my impression, at least until I tried connecting a professional NEC LCD via DVI-to-HDMI, with the same initial grotty results. I went through the NEC's settings and one of them was for picture gamma; I don't recall specific figures, but the gamma preset for PC output was different from the one for HD video, and custom ones could be assigned. When I set it to PC, the picture became OK, as much as would be expected from a VGA card's output.
    So it's still true that DVI and HDMI are electrically the same, but some settings are associated with one or the other and may or may not be changeable, depending on the monitor. On the LG I had, I noted that overscan could be changed on the HDMI inputs but not on the DVI input (neatly associating HDMI with HD video, and DVI with PC video card output, where overscan is seldom needed or a concern). And unlike the NEC, the LG had no settings for changing gamma on individual inputs, so forcing a DVI-sourced signal (which its designers assume, not entirely correctly, can only come from a VGA card) through the HDMI inputs produces less-than-stellar picture quality.
  24. Member
    Join Date: Aug 2006
    Location: United States
    Originally Posted by rotchild View Post
    Originally Posted by usually_quiet View Post
    He's trying to fix a problem with his LCD monitor's HDMI connection at present. He hasn't bought the TV yet.
    Precisely.
    I looked up the model number M237WA online to try to download the manual. LG only lists a TV/monitor (M237WA-PT), not a plain monitor, for M237WA. A TV/monitor is quite a bit different from a standard computer monitor. They are more like a TV than a monitor. Most, but not all, have a TV tuner. All of them I have seen have analog video and audio inputs and HDMI connections, plus a computer connection, usually VGA but (in your case) DVI. Is that what you have?

    If so, jagabo's suggestions may well be applicable to your situation. Try them. The other thing I have seen is that some TVs/monitors have a PC setting for the HDMI input.

    Unfortunately, I kept getting a server error instead of a download for the PDF manual, so I can't see what the manual says about the HDMI connections.
    Last edited by usually_quiet; 24th Dec 2013 at 00:35.
  25. Member
    Join Date: Jan 2008
    Location: Brazil
    Guys, please go slower... this is a newbie section. My newbieness is so evident, I haven't even heard of a company called NEC Display Solutions before. But whatever... messing about with the gamma settings inside the NVidia control panel kinda showed some progress, but image quality in HDMI is STILL way lower than DVI... I don't know, I guess the correct word is BLURRED. HDMI seems blurred, while DVI seems pixel perfect, ya know? When playing PC games it is more than evident. I wish I could show you guys, but taking snapshots of my monitor won't help at all, I guess.

    @ usually_quiet: Thanks for looking into it. About jagabo's suggestions... I have no idea what the monitor's native resolution is, I guess it's 1920x1080... I can't find anything about overscan compensation on the video card, nor on the monitor

    I still hope that turk's right about it being a manufacturers/designers "fault". I'll try testing it tomorrow in a Sony Bravia 42" (my brother has one), and will tell the results.

    Thanks again guys!
  26. Member
    Join Date: Aug 2006
    Location: United States
    I was able to download a copy of the M237WA's manual in English from a different LG website this morning. The manual says this on page 12: "HDMI Input does not support PC mode. If it is connected PC, the screen may not be displayed properly." So the problem is as turk690 described. Your TV/monitor's HDMI connections do not support PC-mode signals.

    For what it is worth, the M237WA's native resolution is 1920x1080 and its recommended refresh rate is 60Hz.


    My onboard AMD 780G video card does provide some HDTV support settings, although I have no idea whether or not the 1080p signal it outputs using its HDTV setting is actually different from the one that results from using its ordinary monitor output settings. I have not used an NVIDIA video card recently. Does your video card have any HDTV support settings to try?
    Last edited by usually_quiet; 24th Dec 2013 at 07:19.
  27. Most TVs let you set the name that's displayed on-screen when switching between inputs. Options might include HDMI, PC, DVD, Blu-ray, Cable TV, DVR, etc. Processing settings are often tied to that name. So setting the name to "PC" might turn off overscan and give you the correct levels. This probably won't help you since your manual explicitly says PC mode isn't supported on the HDMI inputs.

    Also, turn off all "auto" controls on the TV: auto contrast, auto color, auto skin tone, noise reduction, sharpening, etc. Those all screw up the "perfect" picture coming from a computer graphics card. Use the "just scan" or "pixel-for-pixel" or "1:1" picture mode -- different manufacturers use different names.

    There's usually a setting that controls black/bright levels, often called "high/low", "full/limited", "0-255/16-235", etc. Use an RGB greyscale test pattern to select the setting that delivers RGB=0=black, RGB=255=white. The computer also has levels controls for converting YUV to RGB (when putting out RGB over HDMI), or RGB to YUV (when putting out YUV/YCbCr over HDMI). I usually set the computer to output RGB, then use an RGB test pattern displayed in an image viewer (not a media player, which may muck around with levels) to adjust the TV for blacks and whites.
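    As a concrete illustration of the "full/limited" levels setting, here's a sketch of the arithmetic (not anything a TV actually runs; the 16-235 and 0-255 endpoints are the standard video and PC levels):

```python
def limited_to_full(v):
    # Expand video/studio levels (black=16, white=235) to PC levels (0-255),
    # clamping out-of-range values.
    return max(0, min(255, round((v - 16) * 255 / 219)))

def full_to_limited(v):
    # Compress PC levels (0-255) into video/studio levels (16-235).
    return round(v * 219 / 255) + 16

# Matched conversions map the endpoints correctly:
print(limited_to_full(16), limited_to_full(235))  # 0 255
# A mismatch is what ruins the picture: if the TV expects limited range but
# the PC sends full range, everything from 0-16 displays as one flat black
# and everything from 235-255 as one flat white.
```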
  28. Even when the output/input is RGB it appears most TVs do chroma subsampling internally. This pic should hopefully tell you what your TV is doing.
    http://madshi.net/madVR/ChromaRes.png

    The HDMI and DVI signals are identical, so if the HDMI and DVI inputs produce a different picture, then the TV is doing processing on at least one of them, most likely the HDMI input. You may or may not be able to disable it. In the case of my Samsung TV (which is a couple of years old) there's no DVI input, so I have to connect my PC using a DVI to HDMI cable. Or VGA... as it has a VGA input. The TV always applies some processing to the HDMI input even when all the processing is disabled in Picture Settings. There's only one way to disable it completely and stop the TV from doing its internal chroma subsampling... explained later.

    HDMI/DVI cables either work, or they don't. There's really no "poor picture quality" problem with HDMI/DVI cables. Well, that's not strictly true, but for the purposes of this discussion I'd say it is. If the DVI and HDMI inputs produce different quality video, then it's most likely the TV's fault, because the input signal should be the same. http://www.cnet.com.au/why-all-hdmi-cables-are-the-same-339325216.htm
    I guess it's possible that if a video card has both a DVI and an HDMI output, the card itself might apply different processing to each output by default. While maybe it's not likely (I've never owned such a card), it might be worth checking. I disable all the "picture enhancement" crap in the Nvidia Control Panel.

    In the case of Samsung TVs... mine in particular, as it's the only one I can play with... there are only two ways to avoid the TV's picture processing (although it always adjusts brightness according to the picture it's displaying). One is using the VGA input, in which case the TV basically acts as a monitor. The other is to use the dedicated HDMI/PC input (HDMI 1), but even then... you have to go to the Source menu in the TV and rename the HDMI 1 input as "PC" or "DVI PC", and the input resolution must be 1920x1080@60Hz. If those conditions are met, the TV will then go into "PC mode". The picture settings menu will have many of the usual options greyed out. The "Movie" picture mode will no longer be available and the picture menu looks basically the same as it does when using the VGA input. I'm pretty sure there's no chroma subsampling.
    I don't use "PC mode" myself because I don't use the TV as a monitor (just for displaying video) and I prefer to run it at 50Hz. It'll still let me disable overscanning, and the processing it does still apply (with all processing disabled) is pretty minimal.

    The above is how Samsung TVs work, much of which I only discovered recently myself thanks to the kind poster who started this Doom9 thread. I'd always wondered why HDMI 1 was the "PC input" when it seemed the same as the other HDMI inputs. How other TVs work, I have no idea.

    For me, connecting the PC to the TV via HDMI produces a slightly better picture than VGA does, mainly in respect to fine detail (although the quality of the VGA cable can be a major quality factor when it comes to VGA). I have a Bluray player connected to the same TV via HDMI and with the same TV picture settings for each HDMI input the picture quality is basically the same (there's a very slight difference but I'd be thinking it's on the decoding/upscaling end and nothing the TV is doing).
    Last edited by hello_hello; 24th Dec 2013 at 08:13.
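    If you want to roll your own version of a test image like the madshi.net one linked above, something like this works. This is a sketch only; the output filename is arbitrary, and PPM is used just because it needs no libraries. Displayed 1:1 (pixel-for-pixel mode, no scaling), alternating single-pixel red/blue columns smear toward purple whenever the display path subsamples chroma:

```python
# Generate a 1-pixel-wide red/blue stripe pattern as a binary PPM (P6) file.
WIDTH, HEIGHT = 256, 128
RED, BLUE = (255, 0, 0), (0, 0, 255)

rows = []
for y in range(HEIGHT):
    row = bytearray()
    for x in range(WIDTH):
        # Even columns red, odd columns blue: maximum horizontal chroma detail.
        row += bytes(RED if x % 2 == 0 else BLUE)
    rows.append(bytes(row))

with open("chroma_test.ppm", "wb") as f:
    f.write(b"P6\n%d %d\n255\n" % (WIDTH, HEIGHT))
    f.writelines(rows)
```

    Open the file in an image viewer at 100% zoom; if the stripes look solid purple rather than distinct red and blue columns, chroma resolution is being lost somewhere between the PC and the panel.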
  29. Originally Posted by usually_quiet View Post
    I was able to download a copy the M237WA's manual in English from a different LG website this morning. The manual says this on page 12: "HDMI Input does not support PC mode. If it is connected PC, the screen may not be displayed properly." So the problem is as turk690 described. Your TV/monitor's HDMI connections do not support PC mode signals.
    It sounds like the TV applies processing to the HDMI inputs. It'd probably make a mess of text, hence "the screen may not be displayed properly".
  30. HDMI is backward compatible with DVI.

    DVI does not support YCbCr -- only the RGB color space is supported over DVI.

    An HDMI receiver (a "sink" in HDMI/DVI terminology) automatically detects a DVI source and should switch to:

    exact pixel mapping, color range 0-255

    If the TV behaves incorrectly, that means it is not DVI compliant, and that is an HDMI standard violation too.

    Raise the issue with the TV vendor and request new TV firmware with improved behavior.


