VideoHelp Forum

Results 1 to 17 of 17
  1. I've converted an old Windows 10 Pro desktop computer to a living room HTPC, plugging it into my old Samsung 1080p TV (nothing special, really) through an HDMI 1.3 cable - rated at 1080p (not 4K, I don't need that), and the image looks like CRAP! The cable also has audio, which works fine. I had zero visual issues on any computer monitor, regardless of cable type. I had a cropping issue but I fixed that on the TV. I'm not seeing how to fix the blurry issue on the TV.

    I'm also not seeing how to fix it in Windows... it's this strange effect where text looks thin and has ghosting/shadowing, which makes it hard to read. I have text increased in Windows settings to 125%, which I also had on my previous HTPC (both at 1920x1080p). Everything looked fine on the old PC with just VGA.

    The colors look fine, but images and video look too pixelated. Nothing I changed in Windows settings or Intel graphics properties fixed it (and there's no attached 3D card, just an integrated 256MB video chip).

    The current computer is a Dell Inspiron 3647, CPU is Intel Core i5-4150, RAM is 4GB (soon to be 8), 64-bit; the graphics are Intel HD Graphics 4400. All drivers and Windows are updated. The old PC was a Core 2 Duo (also Windows 10 Pro 64-bit)...

    What do you think? Could it be the HDMI cable? I have other cables but HDMI 1.3 looked adequate, I'm not sure why I'd need a higher-rated one.
  2. Member Cornucopia's Avatar
    Join Date
    Oct 2001
    Location
    Deep in the Heart of Texas
    Search PM
    Not the cable - that would either work or not work.

    Pretty sure this is a bad combo of screen size settings. You should have the PC set to 1080p at 100% scaling, and then the main thing is to have 1:1 pixel mapping on the TV's zoom level. This goes by a bunch of different names (dot by dot, just scan, native, ...).

    Scott
  3. Funny, I just found the button combination on the TV to make the text look good again...

    So, I had to go into Menu, then change the name of the HDMI source (input?) to PC, and set the previous "PC" name back to "----" (I don't know what the "PC" label means on this TV, but maybe it just refers to the VGA input).
  4. Originally Posted by Cornucopia View Post
    You should have the PC set to 1080p at 100% scaling
    That didn't make any difference; as I mentioned, I increased it to 125%, which didn't fix it either.

    I left my solution above.
  5. Member Ennio's Avatar
    Join Date
    May 2005
    Location
    Netherlands
    Search Comp PM
    My experience with Samsung TVs is that they tend to have the overscan function on by default, which indeed screws up PC display horribly. I still use a small Samsung 1080p TV as an extra monitor; this one has a dedicated HDMI input for PC.
    I think in your case, renaming the input to "PC" disables overscan and - like Scott mentioned - 1:1 pixel mapping will be applied.
  6. Member
    Join Date
    Aug 2006
    Location
    United States
    Search Comp PM
    The manual for my 32-inch 2015 1080p Samsung TVs says to set the HDMI input's device type to "PC" when connecting a personal computer using an HDMI cable. My smaller 2012 1080p Samsung TV/Monitor has a specific HDMI connection to be used for connecting a PC. The connected computer is set to output 1080p resolution. I made no adjustments using the remote control key that controls zoom, stretch, or aspect ratio. The text displayed on the screen seems fine.
    Last edited by usually_quiet; 1st Aug 2022 at 23:36. Reason: fix an editing error
    Yes, setting the input type to PC (or, on TVs that have one, using the dedicated PC input) will automatically enable pixel-for-pixel mapping (aka disable overscan emulation). But you can usually enable pixel-for-pixel mapping on any HDMI input. On the old Samsung I have it's called "Just Scan"; other manufacturers use different names. And make sure the PC is set to the TV's native resolution, i.e. 1080p output from the graphics card for a 1080p display.
    Last edited by jagabo; 1st Aug 2022 at 12:40.
  8. I'm a Super Moderator johns0's Avatar
    Join Date
    Jun 2002
    Location
    canada
    Search Comp PM
    One thing you need to know is that TVs have a lower DPI than monitors, so going from a computer screen to a TV you will see some blur.
    I think,therefore i am a hamster.
  9. Member Ennio's Avatar
    Join Date
    May 2005
    Location
    Netherlands
    Search Comp PM
    After all these years, I'm still a bit confused about what to think of "TV and PC color range". Surely one of you guys can comment on this.

    Connecting a PC to a TV, is there some color conversion taking place - anywhere? I have always guessed not, because in the end a TV (or PC monitor, for that matter) has to process the offered data within its own dynamics. But, to be able to do so properly, I thought that a TV does need to know - by headers, flags or whatever - within which range the source was encoded.
    I must admit I have always doubted whether I'm right about this.

    Hopefully someone cares to confirm or correct me?
  10. Video Restorer lordsmurf's Avatar
    Join Date
    Jun 2003
    Location
    dFAQ.us/lordsmurf
    Search Comp PM
    Yes, PC>TV involves a colorspace conversion, from RGB to YUV.
    Some TVs compensate, some do not.
  11. Member Ennio's Avatar
    Join Date
    May 2005
    Location
    Netherlands
    Search Comp PM
    Thank you, lordsmurf. I'm still not sure what RGB --> YUV for television would mean for the color range. As I understand it - please do correct me if I'm wrong - both RGB and YUV can be full range or TV range (for 8 bit, 0-255 vs 16-235). With a PC hooked up to a TV, is full range RGB converted to full range YUV, or is it rescaled within "TV range" YUV? Hope I'm making sense with this.
  12. Digital computer monitors are traditionally full range RGB. R=G=B=0 is black, R=G=B=255 is white. All other colors are between those two extremes with RGB values between 0 and 255.

    Digital TV signals (and storage) are traditionally limited range YCbCr, often called YUV (though technically it's not exactly the same thing). This is a rotation of the RGB colorspace with a greyscale (Y) axis, and two color components (Cb and Cr) that represent colors that are added or subtracted from the greyscale value:

    [Image: the full-range RGB cube nested inside the YCbCr cube]

    https://www.intel.com/content/www/us/en/develop/documentation/ipp-dev-reference/top/vo...or-models.html

    The inner cube in that image is the full range RGB cube. The outer cube is the full range YCbCr cube. The black corner of the RGB cube is near the bottom center of the YCbCr cube at Y=16, Cb=128, Cr=128. The white corner of the RGB cube is near the top center at Y=235, Cb=128, Cr=128. The minimum Cb and Cr values that fall within the RGB cube (ie, map to legal RGB values) are 16 (the yellow corner, for example). The max that maps to legal RGB values is 240 (the blue corner, for example). Any YCbCr combination that falls outside the inner cube is not a valid RGB color; the values inside it are the limited range YCbCr values that map to valid RGB colors. Note that only about 1/6 of the possible YUV combinations lead to valid RGB colors.

    Full range YCbCr would have the RGB cube slightly larger -- so that the corners touch the edges of the YCbCr cube (0 and 255). Full range YCbCr was usually limited to editing, not for distribution.

    What ultimately shows up on the screen is RGB (with some exceptions you don't need to worry about unless you are developing TV screens), and most media files are internally limited range YCbCr (DVD, Blu-ray, Youtube, Netflix, etc.). So somewhere along the line that YCbCr must be converted to RGB so you can see it (your eyes can't see YCbCr). That can happen in the media player, in the graphics card, or within the TV.

    These days computer graphics cards can be set to output RGB or YCbCr, either limited range or full range. Most monitors and TVs can be set to receive RGB or YCbCr, either limited range or full range. So you need to configure both to get a correct image.

    And we are also moving beyond the traditional 8 bit RGB/YCbCr to higher bit depths, high dynamic range video, different rotations, etc.
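    To make those ranges concrete, here is a minimal sketch (assuming 8-bit BT.601 matrix coefficients; the helper name is illustrative, not from any particular library) of the full-range RGB to limited-range YCbCr mapping described above:

```python
def rgb_to_ycbcr(r, g, b):
    """Full-range 8-bit RGB -> limited-range 8-bit YCbCr (BT.601 coefficients)."""
    y  =  16 + ( 65.481 * r + 128.553 * g +  24.966 * b) / 255
    cb = 128 + (-37.797 * r -  74.203 * g + 112.000 * b) / 255
    cr = 128 + (112.000 * r -  93.786 * g -  18.214 * b) / 255
    return round(y), round(cb), round(cr)

# Black and white land on the limited-range luma extremes:
print(rgb_to_ycbcr(0, 0, 0))        # (16, 128, 128)
print(rgb_to_ycbcr(255, 255, 255))  # (235, 128, 128)
# The yellow and blue corners hit the chroma extremes 16 and 240:
print(rgb_to_ycbcr(255, 255, 0))    # (210, 16, 146)
print(rgb_to_ycbcr(0, 0, 255))      # (41, 240, 110)
```

    Every output stays inside 16-235 (Y) and 16-240 (Cb/Cr), which is exactly the "inner cube" picture: all of full-range RGB fits inside the limited-range slice of the YCbCr cube.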
  13. Member Ennio's Avatar
    Join Date
    May 2005
    Location
    Netherlands
    Search Comp PM
    I'll have to read this post a couple more times to let it sink in. I do understand it, and the picture absolutely helps explain it.
    It just feels totally new and a bit difficult to instantly swallow. I've always thought of RGB as the upper realm of all possible colors. Learning that valid RGB colors are "a part of" the YCbCr colors is somewhat weird; it feels upside-down and will take some time to be fully embraced.

    Thank you for this revealing explanation, jagabo.
    Keep in mind that the axes of both coordinate systems extend from -infinity to +infinity, so they both define an infinite number of colors. The range of 0 to 255 is partly a matter of practicality and compromise (storage space, sufficient number of colors, headroom for overshoots, etc.).
  15. Member Ennio's Avatar
    Join Date
    May 2005
    Location
    Netherlands
    Search Comp PM
    Yes, it became clear that what the axes represent runs infinitely, and that a given depth of 8 bits makes for the 0-255 cubes. For 10 bit, I can imagine a similar picture where the numbers would run 0-1023.
    Your link holds valuable basics on the different "color models", as they are called. These also take more than one reading, for me.

    Thanks again.
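    For what it's worth, the limited-range bounds scale with bit depth in the way you'd guess: the 10-bit standards simply shift the 8-bit code values left by the extra bits. A small sketch of that (the helper name is illustrative):

```python
def limited_range(bits):
    """Limited ("TV") range bounds for a given bit depth, scaled up
    from the 8-bit values 16-235 (luma) and 16-240 (chroma)."""
    scale = 1 << (bits - 8)  # left-shift by the extra bits
    return {"Y": (16 * scale, 235 * scale), "C": (16 * scale, 240 * scale)}

print(limited_range(8))   # {'Y': (16, 235), 'C': (16, 240)}
print(limited_range(10))  # {'Y': (64, 940), 'C': (64, 960)}
```

    So 10-bit limited range runs 64-940 for luma and 64-960 for chroma, while full range would use the whole 0-1023.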
  16. You might also consider adjusting the sharpness and contrast.
  17. Member Ennio's Avatar
    Join Date
    May 2005
    Location
    Netherlands
    Search Comp PM
    Thank you.


