VideoHelp Forum

  1. Member
    Join Date: Apr 2005
    Location: Macedonia
    I use a standard VGA connection to connect my PC to a Sony Bravia 32V2000 LCD and the picture is fine, at resolutions up to 1366x768.

    But recently I got a new graphics card, an ATI HD 2400 Pro, so I wanted to try out the DVI-HDMI connection using a (budget) cable (no adapters). The PC properly identified the LCD as a Sony TV. There was a picture, but it was not very good. I don't know how to explain it; it was hard to watch, the icons seemed "fuzzy", and there may have been some blurriness, flickering, etc.
    I tried all the available resolutions (my LCD supports up to 1080i) but they all looked bad. Beforehand, I disconnected the VGA cable to avoid possible signal interference. I also disconnected my analog TV antenna. It didn't help.

    Then I downloaded the latest drivers for my card, and now the picture looks slightly better and more stable, but it still looks far worse than what I get with VGA. Graphics and movies look OK, but text looks awful: it's slightly blurred and hard to read.

    Is it because of the cheap DVI-HDMI cable? Should I pay more for a better cable, or should I move back to the good old VGA? I remember having similar problems with a cheap VGA cable in the past, but that was solved after buying a better one with a ferrite core.

    Also, another question: what are the benefits of using HDMI vs. VGA? OK, it supports higher resolutions, but, for example, if I play a standard DVD or DivX through it, will it be "upscaled" to HD quality? Will it look better than over VGA?

    Thanks
  2. Baldrick (MEGA Super Moderator)
    Join Date: Aug 2000
    Location: Sweden
    I guess it's because of pixel mapping; see http://pixelmapping.wikispaces.com/Pixel+mapping+explained . You should try using the 1366x768 resolution, but it may not be supported over DVI/HDMI.
  3. What resolutions are you able to use over HDMI? Only the usual video resolutions, 1280x720 and 1920x1080? That will cause the pixel mapping problem Baldrick refers to. The HDTV will have to rescale those to its native resolution (plus overscan).

    If you can't drive your HDTV at its native resolution (and without overscan) via HDMI you should stick with VGA.
  4. Member
    Join Date: Apr 2005
    Location: Macedonia
    Thank you both for your replies. Baldrick, I checked the link, but that's "higher science" for me; thanks anyway, I'll re-read it and hopefully I will understand.

    Jagabo, I tried different resolutions; not only 1280x720 and 1920x1080 were available. I also tried 1366x768 (which I use with VGA) through HDMI. The picture was there, overscanned, but it didn't look good compared to what I get with VGA. As I mentioned before, it was hard to watch; the text was slightly blurry (it had a sort of "doubling" or "ghosting", whatever you call that, as if I were reading it with bad glasses), etc.

    To make things worse, now I'm starting to notice some other graphics card problems too (unrelated to HDMI), which I will discuss in another topic: error messages, sometimes the picture doesn't appear after Windows boots up, sometimes it's fine, etc. Who knows what this is?! Ghosts?
  5. Originally Posted by vanjo
    I tried different resolutions; not only 1280x720 and 1920x1080 were available. I also tried 1366x768 (which I use with VGA) through HDMI. The picture was there, overscanned, but it didn't look good compared to what I get with VGA. As I mentioned before, it was hard to watch; the text was slightly blurry (it had a sort of "doubling" or "ghosting", whatever you call that, as if I were reading it with bad glasses), etc.
    You are seeing the consequences of your HDTV resizing the HDMI frame for overscan. Even though you are sending a 1366x768 image to the HDTV, it is enlarging the image to about 1400x800 and then displaying the inner 1366x768 pixels. This removes the overscan area (which often contains noise and other junk you aren't supposed to see).

    When fed that same 1366x768 frame via the VGA cable, the HDTV probably isn't resizing the frame for overscan. With computers you generally need to see the entire frame; it would be hard to operate Windows if the taskbar wasn't visible, for example.

    So HDMI doesn't get you 1:1 pixel mapping, whereas VGA does. 1:1 pixel mapping is critical for computer displays because you are working with very sharp signals -- things like small text, checkerboard fill patterns, etc. Normal video doesn't contain such sharp detail. Actually, check your HDTV's setup menu -- some sets do have an option to use 1:1 pixel mapping on the HDMI port.
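    As a rough sketch of that arithmetic (not the TV's actual firmware; the overscan fraction below is only an assumed, typical figure), the overscan path amounts to something like this:

    Code:
    # Rough sketch of what an overscanning HDTV does to a 1366x768 HDMI frame.
    # The 2.5% overscan figure is an assumption for illustration only.
    src_w, src_h = 1366, 768
    overscan = 0.025                             # assumed overscan fraction

    # Step 1: the TV enlarges the frame so its edges fall off the panel.
    scaled_w = round(src_w * (1 + overscan))     # about 1400
    scaled_h = round(src_h * (1 + overscan))     # about 787

    # Step 2: only the central window at the panel's resolution is shown.
    crop_x = (scaled_w - src_w) // 2
    crop_y = (scaled_h - src_h) // 2
    print(f"1366x768 is rescaled to {scaled_w}x{scaled_h}, then the inner "
          f"1366x768 window starting at ({crop_x},{crop_y}) is displayed.")

    Every pixel gets resampled on its way to the panel, which is why fine detail like text goes soft even though the resolution numbers match.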

    Here's a simple explanation of why 1:1 pixel mapping is critical for computer displays:

    Consider a source of four pixels. Each pixel is represented by a value from 0 to 100. 0 is black, 100 is white, all the values in between are different shades of gray:

    0 100 0 100

    You have a black pixel, a white pixel, another black pixel, and another white pixel; you have two white peaks and two black valleys, all of equal width. When drawn 1:1 on an LCD monitor, that's exactly what you will see: a black dot, a white dot, a black dot, a white dot.

    Let's resize this four-pixel image for a five-pixel display. We must convert our four numbers into five in order to put them on the screen. We want to maintain two peaks and two valleys, all of equal width. How would you propose doing this? Maybe you would just add a black pixel to the end:

    0 100 0 100 0

    But that wouldn't be resizing; you'd just be adding a black border to our little picture. If you did this over a full 1280-pixel-wide display, by adding a whole bunch of black pixels to the right edge, you wouldn't be using the entire display. Or, if you repeated this operation for every group of four pixels, you would get a pattern like this:

    0 100 0 100 0 0 100 0 100 0 0 100 0 100...

    You no longer have two equal sized peaks and two equal sized valleys. One of the valleys is twice as wide as the other. This would be obvious on the screen.

    OK, so maybe adding a white pixel to the right side will work better:

    0 100 0 100 100

    Over the entire display:

    0 100 0 100 100 0 100 0 100 100 0 100 0...

    We have a similar problem: some of the white peaks are now twice as wide. No matter which pixel you choose to duplicate, you will have a visible artifact.
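    You can check that quickly by tiling each padded group across a row and counting the run lengths (a throwaway snippet, purely for illustration):

    Code:
    from itertools import groupby

    # Tile each 5-sample "pad one pixel" guess across a row and measure how
    # wide each black (0) and white (100) run becomes.
    for name, group in [("pad black", [0, 100, 0, 100, 0]),
                        ("pad white", [0, 100, 0, 100, 100])]:
        row = group * 4
        runs = [(value, len(list(g))) for value, g in groupby(row)]
        print(name, runs)

    Both printouts show runs of width 2 mixed in with runs of width 1 -- the widened valley or peak described above.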

    Maybe next you would come up with the idea of adding 50 to the end:

    0 100 0 100 50

    repeated:

    0 100 0 100 50 0 100 0 100 50 0 100 0 100 50

    Now you have an indistinct transition between some of the pixels. Viewed on the screen you would see fuzzy artifacts.

    There are other techniques where groups of pixels are averaged together with sophisticated algorithms that try to minimize visual artifacts. But you will never be able to turn those four pixels into five while maintaining two equally sized peaks and two equally sized valleys. This is a fundamental problem in digital imaging: There is no perfect way to scale digital data.
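    To put some numbers on that, here is a small Python sketch (pure illustration, no external libraries; the two functions are crude stand-ins for what a TV's scaler chip does) that stretches the four-sample pattern to five samples:

    Code:
    # Resampling the 4-sample pattern 0,100,0,100 to 5 samples.
    # Both simple methods distort the equal-width peaks and valleys.
    src = [0, 100, 0, 100]

    def nearest(samples, new_len):
        """Duplicate the nearest source sample (like repeating a pixel)."""
        n = len(samples)
        return [samples[min(n - 1, int(i * n / new_len))] for i in range(new_len)]

    def linear(samples, new_len):
        """Blend neighbouring source samples (a simple averaging filter)."""
        n = len(samples)
        out = []
        for i in range(new_len):
            pos = i * (n - 1) / (new_len - 1)   # map output index into the source
            lo = int(pos)
            hi = min(lo + 1, n - 1)
            frac = pos - lo
            out.append(round(samples[lo] * (1 - frac) + samples[hi] * frac))
        return out

    print("nearest:", nearest(src, 5))   # [0, 0, 100, 0, 100] - a doubled black valley
    print("linear :", linear(src, 5))    # [0, 75, 50, 25, 100] - gray "fuzz"

    The nearest-neighbour result widens one of the black valleys, and the linear result replaces the crisp black/white transitions with in-between grays -- exactly the widened bars and fuzzy edges described above.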

    My HDTV is a 1080p model which has the option of 1:1 pixel mapping or regular 16:9 overscan. Here are some example close-up photos of the screen displaying single-pixel-wide black and white stripes, photographed with a digital camera, in 1:1 mode (top) and with overscan (bottom):

    [screen photos: 1:1 pixel mapping (top), overscan (bottom)]
    The pictures aren't real clear (I was just holding the camera up close to the TV) but you can see every line in the 1:1 image. There is obvious artifacting in the overscanned image.


