VideoHelp Forum




  1. HAVE EDITED THIS POST: HAD THE RESOLUTION WRONG

    I don't really have a problem, just a question that has me curious. I have an older flat panel I'm using as a spare-room TV. It's HD, but it doesn't have HDMI, only component video (the red, green, blue) inputs. The manual says the native resolution of the panel is 1366 x 768. When I hook my Blu-ray player up to it, the picture is MUCH sharper when the player is set to 1080i. That would normally make sense, but what confuses me is that the native resolution of the TV is only 768 lines to begin with. How is 1080i making things look sharper if my image is never really more than 768 lines anyway? It seems like the interlacing of 1080i would just make the picture look worse, since the TV can't actually display 1080 lines. So why does 1080i look so much better than 720p on a set that's only 768 lines? Any thoughts?
    Last edited by maca; 2nd Mar 2013 at 17:06.
  2. I'm starting to realize this question was probably stupid. When I first posted it, I thought the res of my monitor was only 720. Checking the manual and finding that it's 768, it makes sense that the higher res of 1080 would look sharper. 1080 downsized to 768 would of course look better than 720 blown up to 768. So I think I sort of managed to answer my own question.
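    Just to illustrate the down-vs-up idea (this is only a rough Python/Pillow sketch with a placeholder test frame, not anything a player or TV actually runs), you can fake both paths to a 1366x768 panel and compare the saved results:

    # Rough sketch: downscaling 1080 lines vs upscaling 720 lines to a 1366x768 panel.
    # Assumes Pillow is installed and a 1920x1080 test image "frame_1080.png" exists (placeholder name).
    from PIL import Image

    PANEL = (1366, 768)                      # the TV's native resolution

    src = Image.open("frame_1080.png")       # 1920x1080 source frame

    # Path 1: 1080 -> 768 (downscale only; fine detail is averaged, edges stay crisp)
    down = src.resize(PANEL, Image.LANCZOS)

    # Path 2: 1080 -> 720 in the player, then 720 -> 768 in the TV
    # (detail thrown away at 720 can't be recovered, so the result looks softer)
    up = src.resize((1280, 720), Image.LANCZOS).resize(PANEL, Image.LANCZOS)

    down.save("downscaled_1080_to_768.png")
    up.save("upscaled_720_to_768.png")

    Flipping between the two output files gives a rough idea of why the downscaled path tends to look crisper.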
  3. Cornucopia (Member, joined Oct 2001, Deep in the Heart of Texas)
    Your BD player is playing BDs that are (almost) ALL 1080p24. Your TV is (as you say) 720, and that's probably p60. So to go from one to the other, one device or the other has to do a resize down (1080 -> 720), a telecine/frame-rate conversion (24 -> 30/60), and possibly a deinterlace (not necessary if direct).
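
    In case it helps to picture the telecine step, here's one common way to lay out the 2:3 pulldown cadence (just a toy Python sketch, the frame labels are illustrative): every 4 film frames become 10 fields, i.e. 5 interlaced frames.

    # 2:3 pulldown: 4 progressive frames (24p) -> 10 fields (60i) -> 5 interlaced frames.
    # The classic cadence pairs fields as AA BB BC CD DD.
    def telecine_23(frames):
        """frames: list of 24p frames (length a multiple of 4);
        returns (top_field_source, bottom_field_source) pairs."""
        out = []
        for i in range(0, len(frames), 4):
            a, b, c, d = frames[i:i + 4]
            out += [(a, a),   # clean frame from A
                    (b, b),   # clean frame from B
                    (b, c),   # mixed frame: B top field, C bottom field
                    (c, d),   # mixed frame: C top field, D bottom field
                    (d, d)]   # clean frame from D
        return out

    print(telecine_23(["A", "B", "C", "D"]))
    # [('A', 'A'), ('B', 'B'), ('B', 'C'), ('C', 'D'), ('D', 'D')]

    The two "mixed" frames are where interlacing artifacts come from if nothing deinterlaces them downstream.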

    What I'm guessing is happening in your case is that the downsizing algorithms are better in your TV than in your player, while the telecine algorithms happen to be better in your player than in your TV, with the side effect that the output is interlaced. That isn't too surprising, since telecine gives more natural motion going to 60i than to 30p (1080p60 is usually not a supported output option). Then, going to p60 from there is like bobbing.
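
    And "bobbing" in a nutshell (again just a toy NumPy sketch, not any particular TV's deinterlacer): each interlaced frame is split into its two fields, and each field is stretched back to full height, so 60 fields per second become 60 progressive frames per second.

    # Minimal bob deinterlace: one interlaced frame -> two progressive frames.
    import numpy as np

    def bob(frame):
        """frame: H x W array of interlaced lines; returns two full-height frames."""
        top = frame[0::2]                          # even lines = top field
        bottom = frame[1::2]                       # odd lines  = bottom field
        # crude line doubling; real bob deinterlacers interpolate the missing lines
        return np.repeat(top, 2, axis=0), np.repeat(bottom, 2, axis=0)

    interlaced = np.arange(8 * 4).reshape(8, 4)    # toy 8-line "frame"
    first, second = bob(interlaced)
    print(first.shape, second.shape)               # (8, 4) (8, 4)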

    There are a number of permutations you could try depending on the capabilities of your player and your TV, but it's common for one device to excel at one algorithm and the other to be better at others. That usually comes down to chipset & price.

    Scott

    <edit> Yeah, saw your 2nd post...Downsizing is almost always better looking than upsizing.
    Thanks, that explanation makes sense. I never thought about the fact that all I'm doing is choosing which machine does the down-converting, and the TV obviously does it a little better than the player. As you can see, I corrected my post: the resolution of my monitor is really 768, but the same theory should apply.


