VideoHelp Forum
  1. Member (Join Date: Jan 2006, Location: United States)
    I know what these refresh rates are, but I'm trying to better understand how they impact moving images & motion blur in LCD HDTVs. For this question let's use a 720p60 signal, which has 60 full frames per second.

    How does a 120 Hz refresh rate help reduce motion blur? If it's refreshing the display with duplicate data twice as often, then why bother? IOW, why display two identical 8 ms images instead of one 16 ms image?
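
    To pin down the numbers I'm using, here's a quick Python sketch (illustrative only; the 120 Hz panel behaviour and the rounded timings are assumptions, real sets won't be this tidy):

    # Rough timing sketch: how a 720p60 stream maps onto a 120 Hz panel
    # that simply repeats each source frame. Only meant to make the
    # "8 ms" and "16 ms" figures above concrete.

    SOURCE_FPS = 60        # 720p60 source: 60 full frames per second
    PANEL_HZ = 120         # hypothetical 120 Hz LCD panel

    source_period_ms = 1000 / SOURCE_FPS    # ~16.7 ms per source frame
    refresh_period_ms = 1000 / PANEL_HZ     # ~8.3 ms per panel refresh
    print(f"source frame period : {source_period_ms:.1f} ms")
    print(f"panel refresh period: {refresh_period_ms:.1f} ms")

    # With plain frame repetition, each source frame occupies
    # PANEL_HZ / SOURCE_FPS consecutive refreshes (here: 2).
    repeats = PANEL_HZ // SOURCE_FPS
    for frame in range(3):                  # first few frames only
        for copy in range(repeats):
            t = (frame * repeats + copy) * refresh_period_ms
            print(f"refresh at {t:5.1f} ms -> source frame {frame} (copy {copy + 1}/{repeats})")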

    If it’s interpolating data & creating a new image from the normal 60 Hz images, does that mean a 600 Hz plasma HDTV creates 9 interpolated images? That does not seem correct.

    If you’re not creating a new image it would seem you want the pixels to turn off & on as fast as possible.

    Exactly what is a 120 Hz (or 240 Hz) LCD HDTV doing to the original image before displaying it?
  2. Member (Join Date: Jun 2003, Location: United Kingdom)
    Don't be fooled by the "600 Hz" claims for a plasma. That isn't the refresh rate, but the marketing people hope you'll think it is. "600 Hz" (10 sub-fields per frame) is a 60 Hz refresh rate.
    Showing the same frame twice to give 120 Hz reduces flicker, not motion blur.
    Some TVs can create new interpolated frames, usually for 24 fps film display, which generally makes the movie look less like film and more like a TV soap shot on video.
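
    If it helps, here's a toy Python sketch contrasting the two cases: plain repetition of each 60 Hz frame versus inserting a crude "in-between" frame. Real sets use motion-compensated interpolation, not the simple average of neighbouring frames used below; the blend, the toy frame format and the function names are made up just to show where the extra 120 Hz frames come from.

    def double_by_repeat(frames):
        # Show every 60 Hz frame twice: same picture, held across two refreshes.
        out = []
        for f in frames:
            out.extend([f, f])
        return out

    def double_by_blend(frames):
        # Insert a crude in-between frame by averaging neighbouring frames
        # (a stand-in for real motion-compensated interpolation).
        out = []
        for a, b in zip(frames, frames[1:]):
            out.append(a)
            out.append([(x + y) / 2 for x, y in zip(a, b)])
        out.extend([frames[-1], frames[-1]])   # last source frame just repeats
        return out

    # Tiny fake 60 Hz sequence: a bright dot moving one pixel per frame
    # along an 8-pixel row.
    frames_60 = [[1.0 if col == i else 0.0 for col in range(8)] for i in range(4)]

    print(len(double_by_repeat(frames_60)), "output frames with repetition")
    print(len(double_by_blend(frames_60)), "output frames with blending")
    print("blended frame between frames 0 and 1:", double_by_blend(frames_60)[1])

    Either way the panel shows 120 pictures a second; the difference is whether the extra ones carry any new motion information.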
  3. Member (Join Date: Jan 2006, Location: United States)
    I guess I can understand 120 Hz reducing flicker if there were a lot of time between 60 Hz images & 120 Hz filled in the gaps. But I don't understand how two duplicate 8 ms images have less flicker than one "long" 16 ms image. I'm not saying that 120 Hz is not better & I'm not challenging it. I'm just trying to understand why.

    This may not be a valid way to look at things, but take for example viewing a projected 35 mm slide. It can stay on screen for a full second or more, which effectively means a zero refresh rate, and yet there is no flicker. Yet with video images a higher refresh rate is better than a lower one.

    IOW on one hand a zero refresh rate means no flicker & on the other hand a high refresh rate means no flicker. What is happening & how do we perceive images in between these two extremes?
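
    The closest I've come to putting numbers on the motion blur side is this rough sketch (illustrative only: the on-screen speed is made up, and it assumes the eye tracks the object smoothly while a sample-and-hold panel keeps each frame static for its whole hold time):

    # Back-of-the-envelope "sample and hold" blur: if the eye tracks a moving
    # object while the panel holds each frame static, the picture smears across
    # the retina by roughly speed x hold time. All figures below are assumed.

    speed_px_per_s = 1200                          # assumed on-screen motion

    for hz, hold_ms in ((60, 16.7), (120, 8.3), (240, 4.2)):
        blur_px = speed_px_per_s * hold_ms / 1000
        print(f"{hz:3d} Hz, {hold_ms:4.1f} ms hold -> ~{blur_px:4.1f} px of smear")

    # Note: the shorter hold only helps if the frame shown on each refresh is
    # actually new (original or interpolated) rather than a repeat of the last one.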


