I know what these are but am trying to better understand how they impact moving images & motion blur in LCD HDTVs. For this question let’s use a 720p60 signal which will have 60 full frames per second.
How does a 120 Hz refresh rate help reduce motion blur? If it’s refreshing the display with duplicate data twice as often then why bother? IOW why display two identical 8 ms pixels instead of one 16 ms pixel?
If it’s interpolating data & creating a new image from the normal 60 Hz images, does that mean a 600 Hz plasma HDTV creates 9 interpolated images? That does not seem correct.
If you’re not creating a new image it would seem you want the pixels to turn off & on as fast as possible.
Exactly what is a 120 Hz (or 240 Hz) LCD HDTV doing to the original image before displaying it?
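The numbers in the questions above (8 ms vs 16 ms holds, 9 interpolated frames) can be checked with quick arithmetic. This is only a sketch of the math, not how any specific TV is implemented:

```python
# Quick arithmetic for the hold times and interpolated-frame counts
# discussed above (a sketch, not any particular TV's implementation).

def hold_time_ms(refresh_hz):
    """How long each displayed image stays on screen, in milliseconds."""
    return 1000.0 / refresh_hz

def interpolated_frames(panel_hz, source_fps):
    """How many new in-between frames a TV would have to invent per
    source frame if every panel refresh showed a unique image."""
    return panel_hz // source_fps - 1

print(hold_time_ms(60))              # ~16.7 ms per frame at 60 Hz
print(hold_time_ms(120))             # ~8.3 ms per frame at 120 Hz
print(interpolated_frames(120, 60))  # 1 new frame between each 60 Hz pair
print(interpolated_frames(600, 60))  # 9 -- the figure questioned above
```

So a true 600 Hz panel fed 60 fps really would need 9 invented frames per original, which is part of why the plasma "600 Hz" number deserves suspicion.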
Don't be fooled by the "600 Hz" claims for a plasma. That isn't the refresh rate, but the marketing people hope you'll think it's the refresh rate. "600 Hz" (10 sub-fields per frame) is 60 Hz refresh rate.
Showing the same frame twice to give 120 Hz reduces flicker, not motion blur.
Some TVs can create new interpolated frames, usually for 24 fps film display, which generally makes the movie look less like a film and more like a TV soap shot on video.
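The interpolated frames mentioned above can be sketched with a simple linear blend between two source frames. Real TVs use motion-compensated interpolation, which is far more sophisticated; this toy version (names and values are my own) only illustrates where the in-between frames come from:

```python
# Minimal sketch of frame interpolation by linear blending.
# Real TVs use motion-compensated interpolation; this toy version
# only illustrates where the invented in-between frames come from.

def interpolate(frame_a, frame_b, steps):
    """Blend two frames (lists of pixel values) into `steps` extra
    frames evenly spaced between them."""
    out = []
    for i in range(1, steps + 1):
        t = i / (steps + 1)  # fraction of the way from frame_a to frame_b
        out.append([(1 - t) * a + t * b for a, b in zip(frame_a, frame_b)])
    return out

# 24 fps film on a 120 Hz panel: 120 / 24 = 5 refreshes per film frame,
# so 4 interpolated frames between each pair of originals.
between = interpolate([0, 0], [100, 50], steps=4)
print(len(between))  # 4
print(between[0])    # first in-between frame, 1/5 of the way from a to b
```

Because every displayed frame now shows motion at a slightly different position instead of repeating, the smooth result no longer has film's 24 fps cadence, which is the "soap opera" look described above.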
I guess I can understand 120 Hz reducing flicker if there were a lot of dark time between 60 Hz images & 120 Hz filled it in. But I don't understand how two duplicate 8 ms images have less flicker than one "long" 16 ms image. I'm not saying that 120 Hz is not better & I'm not challenging this. I'm just trying to understand why.
This may not be a valid way to look at things for some reason, but take for example viewing a projected 35 mm slide. It can be projected for a full second or more, which amounts to a zero refresh rate, and yet it has no flicker. With video images, though, a higher refresh rate is better than a lower one.
IOW on one hand a zero refresh rate means no flicker & on the other hand a high refresh rate means no flicker. What is happening & how do we perceive images in between these two extremes?
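One way to model the two extremes in that question is to look at light output over time. The key variable, as I understand it, is whether the display ever goes dark between images: an impulse display (CRT, plasma sub-fields) flashes briefly and then goes dark, while a slide projector or a sample-and-hold LCD emits light continuously. A toy sketch, with made-up numbers:

```python
# Toy model of light output over one 60 Hz frame period (~16 ms),
# sampled every millisecond. Values are relative brightness.
# Illustration only: real displays have more complex light curves.

FRAME_MS = 16

def impulse_display(flash_ms=2):
    """CRT/plasma-like: a short bright flash, then darkness."""
    return [1.0 if t < flash_ms else 0.0 for t in range(FRAME_MS)]

def hold_display():
    """LCD or slide projector: light stays on for the whole frame."""
    return [1.0] * FRAME_MS

# An impulse display goes dark between refreshes -- that repeating
# dark gap is what the eye can perceive as flicker. A hold display
# never goes dark, so a slide projected for a full second shows zero
# flicker even at "zero refresh rate".
print(min(impulse_display()))  # 0.0 -> dark interval exists
print(min(hold_display()))     # 1.0 -> never dark
```

In this model, flicker comes from the dark gaps, not from the refresh rate itself; raising the refresh rate helps an impulse display because it shortens and speeds up those gaps past what the eye can follow.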