I've always had a difficult time fully understanding the various concepts related to interlaced and progressive video. There's one question I think I have the answer to, but the answer doesn't seem to jibe with the general consensus I've observed: that progressive video is superior to interlaced.
It could be that I have my facts wrong, and if so, please correct me.
Interlaced video shows 60 "pictures" per second. Each picture is half the vertical resolution of the displayed video, and when each picture is displayed, a blank horizontal line is left between each pair of its lines. So if the video is 640x480, each picture making up that video is really only 640x240. I believe two of these "half pictures" make up a "frame", although I'm a little unsure about that.
Progressive video shows 30 pictures per second, each of which is the same size as the video resolution (if the video is 640x480, each picture in the video is 640x480). I'm pretty sure these pictures are called frames.
Assuming the video source is interlaced (meaning it was recorded with a camera that captured 60 half pictures a second), wouldn't that interlaced video be superior to a video of the same exact thing that was filmed in a "progressive way" (with a camera that captured 30 full pictures a second), at least where the "fluidity" of motion is concerned? An object moving across the screen would be shown in 60 different positions per second in the interlaced video and only 30 different positions in the progressive video.
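As a sanity check on my numbers, here's a rough Python sketch (it only counts raw pixels, ignoring compression, and assumes the NTSC-style rates I described: 60 half-pictures per second interlaced, 30 full pictures per second progressive):

```python
# Sanity check on the interlaced vs. progressive numbers above
# (assumes 60 half-height pictures/s interlaced, 30 full frames/s progressive).
WIDTH, HEIGHT = 640, 480

# Interlaced: 60 half-height pictures per second.
field_rate = 60
field_res = (WIDTH, HEIGHT // 2)          # (640, 240)

# Progressive: 30 full-height frames per second.
frame_rate = 30
frame_res = (WIDTH, HEIGHT)               # (640, 480)

# Both deliver the same number of raw pixels per second --
# interlacing trades vertical resolution for temporal resolution.
pixels_per_sec_i = field_rate * field_res[0] * field_res[1]
pixels_per_sec_p = frame_rate * frame_res[0] * frame_res[1]
print(pixels_per_sec_i)                       # 9216000
print(pixels_per_sec_i == pixels_per_sec_p)   # True
```

So at the same raw pixel rate, interlaced spends its pixels on more moments in time, and progressive spends them on more detail per moment.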
I know that having each interlaced picture at half the resolution of a progressive picture probably negatively affects the quality of the video, but isn't there some benefit to interlacing? It always seems like 1080p is viewed as superior to 1080i in every way, but if what I said above is true, is 1080i sometimes preferable, or at least could it be to some people's eyes?
The only thing I can think of that would make progressive always better than interlaced (assuming bandwidth isn't an issue) is if progressive is actually 60 full pictures per second. But I don't think that's the case, right? I realize I should probably just look that one up myself.
I know there are different frame rates like 24, 23.976, and 29.97, but to keep it simple I was just considering 30 frames per second.
Anyone care to shed some light?
Short answer:
Yes, 60 fields per second has better motion fluidity than 30p.
60p, as in 1280x720p/60, has the best of both: smooth motion and resolution, plus clean still frames.
480p and 1080p usually carry a 24p film source with less motion fluidity; film shooting technique is required.
1080p/60 uses so much bandwidth that it becomes impractical.
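To put rough numbers on that bandwidth point, here is a quick sketch comparing raw pixel rates of the formats mentioned (raw luma samples only; actual broadcast bandwidth depends heavily on compression):

```python
# Raw pixel rates for the formats discussed above -- a rough illustration
# of why 1080p/60 is so demanding. Real-world bandwidth also depends on
# chroma subsampling and compression, which this ignores.
def pixels_per_second(width, height, pictures_per_sec, interlaced=False):
    """Pictures are fields when interlaced (half height each)."""
    h = height // 2 if interlaced else height
    return width * h * pictures_per_sec

rates = {
    "1080i/60 (60 fields)": pixels_per_second(1920, 1080, 60, interlaced=True),
    "720p/60":              pixels_per_second(1280, 720, 60),
    "1080p/24":             pixels_per_second(1920, 1080, 24),
    "1080p/60":             pixels_per_second(1920, 1080, 60),
}
for name, r in rates.items():
    print(f"{name}: {r / 1e6:.1f} Mpixels/s")
# 1080p/60 works out to exactly double the raw pixel rate of 1080i/60.
```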
Long answer will have to wait.

Recommends: Kiva.org - Loans that change lives. http://www.kiva.org/about
Thanks for the info... it's starting to make a little more sense to me.
Say I have a Blu-ray player playing a 24 fps film on a TV that's only capable of displaying 1080i. Does the Blu-ray player output 48 fields per second? If not, does it output 24 frames per second, and does the TV convert those to 48 fields per second for display?
On a semi-related topic -- how does a non-CRT TV (LCD, for example) display interlaced video? From what I've read, I'm thinking it deinterlaces it, but I'm not sure why. Don't most flat panel TVs have a refresh rate of at least 60 Hz? If so, why can't they display 60 fields per second?
If they do deinterlace it, and I'm converting a DVD that contains interlaced video to another video format, should I deinterlace it, or is the hardware deinterlacing done by the TV better than the software deinterlacing I would use (I'm using Handbrake)?
Originally Posted by Brent212
In 1080i output mode, the 24p material from the disc is telecined to 1080i/29.97 over HDMI. Then the TV inverse telecines back to 23.976p.
A "60Hz" HDTV will then repeat the frames 2:3:2:3 to 59.94 fps (do the math) and display as 59.94p.
A "120Hz" HDTV will frame-repeat 5:5:5:5 to 119.88 fps, or will interpolate intermediate frames to attempt smoother motion.
A newer HDTV will accept a direct 23.976p feed from the Blu-ray player, skipping the telecine/inverse telecine process. In theory the results from a Blu-ray disc will be the same quality either way; the telecine process should be lossless.
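The 2:3 cadence described above can be sketched in a few lines of Python (a toy frame-repeat illustration of the whole-frame pulldown, not the actual field-level telecine ordering):

```python
from itertools import cycle

# 2:3 pulldown cadence: each film frame is shown alternately 2 or 3
# times, so 4 film frames become 10 display frames.
# That's a 2.5x rate increase: 23.976 fps * 2.5 = 59.94 fps.
def pulldown_2_3(film_frames):
    out = []
    repeats = cycle([2, 3])
    for frame, n in zip(film_frames, repeats):
        out.extend([frame] * n)
    return out

film = ["A", "B", "C", "D"]
display = pulldown_2_3(film)
print(display)                     # ['A','A','B','B','B','C','C','D','D','D']
print(len(display) / len(film))    # 2.5
```

Because the cadence only ever repeats whole frames, the TV can undo it exactly by dropping the repeats, which is why the round trip is lossless.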
Originally Posted by Brent212
Thus interlaced video needs to be deinterlaced or inverse telecined (for film source). Deinterlace quality varies from HDTV to HDTV. Generally, the newer the TV or the higher the price class, the better the deinterlacer. The main explanation for the difference in price between similar-size HDTV sets is the quality of the deinterlacer/scaler. This mainly affects SD source; 1080i deinterlace isn't as difficult.
In almost all cases the real-time hardware deinterlacer in the TV will perform better than non-real-time software deinterlacers, so interlaced video should remain interlaced when going to disc. Future HDTV deinterlacers will get even better.
Originally Posted by edDV
Also, deinterlacing from 30i to 30p leaves you with half the temporal resolution; motion will not be as smooth. Deinterlacing to 60p with a smart bob deinterlacer requires twice as much bandwidth.
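Here's a minimal sketch of the bob idea (deliberately the dumbest version, a line-doubling bob; a real "smart bob" interpolates the missing lines instead of just repeating them):

```python
# Toy "bob" deinterlace: split each interlaced frame into its two
# fields, then line-double each field back to full height, so
# 30 interlaced frames/s become 60 progressive frames/s.
def bob_deinterlace(frame):
    """frame: a list of rows (e.g. 480 rows of 640 pixel values)."""
    top, bottom = frame[0::2], frame[1::2]        # 240 rows each
    double = lambda field: [row for r in field for row in (r, r)]
    return [double(top), double(bottom)]          # two full-height frames

# One second of 30i video: 30 dummy frames of 480 rows x 640 pixels.
interlaced = [[[y % 256] * 640 for y in range(480)] for _ in range(30)]
progressive = [f for frame in interlaced for f in bob_deinterlace(frame)]
print(len(interlaced), "->", len(progressive))    # 30 -> 60
print(len(progressive[0]), "rows")                # 480 rows
```

The twice-the-bandwidth point falls straight out of this: every input frame yields two full-size output frames.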
Originally Posted by jagabo
Hardware deinterlacers have a hierarchy that roughly follows parts and royalty cost.
Budget HDTV sets and progressive DVD players (think Wal-Mart) use the cheapest chips and generic technology.
Name brand house processors come at several price levels. Extra cost is mostly royalties to patent owners and custom chip costs. Higher end technologies require multiple fields or frames of memory. Last year's top of the line processor usually becomes more affordable in the second year and is replaced at the top with new technology.
The highest end consumer products borrow or scale down technology from pro level gear.
Many of the key patents, dating from the '70s to the '90s, eventually expire and become available to generic chip makers, so the low end improves as well.