I am trying to find some software that can determine whether a video file, in this case MPEG-2, is interlaced or not. I have looked over many forum discussions on this topic and it seems that it is not possible for software to determine this, which I find very hard to accept.
If this is true, I am completely baffled as to how interlaced video is encoded. Surely the video player, whether it is a hardware device like a DVD player or a piece of software, must know how to display the video correctly, and surely that means knowing whether the video is interlaced? Otherwise how can the player know whether it should combine the fields of an interlaced video into frames? If it doesn't know, a player might well display a non-interlaced PAL video at 50 full frames per second instead of 50 fields per second, or an interlaced video at 25 fields per second instead of 25 frames per second, throwing away half the information.
Can someone explain how interlaced video is encoded (using MPEG-2 as an example) and why the player does not need to know if it's interlaced?
I am actually in the process of converting some MPEG-2 recordings into MPEG-4 H.264 in order to save space, using Avidemux. I have noticed that using a de-interlace filter helps with compression, resulting in smaller files, but also that the output is softer (blurrier) with less detail. Without the de-interlace filter there are considerable combing artifacts on panned images (on the video I am currently doing my testing on) when displayed on my PC LCD display (although these artifacts are not visible on my plasma TV), so evidently it is interlaced.
Anyone able to help?
Thanks.
-
You have two issues: how the video was encoded, and what the frames contain. Interlaced video can be encoded progressive (which screws up the colors) and progressive video can be encoded interlaced (it will still display correctly).
Each frame of true interlaced video contains two half pictures, called fields. One field is in all the even numbered scan lines, the other in all the odd numbered scan lines. The two fields are intended to be viewed separately and sequentially. So it's important to keep components of those two images separate when compressing and decompressing the video.
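As a rough illustration (a hypothetical numpy sketch of the idea, not anything a particular encoder actually does), the two fields are simply the even and odd scan lines of the stored frame. Assuming a decoded greyscale frame in a numpy array:

import numpy as np

# Hypothetical decoded frame: 576 scan lines by 720 pixels.
frame = np.zeros((576, 720), dtype=np.uint8)

top_field = frame[0::2, :]     # even numbered scan lines (0, 2, 4, ...)
bottom_field = frame[1::2, :]  # odd numbered scan lines (1, 3, 5, ...)

# In true interlaced material these two half pictures were captured
# 1/50th or 1/60th of a second apart, so they must be kept separate.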
Encoding interlaced involves how the encoder handles the video internally to make sure components of the two fields don't interfere with each other. Obviously, interlaced video should be encoded in interlaced mode. But a progressive video can be encoded in interlaced mode without damaging the video (except for a small loss of color resolution with 4:2:0 color subsampling).
Depending on the container and codec it can be easy or difficult for software to tell if a video is encoded interlaced. MPG and VOB for example flag the encoding mode. Any program can read that flag. But the software still won't know if the frames contain interlaced fields or progressive pictures. Some programs attempt to analyze the frames to determine if their content is interlaced or progressive. But such software often makes mistakes.
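As an aside, one way to read that stream-level flag (a hedged example, assuming ffprobe is installed and the stream actually carries the information) is something like the following. It only reports how the video is encoded, not what the frames contain.

import subprocess

# Ask ffprobe for the stream-level field order of the first video stream.
result = subprocess.run(
    ["ffprobe", "-v", "error", "-select_streams", "v:0",
     "-show_entries", "stream=field_order",
     "-of", "default=noprint_wrappers=1:nokey=1", "input.mpg"],
    capture_output=True, text=True)
print(result.stdout.strip())  # e.g. "progressive", "tt", "bb" or "unknown"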
The best way to tell if video is interlaced or progressive is by looking at it yourself using tools of known behavior. But what tools you use will depend on the container and codecs used. -
jagabo:
You say to encode in "interlaced mode" but I have not seen any setting that specifies the mode in Avidemux for H.264. I can insert a pre-processing de-interlace filter in the filters section, but have no idea how to specify whether I want interlaced output or not. Any idea where that might be or what it looks like?
I had just assumed that if it was interlaced in it would be interlaced out, or that if I inserted a de-interlacing filter then it would be interlaced in and progressive out.
Thanks. -
jagabo:
The plot thickens. I now know that I had the AVC encode set up to produce progressive scan output. So now I am faced with trying to understand how the combing effects did not show up on my plasma TV with the file I made that did not use a de-interlace filter. Surely the output must have been coded with the fields interleaved, so the combing should have shown up. How did the plasma somehow know that the progressive output had interlaced fields woven into its frames, and then apply some kind of de-interlace filter to it?
The more I learn the more confused I am getting. -
How did the video get to the plasma screen? Was it being played by the TV? By an external player hooked up via composite or s-video?
-
It was streamed from a PC using TVersity via wireless ethernet to a WD (Western Digital) Live Streamer TV media player connected to a Panasonic plasma via an HDMI cable. Note that the settings in TVersity have the transcode feature set to "never" so assuming this is working correctly the video should be streamed in its original format.
I have just realised that having combing artifacts in a video does not actually mean it's interlaced at all. It could be a progressive video made from an originally interlaced source that either did not use a de-interlace filter, used one that was incorrectly set up, or did a simple weave de-interlace, which would leave the combing artifacts in the video. This is now obvious to me since I have realised that the output from my AVC encode was progressive, and yet combing artifacts are visible.
I am now going to experiment with using interlaced output to see what happens then. I am afraid to use progressive output without a de-interlace filter as it will "freeze in" the combing artifacts. Using a filter gets rid of the combing artifacts, but results in a very slightly blurry picture with loss of detail which I don't like. At least if I produce an interlaced output I could use a de-interlace filter at some later stage.
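For anyone curious, a very crude way to put a number on the combing is to compare each scan line with its neighbours. This is a toy sketch of that idea only (not the method used by Avidemux or any tool mentioned in this thread), assuming a decoded greyscale frame in a numpy array:

import numpy as np

def comb_score(frame):
    # Compare each scan line with the average of the lines above and below it.
    # Fields captured at different moments make adjacent lines disagree strongly
    # on moving content, so the score jumps on combed frames.
    f = frame.astype(np.float32)
    neighbours = (f[:-2, :] + f[2:, :]) / 2.0
    return float(np.mean(np.abs(f[1:-1, :] - neighbours)))

# A score that jumps on panning shots but stays low on static shots suggests
# interleaved (combed) frames; a uniformly low score suggests progressive content.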
-
I am getting the following file sizes after conversion with various settings, all using a quality setting of 24:
383,124 KB - progressive scan using the ffmpeg de-interlace filter.
455,483 KB - progressive scan output with no de-interlace filter.
535,858 KB - interlaced output. -
I think I am beginning to come to an understanding of why people say you cannot determine if a video is interlaced. The confusion seems to be a result of people not being clear about what they mean when they use the term "interlaced". The same term is being used to denote whether the source material is interlaced (the original and real meaning of interlaced - a series of temporally separate fields) versus a video file where the information is merely stored in an interlaced format (a series of separate fields which may or may not be temporally separate).

In other words, you can take a progressive source and encode it as if it were interlaced, by splitting each progressive frame into two fields. However, the two fields are not temporally separate - they came from the same frame shot at the same time. This is a kind of pseudo-interlace, where the interlacing is only being used to package the video in a format suitable for display on a CRT via a simple piece of hardware, but the video is essentially progressive in the real sense of the word.
Of course, running a de-interlace function against this source will likely make it worse - it certainly can't be made "better" since it is already perfect.
-
And you can have progressive frames split into separate fields, then recombined out of phase:
Frames: 1 2 3 4 5 6
fields: 1t 1b 2t 2b 3t 3b 4t 4b 5t 5b 6t 6b
recombined out of phase: 1b2t 2b3t 3b4t 4b5t 5b6t 6b
Every frame looks interlaced but you can recombine the fields in phase and they all become progressive again:
out of phase: 1b2t 2b3t 3b4t 4b5t 5b6t 6b
split into fields again: 1b 2t 2b 3t 3b 4t 4b 5t 5b 6t 6b...
recombined in phase: 2t2b 3t3b 4t4b 5t5b 6t6b... <-- progressive again -
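A small numpy sketch of the same mechanics, using hypothetical frames just to make the re-weaving concrete:

import numpy as np

# Six hypothetical progressive frames of 8 scan lines; frame n holds 10*n + line number,
# so every scan line stays identifiable through the splitting and re-weaving.
frames = [np.arange(8).reshape(8, 1) + 10 * n for n in range(1, 7)]

def split(frame):
    return frame[0::2], frame[1::2]                 # top field, bottom field

def weave(top, bottom):
    out = np.empty((top.shape[0] * 2, top.shape[1]), top.dtype)
    out[0::2], out[1::2] = top, bottom
    return out

fields = []
for f in frames:
    t, b = split(f)
    fields += [t, b]                                # 1t 1b 2t 2b 3t 3b ...

# Recombine out of phase: pair each bottom field with the next frame's top field.
out_of_phase = [weave(fields[i + 1], fields[i]) for i in range(1, len(fields) - 1, 2)]

# Split again and re-pair each top field with its own bottom field (in phase).
rewoven = [weave(split(out_of_phase[k])[0], split(out_of_phase[k + 1])[1])
           for k in range(len(out_of_phase) - 1)]

# Every rewoven frame matches one of the original progressive frames (frames 2 to 5 here).
assert all(np.array_equal(r, frames[k + 1]) for k, r in enumerate(rewoven))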
-
Any decent encoding program should be able to analyze an MPEG file to determine if it's truly interlaced. I've never used Avidemux, but after opening and indexing any video for encoding using MeGUI you can run an analysis on it. When the AVISynth Script Creator window opens, the analysis function is under the Filters tab. MeGUI should determine the type of source with a reasonable degree of accuracy and automatically pick an appropriate de-interlacing method.
MeGUI has an option to Auto Force Film if the source is determined to be 95% film (or greater), which is enabled by default. I always disable that option.
The way I understand it (and I'm partly adding this so someone can correct my understanding if it's wrong), 23.976 fps DVDs should always be progressive, 25 fps DVDs can be progressive or interlaced, and 29.970 fps DVDs should always be either interlaced or have pull-down applied (or maybe some combination of the two).
I live in PAL-Land so I don't have to think about the horrors of NTSC too much, for us it's either progressive or interlaced. -
It seems a great shame we are still having to worry about interlaced video, when interlacing is a technique that was only really useful in the days of analogue CRT televisions. I would have hoped that it would be at the very least fading away by now. Since digital TV arrived at more or less the same time as flat screen TVs (LCD and plasma), which don't like interlaced source material and have to convert it prior to display, there was a wonderful opportunity to bury interlacing once and for all. Yet it seems most digital TV programs around the world, or at least in the US, UK and Australia, are still being transmitted mainly in interlaced format. This is a common feature of so many aspects of modern technology, where progress is hamstrung by backward compatibility with legacy technology, although in this case there seems little excuse for it.
TV broadcasters should have no problems converting interlaced program material, and film has always effectively been progressive. I see no sensible reason why digital TV stations still transmit interlaced material, especially when you consider that nearly all of it has to be converted prior to display on modern devices. It would surely be much more logical to transmit progressive material and have set-top boxes convert it to interlaced for display on old CRT devices. Time for interlacing to take its final bow and disappear. -
Even with digital formats (MPEG-2, H.264) interlacing reduces bitrate requirements. That's not really needed for film sources, but live sports (and news, etc.) are encoded as 60 different fields per second for smooth motion. 1920x1080 at 50 or 60 progressive frames per second would have suffered a bit from lack of bitrate.
Still, I would rather have seen the switch to digital include only progressive modes. And there's no reason the frame rate shouldn't have been more flexible; modern TVs can all perform their own frame rate conversion. For example, 24 fps film sources should be broadcast at 24 fps progressive (in the USA they are instead frame-repeated to 59.94 fps for 1280x720p60, or put through 3:2 pulldown for 1920x1080i30).
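To make that 3:2 pulldown concrete, here is a simplified sketch of the field cadence (an illustrative example only, ignoring the exact 1000/1001 NTSC rate adjustment):

# 3:2 pulldown: four film frames (A B C D) become ten fields (five interlaced frames),
# which is how 24 frames per second turn into 60 (really 59.94) fields per second.
film_frames = ["A", "B", "C", "D"]
cadence = [2, 3, 2, 3]          # how many fields each film frame contributes

fields = []
for frame, repeats in zip(film_frames, cadence):
    for _ in range(repeats):
        parity = "t" if len(fields) % 2 == 0 else "b"
        fields.append(frame + parity)

print(fields)
# ['At', 'Ab', 'Bt', 'Bb', 'Bt', 'Cb', 'Ct', 'Db', 'Dt', 'Db']
# Paired into frames: At+Ab, Bt+Bb, Bt+Cb, Ct+Db, Dt+Db
# - two of the five interlaced frames mix fields from different film frames.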
Video/TV engineers were stuck in the interlaced mindset of standard definition analog video. -
Yes but interlaced 1920x1080 is more like 1920x540 in terms of the actual detail delivered to the eye even when displayed on a device capable of displaying interlaced content, such as a CRT. When displayed on a flat panel TV, it has to be converted to progressive, which involves throwing detail away.
Unless you are viewing this stuff on a plasma, or a very expensive 200 Hz LCD, a lot of the motion detail is thrown away by the display anyhow.
Isn't there some other way this could be achieved? What about using 720p/60?
-
They were also stuck in the 25 / 30 fps mindset from the analogue TV days, originally determined by the mains frequency of the national power supplies (50 Hz in the UK and Europe giving 25 fps, 60 Hz in the US giving approximately 30 fps).
It really is about time that people sat down and formulated something entirely new. Quad HD (Cinema 4K) technology is beginning to be introduced - is this still to be 25 or 30 Hz interlaced too?
Much more sensible to use a film-based 24 fps than either of the world TV standard frame rates. Surely one day this will happen. But we must not hold our breath.
-
Not really. Your eyes are fairly good at integrating the two fields.
Better TVs will use smart bob algorithms which can retain all the detail in still areas. And when there's motion your eyes aren't good at picking out detail anyway.
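A "bob" deinterlacer in its simplest form just shows each field on its own, stretched back to full height. A toy sketch of that basic idea (not how any particular TV actually implements it; smart bobs additionally weave the two fields together in static areas to keep full vertical detail):

import numpy as np

def simple_bob(frame):
    # Split an interleaved frame into its two fields and line-double each one,
    # producing two full-height pictures per frame (50/60 per second instead of 25/30).
    top = frame[0::2].repeat(2, axis=0)
    bottom = frame[1::2].repeat(2, axis=0)
    return top, bottom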
Not really. Any 60 Hz TV can show 1920x1080i with a full 60 images per second. Watch any sports and you'll see it's much smoother than 24p film.
Yes, you get 60 different frames per second with video sources, the usual 3:2 frame repeat with film sources. But 1280x720 isn't a lot more resolution than standard definition. Especially for PAL viewers. -
Jagabo.
Resolution is definitely lost during interlacing. I have heard that the ratio is about 70% for interlaced content displayed on a CRT, giving approximately 756 lines of resolution for 1080i, hardly any better than 720p. On a flat panel monitor after de-interlacing I have no idea what the amount of lost information is in terms of a percentage, but I do know that an interlaced source I recently ran a de-interlace filter on showed very noticeable loss of detail when viewed on both LCD and plasma displays. Most sources I have read on the subject of de-interlacing suggest there is a very noticeable loss of detail during the process, which of course varies depending on which filter you use. Most opinions I have read state that the de-interlacing filters used on even the most expensive flat panel displays are worse than the software used on PCs, presumably explained by the lower processing power available in the TVs and the need to do the de-interlacing in real time.
Your comment about most displays being capable of showing detail during motion does not agree with most other sources I have read. Plasma TVs are commonly known to be better at motion than LCDs, and yet if you read the documentation on Panasonic's web site, you will see that only their high-end models can properly show the full 1080 lines of resolution on moving images. LCDs are significantly worse than plasmas in this regard. Most LCDs that claim high refresh rates use techniques (overdrive) that cause annoying artifacts, and most reviewers recommend turning these off. There is a trade-off between detail and smooth motion anyhow - increase the detail and the motion gets jerky; reduce the jerkiness and some detail is lost.
I cannot comment on anything to do with 3:2 pulldown since I (thankfully) live in a PAL country where such atrocities are not needed.
I do agree that 1080i makes much more sense in the US, where standard def is a very low 480 lines compared to PAL's 576 lines. It makes much less sense in a PAL country. -
One thing I have discovered during my recent adventures with interlaced content is how well my plasma TV covers it up. Interlaced video that I have converted to progressive without using any de-interlacing shows very visible combing on my LCD TV and PC displays, but it's almost invisible when shown on my plasma. Since the content is progressive, none of the devices should be trying to de-interlace the material (which I imagine would be impossible anyhow), so the difference would appear to be the result of the fundamental way LCDs and plasmas display the picture. I am guessing that it is the sample-and-hold type display of the LCD which maximises the visibility of the combing, whereas the rapid "flashing" type display of the plasma is effective in hiding it.
Interesting that I have never read of this before in any comparisons I have seen between LCD and Plasma TV's. Perhaps because relatively few people would be viewing converted progressive content?