VideoHelp Forum
1. Member (joined Dec 2015, Amsterdam, The Netherlands)
Solving the problems with my video clips by trial and error is just too frustrating; I wish I knew more about it, but I can't seem to find information that is comprehensible at my very modest level of knowledge. For instance: I know more or less what a 'codec' is supposed to do, but many practical questions remain, for instance:
1. Are codecs installed under the OS, for common use? Or do video players come with their own codecs? Or is it a combination of the two? (I seem to have read, for instance, that VLC player installs its own codecs. If so: why?) This question is prompted by the fact that, on my notebook, WMP plays a certain video well, VLC plays it jerkily, while IrfanView says it's missing a codec.
2. As I would expect, some video formats seem to require more processing power to play than others. However, when I compare different computers, I can't make head or tail of the results: an ageing notebook plays almost no HD format well, except for m2t files, which it plays flawlessly with Windows Media Player (but not with other players). Our mini-notebook has no trouble handling MTS files (of which I read that they require lots of power) but plays the m2t files very badly. Our (Android) tablet plays MTS files jerkily and without sound, but handles m2t files fine. I used the same files on all three systems, of course, and tried lots of other formats, but you get the idea. Quite obviously, these differences cannot be explained by mere differences in CPU power. So what else could be the explanation?
    Mabel
1 - Nowadays many systems have very limited codec support; common practice is to avoid codecs that require a license fee. Common codecs covered by royalties are MPEG-2, H.264, H.265, MP3, etc. Sometimes the OS vendor pays this license fee and the OS user pays for it as part of the OS price.
Of course there are a large number of more or less free codecs; they usually don't gain much popularity due to the lack of wide industry acceptance.

2 - This is more related to the HW capabilities of your decoder. Pure software decoders use only the CPU, and they usually suffer from the symptoms you describe. Nowadays many modern SoCs support some HW decoding acceleration, which is usually the best possible option from the customer's perspective; common HW decoders supported by today's SoCs handle MPEG-2 and H.264. In addition to the HW capability, you need software capable of using the HW decoder itself, and that is a completely different problem, as those HW decoders are usually proprietary and freely available documentation usually doesn't exist. Intel uses its QSV technology, NVIDIA uses NVDEC, etc.
My advice is simple - use software capable of using the HW acceleration your hardware supports.
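As a quick illustration (a rough sketch only - it assumes ffmpeg is on your PATH, and "clip.mts" is a placeholder file name), you can compare software and hardware decoding speed like this:
Code:
# Rough benchmark sketch: decode the same clip twice with ffmpeg and
# compare wall-clock times. "clip.mts" is a placeholder file name.
import subprocess, time

def decode_seconds(extra_args):
    start = time.time()
    # "-f null -" decodes every frame but writes no output file
    subprocess.run(["ffmpeg", "-v", "error", *extra_args,
                    "-i", "clip.mts", "-f", "null", "-"], check=True)
    return time.time() - start

sw = decode_seconds([])                    # pure software decoding
hw = decode_seconds(["-hwaccel", "auto"])  # let ffmpeg pick a HW decoder
print(f"software: {sw:.1f}s  hardware: {hw:.1f}s")
If the second run is not clearly faster (or errors out), that machine is probably falling back to software decoding.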
  3. Originally Posted by Mabel View Post
    1. Are codecs installed under the OS, for common use?
    Some are, some are private to (built into) an application.

    Originally Posted by Mabel View Post
Or do video players come with their own codecs? Or is it a combination of the two?
    Some players come with their own built in decoders. In addition, some allow use of system installed decoders instead. WMP uses only system installed decoders.

    Originally Posted by Mabel View Post
(I seem to have read, for instance, that VLC player installs its own codecs. If so: why?)
    VLC comes with built in decoders for most common codecs. It cannot use system installed decoders. It can use the graphics card's ability to decompress some codecs.

    Originally Posted by Mabel View Post
2. As I would expect, some video formats seem to require more processing power to play than others... these differences cannot be explained by mere differences in CPU power. So what else could be the explanation?
Some decoders use only the CPU to decompress video. Some can also use the graphics card's ability to decompress video (older graphics cards don't support hardware decoding, and some work better than others). Some players handle certain containers (MTS, M2T, M2TS, TS, MPG, VOB, MKV, MP4, MOV, WMV, AVI, FLV, etc.) better than others. You can often improve a player's performance by adjusting its internal settings. Graphics card drivers often interfere with some players, so updating drivers may fix a player or break it.

    It's a jungle.
4. Member (joined Dec 2015, Amsterdam, The Netherlands)
Thanks a lot for the replies.
@Pandy: If I understand you correctly, you say it's all in the interplay between codecs and specific HW. No doubt you've got a point there, but as an answer to my question I find this hard to understand. I mentioned several video formats (e.g. MTS, m2t, m2ts) and, as far as I can find out, these formats ALL contain streams compressed with the same MPEG-2 codecs. If it were only HW support for this particular compression that makes the difference, our computers would consistently stack up in the same order, but, as I described, they do NOT.
How can that be explained? I think there's more to it than just hardware assistance.

    @Jagabo "It's a jungle"
    So, now you tell me! Thanks a lot - I'm already lost in that jungle!
But anyway, your very helpful reply confirms some things I'd come to suspect. And am I right to understand that reading a container format is a function of the media player? That this has nothing to do with codecs and hardware acceleration? This would make sense to me, as my limited knowledge tells me that 'codecs' are bits of software which take on the task of compressing/decompressing video or audio and for some of these the hardware, notably the Graphics Processor, may contain special functions to speed things up. Correct?
(Or is reading a container format a 'shared' function of some OSes (notably Windows) too? I can't seem to find out.)

Another thing I learned from your reply is that graphics drivers form the surprise factor: while I can easily make sure I have the latest versions of player, codec and OS, and can also, albeit with some more effort, read up on bugs in each of those, the bugs (or 'updates') in video drivers are a very dark area, and I had not sufficiently realised their role in all this.

With all this in mind, though, I still struggle to explain the differences between my computers, since on all of them I use the same software to play the videos (WMP, VLC and IrfanView).

It's all become way too complicated, I feel. Part of the problem, for me at least, seems to be the lack of understandable documentation, and internet articles which (partly) contradict each other.
For instance, the MTS files my camera produces usually play lousily; they do much better when rendered in the m2t format. Why? Wikipedia redirects from m2t directly to MTS and treats them as the same format. But they obviously are not, although both contain MPEG-2 encoded streams.

I have one more question. I would expect it to be easy to find out what codecs are installed under the OS. But the program I now use, GSpot, is quite often unable to determine whether or not a certain codec is installed. How can this be? Is GSpot's maker a dummy (I suspect not), or is the Windows approach just such a mess?

    Mabel
  5. Originally Posted by Mabel View Post

For instance, the MTS files my camera produces usually play lousily; they do much better when rendered in the m2t format. Why? Wikipedia redirects from m2t directly to MTS and treats them as the same format. But they obviously are not, although both contain MPEG-2 encoded streams.

    I have one more question. I would expect it to be easy to find out what codecs are installed under the OS.
.MTS is an extension most commonly associated with AVCHD files, a format used by many modern HD camcorders. AVCHD uses MPEG-4 AVC (H.264) encoding rather than MPEG-2.

The .m2t extension is used with older HDV files, which are encoded as MPEG-2.

For files of equivalent quality, the MPEG-2 files will be larger than the MTS files. MPEG-4 is a more highly compressed format, which demands more computing power to decode.
Hence the older, larger, but less compressed .m2t files play more smoothly: they need a lot less computing power.
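If you'd rather verify what's actually inside a given file than trust the extension, here's a minimal sketch (it assumes ffprobe, which ships with ffmpeg, is installed; the file name is a placeholder):
Code:
# Minimal sketch: ask ffprobe which codecs a file actually contains.
import json, subprocess

out = subprocess.run(
    ["ffprobe", "-v", "error", "-print_format", "json",
     "-show_streams", "clip.mts"],
    capture_output=True, text=True, check=True).stdout

for stream in json.loads(out)["streams"]:
    print(stream["codec_type"], stream.get("codec_name"))
# an AVCHD camera .mts will typically print "video h264", not "video mpeg2video"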

    To see what codecs you have installed on your machine, try this little program: http://www.updatexp.com/sherlock-codec-detective.html

    It's free and it doesn't need to be installed. Just run the .exe.
  6. Originally Posted by Mabel View Post
And am I right to understand that reading a container format is a function of the media player? That this has nothing to do with codecs and hardware acceleration?
    More or less. Reading the container and splitting audio and video (and other) streams can be performed by the player or by system installed filters. A player will piece together a "filter graph" of internal and/or system components to play a media file. Here's what a typical graph looks like:

[Attached image: graph.png - a typical filter graph]

Starting on the left you have a file reader, followed by a file splitter (aka demultiplexer), then audio and video decompressors and renderers. Sometimes the reader and splitter are the same filter. Sometimes other filters are added to overlay subtitles, convert the colorspace, etc. For system installed filters there is a priority system that specifies which filters take precedence when you have multiple filters that can perform the same function. There are also programs that let you change those priorities.

The idea is to have a flexible, modular system where anyone can create their own containers, codecs, and renderers and have them play on any Windows computer simply by installing the filters. Unfortunately, along with that flexibility comes complexity.
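To make the idea concrete, here's a toy sketch of such a graph (plain Python, emphatically not the real DirectShow API): each stage only has to agree with its neighbours on the interface, and "building the graph" is just plugging compatible stages together.
Code:
# Toy model of a filter graph (illustration only -- not DirectShow).
# Each "filter" is a generator; the player composes whichever ones fit.
def file_reader(path):              # source filter
    with open(path, "rb") as f:
        yield from iter(lambda: f.read(188 * 64), b"")

def splitter(packets):              # demultiplexer (stubbed)
    for chunk in packets:
        yield ("video", chunk)      # a real splitter separates A/V here

def decoder(stream):                # decompressor (stubbed)
    for kind, payload in stream:
        yield (kind, len(payload))  # a real decoder outputs raw frames

def renderer(frames):               # renderer: present the result
    for kind, size in frames:
        print(kind, size, "bytes decoded")

# "Building the graph" = plugging compatible stages together:
renderer(decoder(splitter(file_reader("clip.mts"))))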

    Originally Posted by Mabel View Post
    This would make sense to me, as my limited knowledge tells me that 'codecs' are bits of software which take on the task of compressing/decompressing video or audio and for some of these the hardware, notably the Graphics Processor, may contain special functions to speed things up. Correct?
    Yes.

    Originally Posted by Mabel View Post
    For instance, the MTS files my camera produces usually play lousily; they do much better when rendered in the m2t format. Why?
Within a container there are usually many options, and different programs (or hardware) may use different options when muxing streams. I suspect that the spec is in many cases incomplete or unclear, so one programmer may interpret it differently from another, leading to problems during playback.

    Originally Posted by Mabel View Post
I would expect it to be easy to find out what codecs are installed under the OS. But the program I now use, GSpot, is quite often unable to determine whether or not a certain codec is installed. How can this be?
    I don't know about GSpot in particular but there are filter editors that will show you installed filters, their priorities, etc. But they can only show system installed filters. They will not show you filters built into various players and editors, and not available to other programs.

Some players, like MPC-HC, will let you see what filters are being used during playback.
7. Member (joined Dec 2015, Amsterdam, The Netherlands)
    Originally Posted by pippas View Post
.MTS is an extension most commonly associated with AVCHD files, a format used by many modern HD camcorders. AVCHD uses MPEG-4 AVC (H.264) encoding rather than MPEG-2.
Thanks, Pippas, for your answer, which makes a lot of sense to me and explains several of my problems. I'd been wrong-footed, I now realise, by other sources. A Wikipedia search for 'MTS' redirects to the article '.m2ts', which says: "It is based on the MPEG-2 transport stream container." Another Wikipedia search for 'm2t' leads to the article 'MPEG transport stream', which says: "MPEG transport stream (MPEG-TS, MTS or TS) is a standard container format [...] specified in MPEG-2 Part 1." etc. By then, the notion of 'MPEG-2' was firmly lodged in my head. The analyser I use, GSpot, didn't help either: it gives the file type as "MPEG-2 Transport Stream" and the MIME type as "video/mp2t" but says, to my dismay, nothing at all about the contents of this container.
By now I am aware that all this concerns just the container and, reading more closely, I now see that it can indeed contain MPEG-4 encoded video, and my files probably do, as the makers of amateur-class video cameras no doubt strive to get as much video onto the memory card as possible.

However, although your answer does take away some of my question marks, I still struggle to see how it can fully explain the differences I observed between the various computers I tried and the various players I used. But maybe I should give up on my question now that I seem to be getting closer to practical solutions for my practical problems.

    Mabel
PS: The "Sherlock Codec Detective" utility you mentioned seems to work fine on my Windows 7 computer - quite illuminating indeed, and thanks for mentioning it! Yet the website clearly states the program is meant for Windows XP only, so when it comes to explaining inexplicable malfunctions, I doubt whether I can rely on this utility. We'll see.
8. Member (joined Dec 2015, Amsterdam, The Netherlands)
    Originally Posted by jagabo View Post
    Reading the container and splitting audio and video (and other) streams can be performed by the player or by system installed filters. A player will piece together a "filter graph" of internal and/or system components to play a media file. Here's what a typical graph looks like:
Wow, that's quite an illuminating chart you give. And impressive too! I am now starting to see what the programmers devising these mechanisms had in mind. I am also starting to see why most of the people I know don't even bother anymore to copy their videos to a computer; they play them straight from the SD card by simply connecting their video camera to a TV, or watch them only on the smartphone or tablet they made them with. As a former IT manager (albeit in a very different field) I'd say that this level of interdependence between components of very different origin is bound to create problems. It is, in my experience, practically impossible to formulate a standard which cannot be interpreted in different ways.
    Thanks again for your responses and the chart; it already helped me to solve some problems by experimenting in a somewhat more systematic way.
    Mabel
9. Member (joined Jul 2009, United States)
You may want to try MediaInfo instead of GSpot. MediaInfo will give you more information on the audio and video streams within the files. You will probably be able to find some common denominator; start with things like bit rate or resolution.
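If you ever want the same information from a script, here's a small sketch (it assumes the pymediainfo package, a Python wrapper around the MediaInfo library; the file name is a placeholder):
Code:
# Small sketch using pymediainfo ("pip install pymediainfo").
from pymediainfo import MediaInfo

for track in MediaInfo.parse("clip.mts").tracks:
    if track.track_type in ("Video", "Audio"):
        # pymediainfo returns None for fields MediaInfo doesn't report
        print(track.track_type, track.format, track.bit_rate)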
10. Member (joined Dec 2015, Oregon, USA)
Also, just for clarity: file extensions (avi, mpg, mts, mp4, m2t, etc.) only indicate the container, and do not necessarily tell you what video/audio streams are contained within. As zing269 says, MediaInfo will tell you more accurately what the A/V streams inside the container are. MPEG-2, MPEG-4, AVC, MPEG audio, Dolby Digital audio, AAC audio, etc. are different A/V streams that can be put within said containers. Subtitle streams of various types can also be added. And things can start to get complicated...

Also, anyone can change a file's extension, so the extension doesn't necessarily reflect the container. Most media players will examine the contents of the file to determine what the true container is. The term "container" here means how the data is organized internally; different containers have different ways of organizing the data within them.
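You can check this for yourself: the true container is identified by well-known "magic" bytes near the start of the file, not by the name. A minimal sniffer sketch (it covers only a few common containers):
Code:
# Minimal container sniffer: look at the first bytes, ignore the extension.
def sniff(path):
    with open(path, "rb") as f:
        head = f.read(12)
    if head[4:8] == b"ftyp":                  # MP4/MOV family
        return "MP4/MOV"
    if head[:4] == b"\x1aE\xdf\xa3":          # EBML header -> MKV/WebM
        return "MKV/WebM"
    if head[:4] == b"RIFF" and head[8:12] == b"AVI ":
        return "AVI"
    if head[:1] == b"\x47" or head[4:5] == b"\x47":  # TS sync byte 0x47
        return "MPEG transport stream"
    return "unknown"

print(sniff("whatever.avi"))  # reports the real container, whatever the name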
12. Member (joined Mar 2011, Nova Scotia, Canada)
Originally Posted by Mabel View Post
... I'd say that this level of interdependence between components of very different origin is bound to create problems. It is, in my experience, practically impossible to formulate a standard which cannot be interpreted in different ways...
I think you've pretty much hit the nail on the head there. The state of standardization in digital video is ridiculous, even by IT standards.

Another thing with older GMAs (Intel's integrated graphics) is that they may lack support for newer DirectX/OpenGL versions.
13. Member (joined Dec 2015, Amsterdam, The Netherlands)
@zing269: Thanks a lot; I am now using MediaInfo, which indeed provides a lot more info and is very helpful to me.

@Southstorm and @jagabo: I think I do understand the concept of a container; my original posting made that clear, I hope. And by now I do understand that an MPEG-2 Transport Stream (MTS) can contain MPEG-4 encoded video, much like an MP4 file, but differently organised. Also, it's not just with videos that the file extension doesn't adequately describe the file type.
My original questions were in fact partly about these differences in the organisation of the container. Is it true that one container is more difficult to read than another (supposing they contain the same streams, of course)? Notably, is there any truth in the claim I read in numerous places that the MTS files of modern amateur-class video cameras are 'notoriously difficult to read' and 'require lots of CPU power, which many devices don't have'? And if that is true, wouldn't remuxing to an easier-to-read format be profitable? And why would camera makers choose such a difficult-to-read format?
These questions still stand, but I have to add: what prompted them was the difference in behaviour between the original MTS files and the MP4 files produced by a converter, which otherwise seemed inexplicable to me, as both files seemed to contain the same video streams. By now I have learned (thanks to another thread I started under the heading 'MP4 headaches', since the heading of this thread didn't seem appropriate anymore) that the files that did badly contained an interlaced stream, while the files that did well contained a non-interlaced (25p) stream. Is that indeed the explanation for their different behaviour? I'd like to think so, but nobody really confirmed it.
Moreover, I do not really understand WHY. I can see how real-time de-interlacing can take lots of CPU time, but I'm at a loss to see why de-interlacing is even necessary. (But with these last questions I am duplicating what I asked in the other thread, so I'll stop here.)

I'm still curious to find answers to the above questions about MTS, though, since they mean the difference between finding solutions in faster hardware, in other video players, or in some pre-processing to make the files more easily readable. Meanwhile, lots of thanks to this forum and the people on it!
    Mabel
14. Member (joined Dec 2015, Oregon, USA)
MTS files are only difficult to read because they can contain various A/V types. This means any hardware that is to read them reliably MUST support all those A/V types, which is not practical on most hardware; hence the headaches. The same applies to software players on PCs, though it is easier to deal with on a PC.

    Regarding interlaced and progressive, this has a long history and is the cause of many problems when editing video as it's easier to edit a complete (progressive) frame than a separated (interlaced) frame. There is nothing good about interlacing. It is an old technology that has remained for backward compatibility with older systems. But since it still exists, we deal with it.

15. Member (joined Dec 2015, Amsterdam, The Netherlands)
    Originally Posted by Southstorm View Post
    MTS files are only difficult to read because they can contain various A/V types [...] hence the headaches.
I'm not really sure what you mean here. This would explain why some players/devices can't read those files at all, but that was not the problem I posed.

    Originally Posted by Southstorm View Post
    Regarding interlaced and progressive, this has a long history and is the cause of many problems when editing video as it's easier to edit a complete (progressive) frame than a separated (interlaced) frame. There is nothing good about interlacing. It is an old technology that has remained for backward compatibility with older systems. But since it still exists, we deal with it.
What you say about video editing makes sense to me; surely those half-frames of an interlaced format shouldn't show up in my editor unless 'merged' (by whatever de-interlacing method) into proper frames.
Yet the rest of what you're saying I do not really understand, although I have read many remarks in the same vein on the web, but never with an explanation. Please take into account that this is the forum for newbies!
As I understand it, 'interlacing' was invented to make movements seem more fluid while using the same bandwidth and picture resolution. It was a great improvement at the time. My point is that I cannot really see what has changed since then. We still struggle with bandwidth when watching video online, all the more now that everyone is moving to 'the cloud'. No internet connection will even try to provide you with 1920x1080 movies. Or, when watching videos on a computer or tablet, we struggle with CPU/GPU power, which comes down to the same thing. Therefore, interlacing video frames still seems to me a good idea, for the very same reasons it was invented.
I know I'm asking a lot, but PLEASE explain to me WHY you think "there is nothing good about interlacing" and why it has remained with us only for backward compatibility? I mean: what are the technical advances which have made interlacing superfluous and/or undesirable?
    Mabel
  16. Originally Posted by Southstorm View Post
    Regarding interlaced and progressive, this has a long history and is the cause of many problems when editing video as it's easier to edit a complete (progressive) frame than a separated (interlaced) frame.
This is gibberish. An interlaced frame IS a complete frame. The only interlacing problems in editing occur when you're trying to combine clips with mismatched field order. Indeed, de-interlacing can cause greater problems by generating blended frames that can't be undone.
17. Member (joined Dec 2015, Oregon, USA)
    Well, an interlaced frame is NOT always a complete frame. If there is high motion, you can have combing artifacts.

The short story is, interlacing was used to reduce bandwidth on old broadcast systems. The argument for "smoother motion" is moot with the advent of newer standards. If smoother motion is needed now, a higher frame rate is used. My local broadcast system is ATSC. It allows for 1080@30p, 720@60p and other lower combinations. The old NTSC system was interlaced only, with resolutions of 720x480 and lower.

    The difference is night and day.
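Some back-of-the-envelope arithmetic makes the trade-off concrete (raw pixel rates, before any compression; just an illustration):
Code:
# Raw pixel rates in pixels per second, ignoring compression entirely.
print(1920 * 1080 // 2 * 60)  # 1080i60: 60 half-height fields/s ->  62,208,000
print(1920 * 1080 * 30)       # 1080p30: 30 full frames/s        ->  62,208,000
print(1280 * 720 * 60)        # 720p60:  60 full frames/s        ->  55,296,000
print(1920 * 1080 * 60)       # 1080p60: full spatial + temporal -> 124,416,000
1080i60 buys 60 motion samples per second for the same raw rate as 1080p30, while full 1080p60 costs double - which is why broadcasters liked interlacing, and why 720p60 became the progressive compromise.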

18. Member Cornucopia (joined Oct 2001, Deep in the Heart of Texas)
No, a straight interlaced frame is always a complete frame. I think what you're trying to say is that a telecined frame may not represent a complete frame of the original source. But in computer files there is no such thing as a partial/incomplete frame; such streams should rightly be considered corrupted.

Interlacing has been carried over into all but the very newest/highest standards, so it is NOT a moot point.
ATSC still to this day includes common support for interlacing, and many stations showing HD are doing so using 1080i. Still pertinent.
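To picture why a woven frame is complete, here's a toy sketch (it assumes numpy; the values are arbitrary) that interleaves two fields into one frame. "Combing" is just the two fields disagreeing where something moved between their capture times, not missing data:
Code:
# Toy weave: interleave two fields into one complete frame (assumes numpy).
import numpy as np

top    = np.full((540, 1920), 100, dtype=np.uint8)  # field 1 (even lines)
bottom = np.full((540, 1920), 200, dtype=np.uint8)  # field 2 (odd lines)

frame = np.empty((1080, 1920), dtype=np.uint8)
frame[0::2] = top      # even rows come from the top field
frame[1::2] = bottom   # odd rows come from the bottom field
# frame is a complete 1080-line picture; if the scene moved between the
# two field times, alternating rows disagree -- that's "combing".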

And any worthwhile editing program should know how to handle interlaced frames properly, just as easily as progressive frames.

    And there's no need to sidetrack the thread. The OP never mentioned any problems specific to interlacing.

    ********

The way I see it, the more open and optioned a container is, the more versatile it is for multiple uses, but the more burden is placed on the decoding app/device, to the point where there is a fracturing into tiers of capabilities. And a mismatch between those tiered apps/devices and the tiered sources is the cause of all the problems you see.

    Scott
  19. Of course, the fact that MTS can contain different codecs is not the main cause. AVI, MP4, and MKV can contain a much wider array of codecs and they are much less problematic.

    I suspect the major problem is that most free, open source tools use ffmpeg as the source filter. And ffmpeg has problems with some transport streams.
20. Member Cornucopia (joined Oct 2001, Deep in the Heart of Texas)
That, plus the spec allows for different packet sizes in the container (dependent upon the application), which are not mutually compatible between devices.
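Concretely: plain broadcast TS uses 188-byte packets, each starting with the sync byte 0x47, while AVCHD/Blu-ray M2TS puts 4 extra header bytes before each packet, giving 192. A quick sketch to tell them apart (the file name is a placeholder):
Code:
# Guess the transport-stream packet size from where the 0x47 sync bytes fall.
# 188 bytes = plain broadcast TS; 192 = M2TS (4 extra bytes before each sync).
def ts_packet_size(path, probes=50):
    data = open(path, "rb").read(192 * probes)
    for size, offset in ((188, 0), (192, 4)):
        if all(data[offset + i * size : offset + i * size + 1] == b"\x47"
               for i in range(probes)):
            return size
    return None

print(ts_packet_size("clip.mts"))  # AVCHD camera files typically print 192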

    Scott
21. Member (joined Dec 2015, Amsterdam, The Netherlands)
    Originally Posted by Cornucopia View Post
    there's no need to sidetrack the thread. The OP never mentioned any problems specific to interlacing.
Indeed, I did not. But my original questions have been answered as well as I could have hoped for, and the problem I'm still wrestling with is indeed interlacing.
I've read the various responses with much interest, although I'm not sure I really understand them. I am left with some questions, though:
@smrpix and @Cornucopia: you say "An interlaced frame IS a complete frame." I can see this is true in the sense that two halves make one whole, but your comment regarded Southstorm's contention that "this has a long history and is the cause of many problems when editing video as it's easier to edit a complete (progressive) frame than a separated (interlaced) frame." I cannot see what's wrong with Southstorm's statement. I do not know of any video editor able to work with the 'half-frames' of an interlaced format; as far as I know they are always de-interlaced to 'complete frames'.

As I now understand, a number of de-interlacing methods can be used, each with specific advantages and disadvantages. I understand that some methods give better results when camera and subject are more or less stationary, while other methods give a better result when the camera or the subject is moving swiftly.
This I can understand: the half-frames of an interlaced format may differ a lot when something is moving, and this may well require de-interlacing methods different from those one would use when both half-frames were taken of a more stationary subject.
There's a lot I do NOT understand, however:
    Originally Posted by Southstorm View Post
The short story is, interlacing was used to reduce bandwidth on old broadcast systems. The argument for "smoother motion" is moot with the advent of newer standards. If smoother motion is needed now, a higher frame rate is used. My local broadcast system is ATSC. It allows for 1080@30p, 720@60p and other lower combinations. The old NTSC system was interlaced only, with resolutions of 720x480 and lower.
The difference is night and day.
I don't get this. As I understand it, interlacing was invented to get more fluid movement within the same bandwidth. And are we not struggling with bandwidth more than ever, now that we are moving from 'broadcast' to 'webcast' and even 'on demand'? Moreover, you say your local broadcast is 30p - that is, in frames per second, the same speed it always was. What has changed? Yes, the resolution has changed considerably, but what has that got to do with interlacing? You say "If smoother motion is needed now, a higher frame rate is used", but where is that happening? And why would 'smoother motion' NOT be needed now if it was desirable in the past?

Thus, I still fail to see why interlacing is outmoded. Yet it seems to me that computers often struggle to play interlaced video. I gather this is (at least partly) because the media player has to de-interlace the video in real time before showing it. That would at least explain why media players and CPUs are struggling, but it doesn't explain why this de-interlacing is necessary in the first place.
    Mabel
  22. Originally Posted by Mabel View Post
I can see this is true in the sense that two halves make one whole, but your comment regarded Southstorm's contention that "this has a long history and is the cause of many problems when editing video as it's easier to edit a complete (progressive) frame than a separated (interlaced) frame." I cannot see what's wrong with Southstorm's statement. I do not know of any video editor able to work with the 'half-frames' of an interlaced format; as far as I know they are always de-interlaced to 'complete frames'.
    The statement is simply wrong. It's wrong about how editors work, it's wrong about interlaced video being "half" frames. It's wrong about interlaced being more difficult to edit than progressive. It's wrong. Ignore it.

    Originally Posted by Mabel View Post
I don't get this. As I understand it, interlacing was invented to get more fluid movement within the same bandwidth. And are we not struggling with bandwidth more than ever, now that we are moving from 'broadcast' to 'webcast' and even 'on demand'? Moreover, you say your local broadcast is 30p - that is, in frames per second, the same speed it always was. What has changed? Yes, the resolution has changed considerably, but what has that got to do with interlacing? You say "If smoother motion is needed now, a higher frame rate is used", but where is that happening? And why would 'smoother motion' NOT be needed now if it was desirable in the past?
    Bandwidth was only one of several factors -- among the others were flickering, warping and cyclical noise.

    Originally Posted by Mabel View Post
Thus, I still fail to see why interlacing is outmoded. Yet it seems to me that computers often struggle to play interlaced video. I gather this is (at least partly) because the media player has to de-interlace the video in real time before showing it. That would at least explain why media players and CPUs are struggling, but it doesn't explain why this de-interlacing is necessary in the first place.
    Mabel
Deinterlacing is a relatively small load on the CPU in the overall decompression scheme of modern codecs. That said, progressive frames compress better because there is less motion complexity.
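You can measure that yourself - a rough sketch (it assumes ffmpeg on your PATH; "clip.mts" is a placeholder) that decodes a clip with and without the yadif deinterlacer:
Code:
# Rough measurement: decode with and without deinterlacing (yadif).
import subprocess, time

def run(vf_args):
    start = time.time()
    subprocess.run(["ffmpeg", "-v", "error", "-i", "clip.mts",
                    *vf_args, "-f", "null", "-"], check=True)
    return time.time() - start

plain = run([])
deint = run(["-vf", "yadif=mode=1"])  # field-rate (double fps) deinterlacing
print(f"decode only: {plain:.1f}s  decode+yadif: {deint:.1f}s")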

    I applaud you for thinking about these things, and encourage you to learn more about them.
  23. It is 2016. A New Year is upon us, and it is probably worth contemplating the future even if that means reflecting on the past.

The only group of which I am aware that still champions interlaced over progressive content is broadcasters, and for one simple, unrelated reason: $$$$ (is there ever really any other reason?).

But we live in a native progressive world. Nothing about the biology of our eyeballs is interlaced. Modern computer monitors are native progressive. Modern TVs are native progressive. Movies aren't interlaced. Neither are YouTube videos or photographs; and while the list goes on and on, more and more content is being authored as native progressive. I see no benefit in carrying the broadcaster torch, other than, I guess, that it makes you sound intelligent because interlacing is arcana. I suspect interlacing will soon go the way of the dodo and not be remembered fondly by anyone. Many countries have already completely stopped broadcasting interlaced content.

In the meantime, I de-interlace all my content. And I mean ALL of it. SD, HD, it doesn't matter. I won't bring footage into an NLE unless I have first pushed it through an Avisynth script with QTGMC(). Long live QTGMC! While de-interlacing SD content is smooth as butter (especially multi-threaded), HD content is harder. If I ever happen upon interlaced 4K content, I will simply press the delete key.
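(For anyone curious what that looks like in script form, here is a minimal sketch of the same idea in VapourSynth, Avisynth's Python-based cousin. It assumes the ffms2 source plugin and the havsfunc script, which provides QTGMC, are installed; "clip.mts" is a placeholder, and TFF depends on your footage.)
Code:
# Minimal VapourSynth (Python) sketch of a QTGMC de-interlacing script.
import vapoursynth as vs
import havsfunc as haf

core = vs.core
clip = core.ffms2.Source("clip.mts")               # open the source
clip = haf.QTGMC(clip, Preset="Slower", TFF=True)  # TFF = top field first
clip.set_output()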
24. Member (joined Dec 2015, Oregon, USA)
    Originally Posted by SameSelf View Post
It is 2016. [...] In the meantime, I de-interlace all my content. And I mean ALL of it.
    I wholeheartedly agree.

  25. Originally Posted by SameSelf View Post
But we live in a native progressive world. Nothing about the biology of our eyeballs is interlaced.
If your claim were correct, interlacing would not be usable. From the perspective of biology (human psychovisual perception), interlacing is a clever way to reduce signal bandwidth by half while still providing a good compromise between temporal and spatial resolution. It was the very first video signal compression.

Remember that the foundations of the video signal were laid before electronic computers existed (even relay computers came a decade later than video); the most advanced semiconductor device at the beginning of the video era was a natural semiconductor diode based on galena crystals (with a cat's-whisker junction, https://en.wikipedia.org/wiki/Cat%27s-whisker_detector ). Be amazed at how big a discovery interlacing is, that it has been able to survive almost 100 years. This is no accident but the outcome of real life: interlacing is simply very well matched to the capabilities of our eyeballs and visual cortex...
  26. Originally Posted by pandy View Post
If your claim were correct, interlacing would not be usable. From the perspective of biology (human psychovisual perception), interlacing is a clever way to reduce signal bandwidth by half while still providing a good compromise between temporal and spatial resolution. It was the very first video signal compression.

Remember that the foundations of the video signal were laid before electronic computers existed (even relay computers came a decade later than video); the most advanced semiconductor device at the beginning of the video era was a natural semiconductor diode based on galena crystals (with a cat's-whisker junction, https://en.wikipedia.org/wiki/Cat%27s-whisker_detector ). Be amazed at how big a discovery interlacing is, that it has been able to survive almost 100 years. This is no accident but the outcome of real life: interlacing is simply very well matched to the capabilities of our eyeballs and visual cortex...
What does that have to do with the price of tea in China? I will concede that interlacing is a "clever way to reduce bandwidth." So are 4:2:0 vs 4:4:4, 320x240 vs UHD, and 1.5 Mbps vs lossless. But these examples just reinforce how human perception is limited and therefore easily fooled. They say nothing about how it functions.

    Regarding technology precedence and cats, I am going to go out on a limb and state, in reference to the chicken or the egg, that progressive came first.
  27. Originally Posted by SameSelf View Post
    What does that have to do with the price of tea in China?
I can address this question if you can show me the direct link between interlacing, tea, and China. Otherwise I would consider this a reductio ad absurdum: https://en.wikipedia.org/wiki/Reductio_ad_absurdum .

    Originally Posted by SameSelf View Post
I will concede that interlacing is a "clever way to reduce bandwidth." So are 4:2:0 vs 4:4:4, 320x240 vs UHD, and 1.5 Mbps vs lossless. But these examples just reinforce how human perception is limited and therefore easily fooled. They say nothing about how it functions.

    Regarding technology precedence and cats, I am going to go out on a limb and state, in reference to the chicken or the egg, that progressive came first.
Once again you are confused: 4:2:0 and 4:4:4 relate to discrete (usually digital) systems. In the analogue world, bandwidth reduction is achieved by filtering, a process applied to the luminance and chrominance signals; the mentioned 4:n:n schemes are the direct outcome of this.
I don't feel that discussing the process of chicken-and-egg production will lead us to interlacing.
Once again, my point is: interlacing is very well matched to the human visual perception system, and it can be considered a lossy compression. I never said that interlacing is better than progressive.


