VideoHelp Forum




  1. Member LSchafroth (joined Dec 2002, United States)
    Originally Posted by lordsmurf:
    Deinterlacing improves with every generation. If you hard deinterlace the source, it's going to look that way forever. And unless you're running something like QTGMC, or even a proprietary method from Snell & Wilcox or Faroudja, then what you're doing is guaranteed to look like crap.
    I agree to a point. When you say it will look like crap, that is not always the case. The videos I have done look OK when left interlaced, but not that great. Once progressive, they look much, much, much better on the TV.

    But in your defense, I have always used ProCoder, so I wonder if that has been the problem all along. I am going to finish testing an interlaced file with MainConcept and see how it looks tonight. I ran out of time to test it the past few days, but I'll have the final result on true interlaced footage tonight.

    If it turns out fantastic then I will be back to eat my words on this reply. lol

    Lannie
  2. In addition to what lordsmurf said, deinterlacing as it's usually done (for DVD, for example) loses half the temporal information. What was originally a picture sampled 59.94 times per second (for NTSC) now gets sampled 29.97 times per second. The picture plays slightly more jerkily and less smoothly. A progressive TV set will bob the original interlaced video to give you 59.94 full and unique frames per second. Although a case can be made (and I've made it myself on occasion) that a software deinterlacer can do a pretty decent job of deinterlacing (and take a long time doing it), TV and DVD player deinterlacers (as lordsmurf said) are often very good and only get better with time.
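    To put rough numbers on that, here is a minimal Python/NumPy sketch of the sampling difference (my illustration, not anything from this thread): a crude line-doubling bob stands in for a real deinterlacer, which would interpolate far more cleverly.
    Code:
    import numpy as np

    FIELD_RATE = 60000 / 1001      # NTSC: 59.94 fields per second
    h, w = 480, 720                # illustrative frame size

    # One interlaced frame: top field on even lines, bottom field on odd lines,
    # and the two fields were exposed 1/59.94 s apart in time.
    frame = np.random.randint(0, 256, (h, w), dtype=np.uint8)
    top_field = frame[0::2]        # shape (240, 720), time t
    bottom_field = frame[1::2]     # shape (240, 720), time t + 1/59.94

    def bob(field):
        # Crude bob: repeat each field line to full frame height.
        return np.repeat(field, 2, axis=0)

    # Single-rate deinterlace: one progressive frame per interlaced frame,
    # so half of the motion samples are thrown away.
    single_rate = [bob(top_field)]
    # Double-rate (bob) deinterlace: one progressive frame per field,
    # which is what a progressive TV does to interlaced input.
    double_rate = [bob(top_field), bob(bottom_field)]

    print(f"single-rate: {len(single_rate)} frame per interlaced frame -> {FIELD_RATE / 2:.2f} fps")
    print(f"double-rate: {len(double_rate)} frames per interlaced frame -> {FIELD_RATE:.2f} fps")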
    If the only TVs sold today are progressive devices, doesn't that mean that anyone who uses a TV to watch - for example - captured VHS tapes (interlaced, by definition)...
    If it's NTSC and the tape is of a film, then it can (and should) be IVTC'd to return the original 23.976fps. But an IVTC is a fundamentally different proposition from a deinterlace.
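    For anyone unclear on what an IVTC actually undoes, here is a toy Python sketch of 2:3 pulldown and its reversal (my illustration only; it tracks which film frame each field came from rather than real pixels).
    Code:
    # Four film frames (23.976 fps) become five interlaced video frames (29.97 fps)
    # by repeating fields in a 2:3 pattern. IVTC matches the fields back together
    # and drops the duplicates, restoring the original four frames.
    film_frames = ["A", "B", "C", "D"]

    fields = []
    for frame, repeats in zip(film_frames, [2, 3, 2, 3]):
        fields.extend([frame] * repeats)          # A A B B B C C D D D

    video_frames = [tuple(fields[i:i + 2]) for i in range(0, len(fields), 2)]
    print("telecined:", video_frames)   # [('A','A'), ('B','B'), ('B','C'), ('C','D'), ('D','D')]

    # Inverse telecine, conceptually: work out which film frame each field belongs
    # to and keep each one once, in order.
    recovered = list(dict.fromkeys(fields))
    print("recovered:", recovered)      # ['A', 'B', 'C', 'D'] back at 23.976 fps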
    ...why does it seem to have become a taboo in some quarters that interlaced source-material must never be deinterlaced?
    'Never' is a strong word. If you're planning on uploading something to YouTube, for example, you had best make it progressive one way or another.
    Last edited by manono; 30th Oct 2012 at 13:08.
  3. Originally Posted by lordsmurf:
    Deinterlacing for TV is fixing something that wasn't broken to begin with. (And said "fix" is actually breaking it.)
    Hi

    Coming from a pro, to me that's a very reassuring assertion - because it's just what (in my bumbling way) my gut-feeling was telling me. However, it isn't what comes across from much of what one reads.

    To give another example, there's a thread on this forum (which I now can't find) where people get on their high horses about the precise meaning of "lossless" compression. Everything they say is doubtless technically correct, but what use is "losslessness" except in the unlikely event one wants to go back again to the (uncompressed) original capture rather than going forwards - and why would anyone want to do that? Assuming that the real-life aim is to convert the original, even I can see (and I have the ocular equivalent of cloth ears), a clear difference between the input pane in VirtualDub and the output pane after applying Huffyuv compression. It's of only academic interest that "Huffyuv is lossless"; so what? It subtly changes the original (for the worse) and that's all that matters.
    Last edited by Kosketus; 31st Oct 2012 at 04:31.
  4. Originally Posted by Kosketus:
    but what use is "losslessness" except in the unlikely event one wants to go back again to the (uncompressed) original capture rather than going forwards
    Very useful. If, for example, you're doing a multi-stage 'restoration' of your capture before the final conversion, you'll want to use a lossless format for those intermediate stages.
    It subtly changes the original (for the worse) and that's all that matters.
    Not if you're not converting color spaces. In that case the input and output panes should be identical. That's what 'lossless' means.
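    As a concrete illustration of that workflow, here is a hedged Python sketch that keeps every intermediate stage lossless and only goes lossy once, at the end. It shells out to ffmpeg, which is my assumption rather than a tool discussed in this thread, and the filenames, filters, and bitrates are placeholders.
    Code:
    import subprocess

    def run(args):
        # Run one ffmpeg step, stopping if it fails.
        subprocess.run(["ffmpeg", "-y", *args], check=True)

    # Stage 1: denoise the capture, writing a lossless Huffyuv intermediate.
    run(["-i", "capture.avi", "-vf", "hqdn3d",
         "-c:v", "huffyuv", "-c:a", "copy", "stage1_denoised.avi"])

    # Stage 2: crop, again to a lossless intermediate - still no generation loss.
    run(["-i", "stage1_denoised.avi", "-vf", "crop=704:480:8:0",
         "-c:v", "huffyuv", "-c:a", "copy", "stage2_cropped.avi"])

    # Final stage: the one and only lossy encode, to DVD-compliant MPEG-2.
    run(["-i", "stage2_cropped.avi", "-c:v", "mpeg2video", "-b:v", "6000k",
         "-flags", "+ildct+ilme", "-c:a", "ac3", "-b:a", "192k", "final_dvd.mpg"])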
  5. Originally Posted by Kosketus:
    To give another example, there's a thread on this forum (which I now can't find) where people get on their high horses about the precise meaning of "lossless" compression. Everything they say is doubtless technically correct, but what use is "losslessness" except in the unlikely event one wants to go back again to the (uncompressed) original capture rather than going forwards - and why would anyone want to do that? Assuming that the real-life aim is to convert the original, even I can see (and I have the ocular equivalent of cloth ears), a clear difference between the input pane in VirtualDub and the output pane after applying Huffyuv compression. It's of only academic interest that "Huffyuv is lossless"; so what? It subtly changes the original (for the worse) and that's all that matters.
    I'm sorry, but you are wrong here. HuffYUV does not change the original if it is in YUY2 form (what most capture cards capture). The difference you see in the input and output panes of VirtualDub is because the input pane is using DirectX to display the video and the output pane is using Windows GDI -- and your graphics card's video proc amp (used by DX) is set improperly. Turn off the DirectX option in VirtualDub and both will look identical (assuming you are not applying any filters).
  6. Thanks for setting me straight, too. I hardly ever use the output window for anything and probably shouldn't have addressed that part of the question.
  7. Member LSchafroth (joined Dec 2002, United States)
    The verdict is in. Original interlaced footage looks terrible even with MC encoding. Deinterlaced via yadif and converted with MC as progressive looks fantastic.

    Lannie
  8. Originally Posted by LSchafroth:
    The verdict is in. Original interlaced footage looks terrible even with MC encoding. Deinterlaced via yadif and converted with MC as progressive looks fantastic.
    That's because your player or TV has poor deinterlacing (or maybe you made the video wrong). What about the next player or TV you get? It may do better than Yadif. Then you'll be stuck with inferior Yadif deinterlacing.
  9. Member LSchafroth (joined Dec 2002, United States)
    I work at a school and had access to 4 different TVs and 3 different players, not counting mine at home. All displayed exactly the same way. I made it exactly as shown and advised, with the settings given to me. Tried a different field order just in case; no difference.
    Progressive looks fantastic, so that will be the method I choose. I always keep a master of their interlaced footage as a backup if ever needed.

    I've been told by many people, "oh, it must be your TV or your player." Yet I've tried all their suggestions on settings, conversions, and different programs, and they all produce the same look: bad. When deinterlaced, it looks great. Smooth video.


    Lannie
  10. Member LSchafroth (joined Dec 2002, United States)
    Thanks for everyone's help with MC. I really like it. The settings you gave me were very close to the default settings the DVD template was set to.

    Lannie
  11. Originally Posted by jagabo:
    HuffYUV does not change the original if it is in YUY2 form (what most capture cards capture). The difference you see in the input and output panes of VirtualDub is because the input pane is using DirectX to display the video and the output pane is using Windows GDI -- and your graphics card's video proc amp (used by DX) is set improperly. Turn off the DirectX option in VirtualDub and both will look identical (assuming you are not applying any filters).
    Hi jagabo

    Many thanks - that's most useful to know. It's been really bothering me, and moreover the explanation is one I'd never have sussed out for myself in a million years! My capture card does - I believe - capture in YUY2 (it has a bt878 chip, and I'm using the btWin driver).

    I will next try the experiment you advise (not applying any filters, which I wasn't doing anyway to begin with), and shall only comment further if the result is other than you predict.
    Last edited by Kosketus; 5th Nov 2012 at 07:22.
  12. I have tested HuffYUV many times, using software which could detect a single-bit difference between two videos. There has never been any difference after encoding; i.e., the YUY2 that comes out of HuffYUV is exactly the same as the YUY2 that went in. All my tests were run with HuffYUV v2.1.1, 32-bit.

    There are several versions of Huffyuv around with slightly different algorithms. So there's always the possibility that using one version to encode and another to decode will give differing results. But in my experience that usually leads to grossly distorted video or crash-and-die problems. And there's always the possibility that there's a bug in any particular build of Huffyuv.
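    For anyone who wants to repeat that kind of single-bit check, here is a minimal Python sketch using ffmpeg's per-frame MD5 output - an assumption on my part, not the software used for the tests described above, and the filenames are placeholders. It round-trips a clip through Huffyuv, decodes both versions, and compares per-frame checksums.
    Code:
    import subprocess

    def frame_md5s(path):
        # Per-frame MD5 of the decoded video via ffmpeg's framemd5 muxer.
        # Both files are decoded to the same planar layout (yuv422p) so that a
        # packed-vs-planar difference alone can't make the hashes disagree.
        out = subprocess.run(
            ["ffmpeg", "-v", "error", "-i", path, "-an",
             "-pix_fmt", "yuv422p", "-f", "framemd5", "-"],
            capture_output=True, text=True, check=True)
        return [line.rsplit(",", 1)[-1].strip()
                for line in out.stdout.splitlines() if not line.startswith("#")]

    # Round-trip the source through Huffyuv, then compare frame by frame.
    subprocess.run(["ffmpeg", "-y", "-v", "error", "-i", "source_yuy2.avi",
                    "-c:v", "huffyuv", "-an", "huffyuv_copy.avi"], check=True)

    if frame_md5s("source_yuy2.avi") == frame_md5s("huffyuv_copy.avi"):
        print("Bit-identical: every decoded frame matches the source exactly.")
    else:
        print("Frames differ - look for an unintended colorspace or levels conversion.")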
  13. Member 2Bdecided (joined Nov 2007, United Kingdom)
    Look: if your source really is interlaced video footage, and the deinterlaced version on DVD looks fantastically better than the interlaced version, something is wrong.

    If the deinterlaced version also looks "smoother" (or even as smooth) in terms of movement, then your TV is interpolating frames to give smooth movement. Deinterlaced video on DVD is not smooth. That same deinterlaced version will stutter like mad on any TV that shows it as it really is - i.e. any TV that doesn't have frame interpolation (or has frame interpolation turned off; most movie buffs switch it off to stop movies looking like soap operas).

    The only "advantage" I've noticed is that progressive (deinterlaced) content can be encoded with fewer MPEG artefacts than interlaced content, everything else being equal. But the deinterlaced result is stuttery/blurry compared with the interlaced original, so the lack of MPEG artefacts isn't a great advantage.

    Some people like the stuttery look, because they associate it with movies. I've never found this convincing with video sources (it always looks like crap to me!) - and anyway, it clearly looks different from the original, which isn't usually wanted.


    If your source isn't really interlaced, then deinterlacing it is inappropriate, but it might not hurt it; e.g., it might be doing a mediocre IVTC job that makes the output look OK, or it might be doing very little at all.

    Cheers,
    David.
  14. @LSchafroth I don't understand how your deinterlaced video can look better than what your TV does. The reason is that the best deinterlacing methods double the frame rate, but those frame rates are not supported on DVD. They're not even supported on Blu-ray. The only way to take advantage of this method with 576i25 (or its NTSC equivalent) footage is to upscale, so you end up with valid 720p50 Blu-ray-spec video.

    Now consider that your TV already does what I just explained when playing interlaced video, in which case I'll echo 2Bdecided: something is wrong. The quality does vary depending on the TV, but unless you're uploading to the web, I'd say keep your footage interlaced. If you're playing it on a computer, you can use VLC, which has good realtime deinterlacing options.
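    For the record, here is what that 576i25-to-720p50 route could look like as a hedged Python sketch, again shelling out to ffmpeg (my assumption; it isn't mentioned in this thread) with placeholder filenames and bitrate - not Blu-ray authoring advice.
    Code:
    import subprocess

    # 576i25 source -> 720p50: bob-deinterlace to 50 progressive fps, then upscale.
    subprocess.run([
        "ffmpeg", "-y", "-i", "capture_576i25.avi",
        # yadif=1 ("send_field"): one progressive frame per field, so 25i becomes 50p.
        "-vf", "yadif=1,scale=1280:720:flags=lanczos,setsar=1",
        "-c:v", "libx264", "-preset", "slow", "-b:v", "15M",
        "-c:a", "copy",
        "out_720p50.mkv",
    ], check=True)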
  15. sanlyn (Banned; joined Oct 2004, New York, US)
    Originally Posted by LSchafroth:
    The verdict is in. Original interlaced footage looks terrible even with MC encoding. Deinterlaced via yadif and converted with MC as progressive looks fantastic.

    Lannie
    Wrong. You need a better TV. Even a good a/v receiver with AnchorBay processing would look better.
    Last edited by sanlyn; 24th Mar 2014 at 12:32.
  16. Originally Posted by sanlyn:
    Wrong. You need a better TV.
    My guess is he just doesn't know what he's doing. Hard to tell without samples and settings and the like, though.
  17. sanlyn (Banned; joined Oct 2004, New York, US)
    True. I'd wager the O.P. could have a video that has interlace problems of some sort to begin with. With no sample, we'll never know.
    Last edited by sanlyn; 24th Mar 2014 at 12:32.
  18. Member LSchafroth (joined Dec 2002, United States)
    There is no point in posting samples when it happens with every piece of interlaced footage I've ever tried. Different cameras, different capture devices, different computers - all of the above produce the same ugly result. Deinterlaced footage is terrific. Answers like "get a better TV" don't help, so why bother? I have the results I want with MC and progressive video.
  19. Originally Posted by LSchafroth:
    I have the results I want with MC and progressive video.
    Yes, results that are inferior to what you could have. Samples, in spite of what you think, would be very useful - samples from the source and from your final DVD. We may be able to offer advice as to what you're doing wrong, and you are doing something wrong if interlaced DVDs look so bad on your television. Have you even said what looks wrong about your interlaced encoded footage? (I don't feel like going back and rereading the whole thread.)


