VideoHelp Forum




Page 2 of 3, Results 31 to 60 of 72
  1. Member (Memphis TN, US; joined May 2014)
    Originally Posted by newpball
    Really, so you would advise everybody to watch their interlaced material on CRT devices?
    Sure. I still have one. And newer stuff designed to handle it the right way. Why not? Aren't your players and TVs designed for it?
    - My sister Ann's brother
  2. Originally Posted by newpball
    Originally Posted by Mr Chris
    Those "interlacing effects" you mention are not visible if you view the vids on a TV - go ahead and plug your DV camcorder into yours and check if you like - so deinterlacing to fix those "lines" could be seen as attempting to mend a problem that simply isn't there in the first place.
    Really, interlaced video has no problems? And just because those problems are not (readily) visible on some archaic viewing screen does not mean they don't exist.
    You're muddying the waters unnecessarily. The OP is talking about material shot on a DV camcorder, and how it plays back at apparently half the frame rate and with interlacing lines on a computer monitor. I think this is due to either no deinterlacing during playback, or the wrong type - and I stand by my subsequent advice.

    (We all know interlaced video has its foibles, but in this case I'm convinced it's not an issue.)
  3. Member (Memphis TN, US; joined May 2014)
    I would advise them to watch their interlaced videos on whatever they have that can handle different video formats properly. You don't need a CRT for interlaced video, and offhand I don't even know where one could buy a good one.
    - My sister Ann's brother
  4. Banned (Northern California; joined Oct 2014)
    Originally Posted by LMotlow
    Originally Posted by newpball
    Really, so you would advise everybody to watch their interlaced material on CRT devices?
    Sure.
    [Attachment: vids.jpg]
  5. Member (Memphis TN, US; joined May 2014)
    I answered the question twice, but you didn't answer mine. You just repeated yours like you didn't understand what I said. That's a funny image. Is that you? I'm betting you keep an old browned-out CRT around just for interlaced videos.
    - My sister Ann's brother
  6. Banned (Northern California; joined Oct 2014)
    Originally Posted by LMotlow
    I answered the question twice, but you didn't answer mine. You just repeated yours like you didn't understand what I said. That's a funny image. Is that you?
    Well, we seem to have a fundamental difference of opinion here. I think that if we deliver video to a modern device we should deinterlace it before we encode it; you seem to be of the opinion that it is the device's problem. Almost all video delivered through the internet is deinterlaced before being encoded. Would you like to take a step back and change that?

    It looks like we missed the chance again with H.265. That standard should simply have disallowed interlaced video, which would have put an end to this dreadful situation.
  7. Banned (Northern California; joined Oct 2014)
    Originally Posted by LMotlow
    I'm betting you keep an old browned-out CRT around just for interlaced videos.
    Oh yes, I still have one of these and it is still working: [image]
  8. Keep in mind that, though some are better than others, there is no perfect deinterlacer, and there never will be. If you deinterlace your video before encoding, you will be locking in the quality of the deinterlacer you use now. TVs (and other playback devices) will continue to improve on the quality of their deinterlacing. So some day your TV may do better than you can do now in software.

    Also, if your playback device doesn't support 50 or 60 fps playback, so you deinterlace to 25 or 30 fps, you will lose half of the motion smoothness. That's pretty bad for shaky, handheld camcorder video.
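    The halved motion smoothness jagabo describes comes down to simple field arithmetic; a toy sketch (illustrative numbers only, not tied to any particular tool):

```python
# Toy illustration of single-rate vs double-rate deinterlacing.
# An NTSC-style interlaced source carries 60 fields per second; PAL carries 50.
FIELDS_PER_SECOND = 60

# A single-rate deinterlacer outputs one frame per field *pair*,
# merging or discarding half of the temporal samples.
single_rate_fps = FIELDS_PER_SECOND // 2

# A double-rate ("bob") deinterlacer outputs one frame per field,
# preserving the full motion sampling of the source.
double_rate_fps = FIELDS_PER_SECOND

print(single_rate_fps, double_rate_fps)  # 30 60
```

    Swap in 50 fields per second for PAL sources to get the 25 fps vs 50 fps case from the post.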
  9. Member (Memphis TN, US; joined May 2014)
    Originally Posted by newpball
    Well, we seem to have a fundamental difference of opinion here. I think that if we deliver video to a modern device we should deinterlace it before we encode it; you seem to be of the opinion that it is the device's problem.
    A good player or TV can deinterlace better than you can, even with QTGMC. Maybe you should trade in your bottom-of-the-line Dynex and no-name BD player and get something that does it right. It would save you a lot of trouble. I'm not about to go through all that with the 4000-plus movies and TV shows in my collection, all of which display very nicely on my plasma, my LCD, my old CRT in the bedroom, and my HTPC.

    Originally Posted by newpball
    Almost all video delivered through the internet is deinterlaced before being encoded. Would you like to take a step back and change that?
    Yeah, I know. Nope. Most of it is low-bitrate or poorly processed crap that isn't worth a dime. I buy good retail copies of stuff I like and don't like paying for it more than once. I'm on cable and record a lot of it. The internet sucks for movies IMO.

    Originally Posted by newpball
    It looks like we missed the chance again with H.265. That standard should simply have disallowed interlaced video, which would have put an end to this dreadful situation.
    Maybe you need to get your movement better organized. I'm certain the future will bring improvements, as well as more general lowering of original film quality for toys like iPhones and MP3 compression. Meanwhile I'll shock the hell out of you and tell you I have 300 vinyl record albums and the equipment to play 'em the way they should be played, without all that CD contamination and digital rounding. And a premium CD player and an outboard audio DAC to make those pesky CD's behave. I think I have a decent representation of both old and new and the gear to handle it. When the future arrives, I'll be catching up like everyone else. But I won't ditch what still works. If new technology delivers an improvement, I'm all for it. If it doesn't, I won't ante up just because it's new. I'd rather do without. So far, it's all been working pretty well.
    Last edited by LMotlow; 15th Dec 2014 at 13:47.
    - My sister Ann's brother
  10. Banned (Northern California; joined Oct 2014)
    Originally Posted by jagabo
    Keep in mind that, though some are better than others, there is no perfect deinterlacer, and there never will be. If you deinterlace your video before encoding, you will be locking in the quality of the deinterlacer you use now.
    Good point; I would obviously only deinterlace for a delivery codec, not an archive.

    However, one can deinterlace losslessly.

    Originally Posted by jagabo
    TVs (and other playback devices) will continue to improve on the quality of their deinterlacing. So some day your TV may do better than you can do now in software.
    I hope not, as it would be a tremendous waste of money. Deinterlacing in the studio and delivering material deinterlaced would obviously be much more economical.
    Last edited by newpball; 15th Dec 2014 at 13:47.
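    On the "deinterlace losslessly" aside: field separation itself is reversible, which a minimal sketch can show (toy scanline strings, not real video; the naive bob here is purely illustrative and far cruder than something like QTGMC):

```python
# Hedged sketch: field separation and a naive "bob" on a frame modeled
# as a list of scanlines. The point is that separating fields loses
# nothing: weaving them back reproduces the original frame exactly.

def separate_fields(frame):
    """Split a frame into (top_field, bottom_field): even and odd scanlines."""
    return frame[0::2], frame[1::2]

def bob(field):
    """Line-double a field into a full-height frame (naive bob)."""
    return [line for line in field for _ in range(2)]

def weave(top, bottom):
    """Re-interleave two fields into the original interlaced frame."""
    frame = []
    for t, b in zip(top, bottom):
        frame.extend([t, b])
    return frame

frame = ["line0", "line1", "line2", "line3"]
top, bottom = separate_fields(frame)

assert weave(top, bottom) == frame       # round trip: nothing lost
print(bob(top))  # ['line0', 'line0', 'line2', 'line2']
```

    A real bob deinterlacer interpolates the missing lines rather than duplicating them, but the round trip above is the sense in which the operation can be lossless.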
  11. Member (Memphis TN, US; joined May 2014)
    Uh, so whatever happened to MiniDV_newb?
    - My sister Ann's brother
  12. Banned (Northern California; joined Oct 2014)
    Originally Posted by LMotlow
    Meanwhile I'll shock the hell out of you and tell you I have 300 vinyl record albums and the equipment to play 'em the way they should be played, without all that CD contamination and digital rounding. And a premium CD player and an outboard audio DAC to make those pesky CD's behave. I think I have a decent representation of both old and new and the gear to handle it.
    That is fine, but why would you be in favor of a brand new codec allowing ancient technology? Re-encoding MPEG-2 to H.265 obviously won't make things any better, so what purpose would H.265 interlaced video serve?
  13. Member (Memphis TN, US; joined May 2014)
    A brand new codec might decode "better" than the previous version -- fewer rounding errors, better interpolation, whatever. Most people don't throw away everything they own when a new product hits the shelves. Classic films and famous musical recordings will be around for a long time, and so will an audience for them. Better technology = better restorations. If you don't think Leopold Stokowski's brilliant 1945 recording of Beethoven's Third or Charlie Chaplin's City Lights are of any cultural importance for a great many people world-wide because they occurred a long time ago, you're missing out. Those events will take on greater clarity and power with better technology. So will somebody's 2001 family videos on Mini-DV.

    Who says video will always be interlaced in the future? We're talking about now, which is what we have to work with. A lot of new formats aren't interlaced.
    Last edited by LMotlow; 15th Dec 2014 at 14:12.
    - My sister Ann's brother
  14. Banned (Northern California; joined Oct 2014)
    Originally Posted by LMotlow
    Most people don't throw away everything they own when a new product hits the shelves.
    Right, so H.265 is not going to make a difference there! If a studio decides to reissue material in H.265, it is most certainly not going to work on crummy old CRT devices. Or perhaps we should put the cart completely before the horse and mandate that H.265-to-RF converters be made available so that, right, we can still watch this on crummy old CRT devices?

    Originally Posted by LMotlow
    Classic films and famous musical recordings will be around for a long time, and so will an audience for them. Better technology = better restorations.
    I see, so you would like to deliver this in interlaced H.265 format in the future? No? Then why insist that H.265 support interlaced video?

    Originally Posted by LMotlow
    If you don't think Leopold Stokowski's brilliant 1945 recording of Beethoven's Third or Charlie Chaplin's City Lights are of any cultural importance for a great many people world-wide because they occurred a long time ago, you're missing out. Those events will take on greater clarity and power with better technology. So will somebody's 2001 family videos on Mini-DV.
    Fully agreed, they will, but not with interlaced technology.
  15. Member Cornucopia (Deep in the Heart of Texas; joined Oct 2001)
    I disagree with your idea that deinterlacing in the studio would be more economical: time is money. So if it takes extra time to deinterlace before encoding, it costs more money. Who's going to pay for that?

    Also, there are literally tens of thousands of programs from the five preceding SD-based decades, and one and a half decades of HD up to the very present, which contain interlaced-sourced material (not talking about progressive, film-sourced stuff). Don't they get a shot at being stored in an "efficient" codec, too, without having to be fully reworked? It's about respecting history & honoring legacy material.
    But if you are throwing out the old in such a cavalier way and enforcing the latest & greatest, why stop there? Shouldn't ALL footage be mandated to be converted to HFR, HDR, Rec2020 (or ACES), 32bit float per color channel, 4D?
    If you recognize that as an extreme exaggeration, it was meant to prove a point: where do you draw the line? And who is drawing it? And why is that line the one chosen? Who has the right to choose for others?
    These are the kinds of decisions of import that require slower, careful steps, not racehorse starts.

    Nobody's stopping YOU from deinterlacing, nor from using HEVC! The rest of us don't all need to be in lock-step with that yet.

    Scott
  16. Member (Memphis TN, US; joined May 2014)
    Not only are you reading meanings that aren't in what I posted, but you're beating a dead horse. I'm sorry to hear that H.265 still supports interlaced video. My old CRT will be used in the bedroom until it dies; it's a nice little TV with beautiful color, and I spent a lot of time in its service menu calibrating with a colorimeter. If H.265 won't work on it, so be it, and some day the critter will bite the dust and nothing will work on it. I'll move on, like everyone else. Unlike many techno freaks nowadays, I pay attention to the content and the artistry, as long as it renders properly. I don't "watch" hardware, I watch what it's playing. New tech has already hampered the film experience in movie theaters with that dried-up digital conversion crap, so I have to take in film quality when I visit the MOMA in New York -- and they're going digital as well, crippling the original visual dynamics of so many old movies.

    I guess you'll have to live with whatever the engineers decide about H.265, and so will the rest of us. You can spend your time deinterlacing and recompressing videos, which IMO does more damage than playing the original properly with the right equipment. That's up to you. I prefer, whenever I can, to keep originals intact unless they're just poor work to begin with and need some fixup.

    Have you seen anyone about this thing you have with interlaced video? Seriously, not trying to be funny. If you really want to get involved and feel that strongly about it, you should be studying the engineering behind it in depth, not just sitting and watching it.
    - My sister Ann's brother
  17. Member Cornucopia (Deep in the Heart of Texas; joined Oct 2001)
    He's got a thing against MPEG2 and engineers also, it seems.

    Scott
  18. Member (Memphis TN, US; joined May 2014)
    I can understand that, even if I don't agree with it (not entirely). Have a young nephew who won't watch old movies because they're in black and white and aren't wide screen. He has no other reason, really. I'm trying to imagine Citizen Kane in 16:9 Technicolor, which makes me wait for the singing and dancing to start, or maybe a car chase. Just doesn't work for me. But that's the way some people are.
    - My sister Ann's brother
  19. Banned (Northern California; joined Oct 2014)
    Originally Posted by Cornucopia
    Also, there are literally tens of thousands of programs from the five preceding SD-based decades, and one and a half decades of HD up to the very present, which contain interlaced-sourced material (not talking about progressive, film-sourced stuff). Don't they get a shot at being stored in an "efficient" codec, too, without having to be fully reworked? It's about respecting history & honoring legacy material.
    Could you explain to me how one 'respects history and honors legacy' by using lossy encoding on top of footage which was already lossy encoded?
  20. Member (Memphis TN, US; joined May 2014)
    What are you talking about? Maybe we should decode them to lossless media and ship or broadcast them that way? You could re-engineer the bandwidth requirements for doing that and the hardware to work it out.

    Isn't a lot of that stuff on pro-grade analog tape, never encoded?
    - My sister Ann's brother
  21. Member Cornucopia (Deep in the Heart of Texas; joined Oct 2001)
    Originally Posted by newpball
    Originally Posted by Cornucopia
    Also, there are literally tens of thousands of programs from the five preceding SD-based decades, and one and a half decades of HD up to the very present, which contain interlaced-sourced material (not talking about progressive, film-sourced stuff). Don't they get a shot at being stored in an "efficient" codec, too, without having to be fully reworked? It's about respecting history & honoring legacy material.
    Could you explain to me how one 'respects history and honors legacy' by using lossy encoding on top of footage which was already lossy encoded?
    You are mistaken: much of that was not "lossy encoded". Lossy digital encoding didn't really appear in video masters until the 90's. Most of what came before is either ANALOG (2" Quad, 1"B, 1"C, 3/4" Umatic, M/MII, Betacam/SP, etc.), or D1 or D2. ALL of that is interlaced.

    Scott
  22. Banned (Northern California; joined Oct 2014)
    Originally Posted by LMotlow
    What are you talking about? Maybe we should decode them to lossless media and ship or broadcast them that way?
    Apart from the fact that that does not make any sense, you're saying it makes sense to decode and re-encode as long as we do not, heaven forbid, deinterlace, because that would be disrespecting the sources? Am I the only one who sees holes in this logic? So re-encoding delivery codecs is just fine, but deinterlacing is one of the seven deadly sins?

    Perhaps purists should not deinterlace at all and watch interlacing artifacts on their progressive-scan devices; you can't get any closer to the original than that! Respect +1

    Last edited by newpball; 15th Dec 2014 at 15:26.
  23. Member (Memphis TN, US; joined May 2014)
    That's what I thought. Likely the broadcast digital masters are of much higher quality than consumer material.
    - My sister Ann's brother
  24. Banned (Northern California; joined Oct 2014)
    Originally Posted by Cornucopia
    You are mistaken: much of that was not "lossy encoded". Lossy digital encoding didn't really appear in video masters until the 90's. Most of what came before is either ANALOG (2" Quad, 1"B, 1"C, 3/4" Umatic, M/MII, Betacam/SP, etc.), or D1 or D2. ALL of that is interlaced.
    That's fine, so let me get this straight: if it is delivered in MPEG-2 or H.265 interlaced, it is alright and respects the heritage, yada yada, but, heaven forbid, if it is delivered deinterlaced, that is a clear sign of disrespect? Is that the gist?
  25. Member Cornucopia (Deep in the Heart of Texas; joined Oct 2001)
    Depends on what else is done (or not) to it, but yeah, there's nothing inherently good or bad about MPEG2 or AVC or H265. Lots depends on the profile/level, subsampling, and on the bitrate. Just look at MPEG4 SStP SR Masters, considered by many to be near (if not AT) the top of the line of quality in compressed video, or possibly ALL video. And it supports interlaced.

    Scott

    ...I seriously don't get what your gripe is with interlacing and with lossy compression. Sure, all things being equal, progressive is better, but very often all things are not equal. Same with lossy compression.
    And there's lossy compression and then there's LOSSY compression. It's a spectrum. At the top end of that spectrum, it is basically equivalent to lossless. At the bottom end, nothing you do is going to make it look good, deinterlacing or not.

    But I see where you are going with the "yada, yada". That is very telling indeed.
    Last edited by Cornucopia; 15th Dec 2014 at 15:30.
  26. Banned (Northern California; joined Oct 2014)
    Originally Posted by Cornucopia
    ...I seriously don't get what your gripe is with interlacing and with lossy compression.
    My gripe with interlaced video is that it causes tons of problems. Even on this forum, about half of the reported questions/problems are about interlaced video.

    I do not have an issue with lossy compression at all; I'm not sure why you conclude that. Obviously, to deliver video you have to consider bandwidth and/or storage limitations and thus find a suitable compression format for delivery.

    However, for archiving purposes I think it is ludicrous to use lossy compression; there is absolutely no need. The price per GB on hard drives is now ridiculously low. People go to great lengths, presumably to preserve and improve those long-lost moving images of their ancestors, then capture them in some crappy delivery codec and complain that the video takes more space than their Facebook selfies. It is just unfathomable to me.
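    The storage argument is easy to sanity-check with back-of-envelope arithmetic (this assumes DV's roughly 25 Mbit/s video rate, a figure not stated in the thread):

```python
# Back-of-envelope check of the "disk space is cheap" argument.
# Assumption: DV video runs at roughly 25 Mbit/s.
DV_MBIT_PER_SEC = 25

def gigabytes_per_hour(mbit_per_sec):
    """Storage needed for one hour of video at a given bitrate."""
    bits = mbit_per_sec * 1_000_000 * 3600
    return bits / 8 / 1_000_000_000  # decimal gigabytes

gb_hour = gigabytes_per_hour(DV_MBIT_PER_SEC)
print(round(gb_hour, 2))  # 11.25 -- one hour of native DV is only ~11 GB
```

    At that rate, even a large tape collection fits comfortably on a single modern hard drive without any lossy re-encode.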
  27. Originally Posted by LMotlow
    A good player or TV can deinterlace better than you can
    That's not true. There's no magic in TVs. Everything they do is either done in software or with hardware whose algorithms were first developed in software. I haven't seen anything on a TV as good as QTGMC (in general; obviously special cases exist). TVs don't have the luxury of spending a second deinterlacing each frame. They have to display 60 frames per second and can only perform minimal processing.
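    The real-time constraint here works out to a hard per-frame budget (the one-second-per-frame software figure below is hypothetical, echoing the post):

```python
# A TV deinterlacing live 60 fps video has a fixed per-frame time budget.
DISPLAY_FPS = 60
tv_budget_ms = 1000 / DISPLAY_FPS   # ~16.7 ms per frame, every frame

# A software deinterlacer taking 1 s per frame (hypothetical figure)
# gets roughly 60x more processing time per frame than the TV does.
software_ms = 1000.0
ratio = software_ms / tv_budget_ms

print(round(tv_budget_ms, 1), round(ratio))  # 16.7 60
```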
  28. Banned (Northern California; joined Oct 2014)
    Originally Posted by jagabo
    There's no magic in TVs.
    Yes, it all went away when the hue knob was removed.

  29. Member Cornucopia (Deep in the Heart of Texas; joined Oct 2001)
    Originally Posted by newpball
    Originally Posted by Cornucopia
    ...I seriously don't get what your gripe is with interlacing and with lossy compression.
    My gripe with interlaced video is that it causes tons of problems. Even on this forum, about half of the reported questions/problems are about interlaced video.

    I do not have an issue with lossy compression at all; I'm not sure why you conclude that. Obviously, to deliver video you have to consider bandwidth and/or storage limitations and thus find a suitable compression format for delivery.

    However, for archiving purposes I think it is ludicrous to use lossy compression; there is absolutely no need. The price per GB on hard drives is now ridiculously low. People go to great lengths, presumably to preserve and improve those long-lost moving images of their ancestors, then capture them in some crappy delivery codec and complain that the video takes more space than their Facebook selfies. It is just unfathomable to me.
    It only gives a ton of problems to people who don't understand or respect it. Very often, the best course of action WRT interlaced footage is to let it stay interlaced throughout, and let the player/display do the deint (if even necessary). Yet people are intent upon disregarding that axiom and plow through with editing, processing & compression without regard to its existence or maintenance. THAT'S when they have problems. But people also often come to this site because of goofy things they have done to their framerates (VFR) or to their subsampling, or via their re-re-recompressions. Precisely because they don't know better and aren't respecting the rules of video (and data, and engineering/physics).

    Archiving is probably another tripartite-like constraint (much like Fast-Cheap-Good): There's [Quality]-[Longevity]-[Economy/Ease] (remembering that time/effort=money) and people choose their varying priorities. Yours may differ from some others'.

    Scott
    Last edited by Cornucopia; 21st Dec 2014 at 00:26.
  30. Member (Memphis TN, US; joined May 2014)
    Originally Posted by jagabo
    Originally Posted by LMotlow
    A good player or TV can deinterlace better than you can
    That's not true. There's no magic in TVs. Everything they do is either done in software or with hardware whose algorithms were first developed in software. I haven't seen anything on a TV as good as QTGMC (in general; obviously special cases exist). TVs don't have the luxury of spending a second deinterlacing each frame. They have to display 60 frames per second and can only perform minimal processing.
    No, I'm not saying they're magic, and some are clearly not better at it than others (I had one of the "not better" ones and got rid of it, back to the store lickety-split). It would take magic, at least, to clean up some of the dirty low-bitrate junk through cable that QTGMC can often deal with hands down. Yep, the TV does a quick pass. But the TV or player doesn't do a lossy re-encode and go through the whole thing all over again. So in that respect, I mean to say that leaving well enough alone seems the better choice.

    Hmm, I wonder if some of that hardware actually uses yadif?
    - My sister Ann's brother