You're muddying the waters unnecessarily. The OP is talking about material shot on a DV camcorder, and how it plays back at apparently half the frame rate and with interlacing lines on a computer monitor. I think this is due to either no deinterlacing during playback, or the wrong type - and I stand by my subsequent advice.
(We all know interlaced video has its foibles, but in this case I'm convinced it's not an issue.) -
I would advise them to watch their interlaced videos on whatever they have that can handle different video formats properly. You don't need a CRT for interlaced video, and offhand I don't even know where one could buy a good one.
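For the OP's symptom specifically, just switching on a decent deinterlacer in the software player usually fixes it. A minimal sketch with mpv and VLC (reasonably recent builds assumed; the filename is a placeholder):
Code:
# mpv: force deinterlacing (uses yadif by default)
mpv --deinterlace=yes capture.dv

# VLC: force double-rate yadif from the command line
vlc --deinterlace=1 --deinterlace-mode=yadif2x capture.dv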
- My sister Ann's brother -
I answered the question twice, but you didn't answer mine. You just repeated yours like you didn't understand what I said. That's a funny image. Is that you? I'm betting you keep an old browned-out CRT around just for interlaced videos.
- My sister Ann's brother -
Well, we seem to have a fundamental difference of opinion here. I think that if we deliver video to a modern device we should deinterlace it before we encode it; you seem to be of the opinion that it is the device's problem. Almost all video delivered through the internet is deinterlaced before being encoded. Would you like to take a step back and change that?
It looks like we missed the chance again with H.265. That standard should simply have disallowed interlaced video; that would have put an end to this dreadful situation. -
-
Keep in mind that, though some are better than others, there is no perfect deinterlacer, and there never will be. If you deinterlace your video before encoding, you will be locking in the quality of the deinterlacer you use now. TVs (and other playback devices) will continue to improve the quality of their deinterlacing, so some day your TV may do better than you can do now in software.
Also, if your playback device doesn't support 50 or 60 fps playback and you therefore deinterlace to 25 or 30 fps, you will lose half of the motion smoothness. That's pretty bad for shaky, handheld camcorder video. -
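If you do decide to deinterlace before encoding, at least keep the full field rate by using a double-rate (bob-style) deinterlacer, so 50i becomes 50p rather than 25p. A rough ffmpeg sketch, with filenames assumed:
Code:
# yadif=1 outputs one frame per field: 50i in, 50p out
ffmpeg -i capture.dv -vf yadif=1 -c:v libx264 -crf 18 -c:a copy out_50p.mkv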
A good player or TV can deinterlace better than you can, even with QTGMC. Maybe you should trade in your bottom-of-the-line Dynex and no-name BD player and get something that does it right. It would save you a lot of trouble. I'm not about to go through all that with the 4000-plus movies and TV shows in my collection, all of which display very nicely on my plasma, my LCD, my old CRT in the bedroom, and my HTPC.
Yeah, I know. Nope. Most of it is low-bitrate or poorly processed crap that isn't worth a dime. I buy good retail copies of stuff I like and don't like paying for it more than once. I'm on cable and record a lot of it. The internet sucks for movies IMO.
Maybe you need to get your movement better organized. I'm certain the future will bring improvements, as well as a more general lowering of original quality for toys like iPhones and MP3 compression. Meanwhile I'll shock the hell out of you and tell you I have 300 vinyl record albums and the equipment to play 'em the way they should be played, without all that CD contamination and digital rounding. And a premium CD player and an outboard audio DAC to make those pesky CDs behave. I think I have a decent representation of both old and new and the gear to handle it. When the future arrives, I'll be catching up like everyone else. But I won't ditch what still works. If new technology delivers an improvement, I'm all for it. If it doesn't, I won't ante up just because it's new. I'd rather do without. So far, it's all been working pretty well.
Last edited by LMotlow; 15th Dec 2014 at 13:47.
- My sister Ann's brother -
Good point; I would obviously only deinterlace for a delivery codec, not an archive.
However, one can deinterlace losslessly.
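A hedged example of that idea with ffmpeg, using FFV1 as the lossless codec (filenames and field rate are assumptions):
Code:
# double-rate yadif, then lossless FFV1 so nothing further is thrown away
ffmpeg -i tape.dv -vf yadif=1 -c:v ffv1 -level 3 -c:a copy tape_deint_ffv1.mkv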
I hope not, as it would be a tremendous waste of money. Deinterlacing in the studio and delivering material deinterlaced would obviously be much more economical.
Last edited by newpball; 15th Dec 2014 at 13:47.
-
Uh, so whatever happened to MiniDV_newb?
- My sister Ann's brother -
-
A brand new codec might decode "better" than the previous version -- fewer rounding errors, better interpolation, whatever. Most people don't throw away everything they own when a new product hits the shelves. Classic films and famous musical recordings will be around for a long time, and so will an audience for them. Better technology = better restorations. If you don't think Leopold Stokowski's brilliant 1945 recording of Beethoven's Third or Charlie Chaplin's City Lights are of any cultural importance for a great many people world-wide because they occurred a long time ago, you're missing out. Those events will take on greater clarity and power with better technology. So will somebody's 2001 family videos on Mini-DV.
Who says video will always be interlaced in the future? We're talking about now, which is what we have to work with. A lot of new formats aren't interlaced.
Last edited by LMotlow; 15th Dec 2014 at 14:12.
- My sister Ann's brother -
Right, so H.265 is not going to make a difference there! If a studio decides to reissue material in H.265, it is most certainly not going to work on crummy old CRT devices. Or perhaps we should put the cart completely before the horse and mandate that H.265-to-RF converters be made available so that, right, we can still watch this on crummy old CRT devices?
I see, so you would like to deliver this in interlaced H.265 format in the future? No? Then why insist that H.265 support interlaced video?
Fully agreed, they will, but not with interlaced technology. -
I disagree with your idea that deinterlacing in the studio would be more economical: time is money. So if it takes extra time to deinterlace before encoding, it costs more money. Who's going to pay for that?
Also, there are literally tens of thousands of programs from the five SD-based decades preceding, and a decade and a half of HD up to the very present, which contain interlaced-sourced material (not talking about progressive, film-sourced stuff). Don't they get a shot at being stored in an "efficient" codec, too, without having to be fully reworked? It's about respecting history & honoring legacy material.
But if you are throwing out the old in such a cavalier way and enforcing the latest & greatest, why stop there? Shouldn't ALL footage be mandated to be converted to HFR, HDR, Rec2020 (or ACES), 32-bit float per color channel, 4D?
If you recognize that as an extreme exaggeration, it was to prove a point: where do you draw the line? And who is drawing it? And why is that line the one chosen? Who has the right to choose for others?
These are the kinds of decisions of import that require slower, careful steps, not racehorse starts.
Nobody's stopping YOU from deinterlacing, nor from using HEVC! The rest of us don't all need to be in lock-step with that yet.
Scott -
Not only are you reading meanings that aren't in what I posted, but you're beating a dead horse. I'm sorry to hear that H.265 still supports interlaced video. My old CRT will be used in the bedroom until it dies; it's a nice little TV with beautiful color, and I spent a lot of time in its service menu calibrating with a colorimeter. If H.265 won't work on it, so be it, and some day the critter will bite the dust and nothing will work on it. I'll move on, like everyone else. Unlike many techno freaks nowadays, I pay attention to the content and the artistry, as long as it renders properly. I don't "watch" hardware, I watch what it's playing. New tech has already hampered the film experience in movie theaters with that dried-up digital conversion crap, so I have to take in film quality when I visit the MOMA in New York -- and they're going digital as well, crippling the original visual dynamics of so many old movies.
I guess you'll have to live with whatever the engineers decide about H.265, and so will the rest of us. You can spend your time deinterlacing and recompressing videos, which IMO does more damage than playing the original properly with the right equipment. That's up to you. I prefer, whenever I can, to keep originals intact unless they're just poor work to begin with and need some fixup.
Have you seen anyone about this thing you have with interlaced video? Seriously, not trying to be funny. If you really want to get involved, you should be studying the engineering behind it in depth, not just sitting and watching it, if you feel that strongly about it.
- My sister Ann's brother -
He's got a thing against MPEG2 and engineers also, it seems.
Scott -
I can understand that, even if I don't agree with it (not entirely). Have a young nephew who won't watch old movies because they're in black and white and aren't wide screen. He has no other reason, really. I'm trying to imagine Citizen Kane in 16:9 Technicolor, which makes me wait for the singing and dancing to start, or maybe a car chase. Just doesn't work for me. But that's the way some people are.
- My sister Ann's brother -
-
What are you talking about? Maybe we should decode them to lossless media and ship or broadcast them that way? You could re-engineer the bandwidth requirements for doing that, and the hardware to make it work.
Isn't a lot of that stuff on pro-grade analog tape, never encoded?
- My sister Ann's brother -
You are mistaken: much of that was not "lossy encoded". Lossy digital encoding didn't really appear in video masters until the '90s. Most of what came before is either ANALOG (2" Quad, 1" B, 1" C, 3/4" U-matic, M/MII, Betacam/SP, etc.), or D1 or D2, both uncompressed digital. ALL of it is interlaced.
Scott -
Apart from the fact that that does not make any sense: it does make sense to decode and re-encode, as long as we do not, heaven forbid, deinterlace, because that would be disrespecting the sources? Am I the only one who sees holes in this logic? So re-encoding delivery codecs is just fine, but deinterlacing is one of the seven deadly sins?
Perhaps purists should not deinterlace at all and watch interlacing artifacts on their progressive-scan devices; can't get any closer to the original than that! Respect +1
Last edited by newpball; 15th Dec 2014 at 15:26.
-
That's what I thought. Likely the broadcast digital masters are of much higher quality than consumer material.
- My sister Ann's brother -
-
Depends on what else is done (or not) to it, but yeah, there's nothing inherently good or bad about MPEG2 or AVC or H.265. A lot depends on the profile/level, subsampling, and the bitrate. Just look at MPEG4 SStP SR Masters, considered by many to be near (if not AT) the top of the line of quality in compressed video, or possibly ALL video. And it supports interlaced.
Scott
...I seriously don't get what your gripe is with interlacing and with lossy compression. Sure, all things being equal, progressive is better, but very often all things are not equal. Same with lossy compression.
And there's lossy compression and then there's LOSSY compression. It's a spectrum. At the top end of that spectrum, it is basically equivalent to lossless. At the bottom end, nothing you do is going to make it look good, deinterlacing or not.
But I see where you are going with the "yada, yada". That is very telling indeed.
Last edited by Cornucopia; 15th Dec 2014 at 15:30.
-
My gripe with interlaced video is that it causes tons of problems. Even on this forum, about half of the reported questions/problems are about interlaced videos.
I do not have an issue with lossy compression at all, not sure why you conclude that. Obviously to deliver video you have to consider bandwidth and/or storage limitations and thus find a suitable compression format for delivery.
However, for archiving purposes I think it is ludicrous to use lossy compression; there is absolutely no need. The price per GB on hard drives is now ridiculously low. People want to go to great lengths to presumably preserve and improve those long-lost moving images from their ancestors, yet capture them in some crappy delivery codec and complain that the video takes more space than their Facebook selfies. It is just unfathomable to me. -
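For what it's worth, making that lossless archive copy is a one-liner these days. A minimal sketch with ffmpeg and FFV1, leaving the interlacing untouched (filenames are placeholders):
Code:
# no filtering, no lossy step: decode once, store losslessly, keep the fields as-is
ffmpeg -i capture.avi -c:v ffv1 -level 3 -slices 4 -slicecrc 1 -c:a copy archive.mkv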
That's not true. There's no magic in TVs. Everything they do is either done in software or with hardware whose algorithms were first developed in software. I haven't seen anything on a TV as good as QTGMC (in general; obviously special cases exist). TVs don't have the luxury of spending a second deinterlacing each frame. They have to display 60 frames per second and can only perform minimal processing.
-
-
It only gives a ton of problems to people who don't understand or respect it. Very often, the best course of action WRT interlaced footage is to let it stay interlaced throughout, and let the player/display do the deint (if even necessary). Yet people are intent upon disregarding that axiom and plow through with editing, processing & compression without regard to its existence or maintenance. THAT'S when they have problems. But people also often come to this site because of goofy things they have done to their framerates (VFR) or to their subsampling, or via their re-re-recompressions. Precisely because they don't know better and aren't respecting the rules of video (and data, and engineering/physics).
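And if interlaced footage really must be re-encoded, the encoder at least needs to be told about the fields so they survive intact. A hedged x264-via-ffmpeg sketch, assuming a top-field-first source (names are examples):
Code:
# +ilme+ildct enables interlaced encoding in libx264; -top 1 flags top field first
ffmpeg -i source.mpg -c:v libx264 -flags +ilme+ildct -top 1 -crf 18 -c:a copy out_interlaced.mkv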
Archiving is probably another tripartite-like constraint (much like Fast-Cheap-Good): There's [Quality]-[Longevity]-[Economy/Ease] (remembering that time/effort=money) and people choose their varying priorities. Yours may differ from some others'.
Scott
Last edited by Cornucopia; 21st Dec 2014 at 00:26.
-
No, I'm not saying they're magic, and some are clearly not better at it than others (I had one of the "not better" ones and got rid of it, back to the store lickety-split). It would take more than magic for a TV to clean up some of the dirty low-bitrate junk coming through cable that QTGMC can often handle hands-down. Yep, the TV does a quick pass. But the TV or player doesn't do a lossy re-encode and go through the whole thing all over again. So in that respect, I mean that leaving well enough alone seems the better choice.
Hmm, I wonder if some of that hardware actually uses yadif?
- My sister Ann's brother -