Hmm. I've been going through the trouble of deinterlacing 60i home videos that I captured via DV, so that eventually I can integrate the results into a 720p60 or 1080p60 home video of sorts.
Footage straight from MiniDV tape has gone well. The deinterlacer spits out 60fps video that I can use.
Footage captured as DV but which originated on Hi8 tape has given me some problems. Specifically, when I scrub through the footage frame by frame, it is visibly bobbing up and down. In the 60fps timeline, one frame is up a bit, the next frame is down a bit, next is up, next down, etc. In full motion, it's perceptible enough to be distracting.
Now, the key difference between the footage that worked fine and the footage that's giving me problems is that the problem footage was originally on analog (Hi8) tape, even though the file itself is a standard DV file like the others. The footage on Hi8 was interlaced, just as DV is. Here's where my knowledge breaks down into guesswork. Hi8 possibly uses a "resolution" which differs from 480i. However, fields in a given frame are treated basically the same in any format: half of the frame is one field, and half is the other. Top and bottom. If the difference in resolution were really such a big concern, then these videos wouldn't look watchable in a media player, yet they certainly are watchable, with no unusual phenomena such as the bobbing effect I'm getting by deinterlacing them.
So I guess my hope is that whatever problem I'm encountering is either commonplace or easily intuitable by the knowledgeable folks in these forums, and a possible solution can be recommended. Thanks for reading.
Update: Bizarrely, if I manually adjust the video's y position up and down 1 pixel every frame (in the 60fps timeline), the bobbing effect vanishes. Very curious.
As an interim fix, can anyone recommend an After Effects "expression" script which will adjust the layer's position up and down one pixel over time? I'm a bit lost when it comes to those things, but I figure it's possible.
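Edit: something along these lines applied to the layer's Position property might be what I'm after (an untested sketch; it assumes the comp runs at 59.94 fps, so each comp frame is one field):

    f = timeToFrames(time);
    // shift odd-numbered frames down 1 px to counter the bob
    // (use [0, -(f % 2)] instead if it moves the wrong way)
    value + [0, f % 2];

If anyone spots a problem with that, let me know.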
That's why it's called bob.
The number of scanlines is of course identical. But there might be a difference in sharpness of the underlying image. Very sharp horizontal lines or edges will appear to bounce much more than fuzzy edges. Sharp edges like this will appear to bounce (or flicker) even when you watch the original Hi8 tapes on a CRT. You've probably seen some locally produced ad with bright text that flickered a lot on CRT TVs.
One solution is to use a smarter bob. AviSynth has some very good smart bob filters; Yadif is probably the best. These look at several fields and simply weave the portions where there is no motion.
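A minimal script along those lines would be something like this (just a sketch; it assumes the yadif plugin is in your AviSynth plugins folder, and "capture.avi" is a stand-in for your own DV capture):

    AviSource("capture.avi")  # your DV-AVI capture
    AssumeBFF()               # DV is bottom field first
    Yadif(mode=1)             # mode=1 bobs: one output frame per field, 59.94 fps

Because Yadif only interpolates where it detects motion, static areas stay effectively woven and shouldn't bounce the way a dumb bob does.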
Thanks for the recommendation.
As it happens, I was able to develop a workaround that forces the position of the video (in AE) to counter-bob based on where it is in the timeline. This made the bobbing effect vanish. I suppose I should be concerned that I have to do this with Hi8-sourced DV but not with MiniDV-sourced DV - it's distressingly counterintuitive - but the results work, and this was ultimately faster than waiting for any other solution to develop. Blah.
I don't understand why the MiniDV and Hi8 captures differ. How did you cap the Hi8 to DV format?
Did you identify your editing software? What is the processing chain?
I can understand a 720p target, but what do you plan to do with 1080p? Why not 1080i?
For the Hi8, I used a Sony DCR-TRV340 Digital8 camcorder, which I bought secondhand solely for the purpose of digitally archiving the ~30 Hi8 tapes my family had recorded. It also served in-between duty for the bare handful of VHS tapes I wanted to archive, as it had S-Video in and generated less noise than the more impressive Panasonic NV-GS100 MiniDV camcorder I also had access to.
After Effects 6, in this case. DV video (29.97 fps interlaced) in a 59.94 fps composition. Deinterlacer only.
Some elements of my home video project will be generated by After Effects, and can therefore be whatever resolution I desire: things like titles, the opening animation, etc. I've already completed one such animation in 1080p60. 1080i is interlaced, and is therefore potentially (I would say inevitably) subject to the sort of artifacting you get when the need ultimately arises to deinterlace it for display purposes. So, in short, the 1920x1080 resolution matches the display device that will be used to show the video, and progressive scan is essentially my idealized format. If I can get a PC to play the thing reliably without skipping frames, fine. That may involve a Linux install or something, as I have never, ever gotten any Windows box to provide a reliably stable framerate under any circumstance. Otherwise, yes, the project will have to be 1080i60, and I will use either an Xbox 360 or a PS3 to play it.
OK, uncharted territory.
I'd suggest you edit some captured Hi8 source together with MiniDV in DV 480i to verify the problem wasn't introduced in the capture*. Any modern SD or HD TV can handle 480i or 1080i. Computer playback requires realtime deinterlace, which can be problematic; 1080i source provides more PC display options.
*This also gives you a base of comparison for your upscale "improvements". In this case your opponent is the internal hardware TV display processor.
It's true that any contemporary display can handle 1080i, but it's also a fact that deinterlaced footage is never as good as the non-interlaced original. I don't have concerns over whether or not displays can handle 1080p60. If anything, two facts weigh more heavily than any thoughts over 1080p60 compliance: 1) the quality of deinterlacing varies from device to device, and 2) the process also happens to introduce video latency.
But Hi8 and most MiniDV source was never progressive to start. Also, most of it is shaky hand-held material that is difficult to deinterlace. I do agree that "bobbing" the fields to frames is the best way to go about it. Once bobbed, upscale is easier to do, but flicker errors often result. A good HDTV will handle the upscale from 480i to 1080p in hardware without flicker. Usually a good hardware deinterlacer will weave stationary backgrounds while bobbing objects in motion.
You might want to re-read your last post - it doesn't actually make sense. a) Deinterlaced footage and non-interlaced footage are the same, and b) your source is interlaced. Finally, while deinterlacers may vary from TV to TV, they are invariably - unless you have the cheapest no-name monitor dressed as a TV - better than any software method. I have yet to see any evidence of so-called video latency from any reputable equipment, given interlaced footage has been with us since the dawn of television.
Sorry to differ, but I can't agree that if you take a 1080p render, convert it to 1080i, and then pass that through a deinterlacer, the result will be 100% identical to what you started out with. Were this the case, there would never have been the need for complicated per-pixel motion-adaptive deinterlacing, which even then is still quite imperfect.
You'll have to trust me on this one. There are software solutions which can be better. True, if you have, say, a Gennum VXP somewhere in the mix, you can probably rest easy. But I'm going to speculate that most people are dealing with a Realta or worse.
I got the Algolith guys (makers of the best deinterlacing hardware currently available) to reveal that their hardware (specifically the HDMI Flea) introduces roughly one-third of a frame of latency, or about 11 ms. And theirs are also among the fastest, despite being the best. It's there, and it's decidedly absent when the footage is progressive. Algolith's products are reputable.
I don't understand your issue with latency. So what if the video is delayed a third to a full frame on the playback display, so long as the audio is also delayed a similar amount? Latency through cable boxes is extreme compared to directly tuned analog stations, amounting to seconds. The only issue is audio echo if you can hear both.
Software encoding has even greater latency when encoding in real time, as set by the buffer size. Real-time software encoding is always a compromise.