I have an interlaced sequence on my DVD that looks perfect on a TV DVD player, but on my computer (which is of course progressive scan instead of interlaced) it looks blurry. Is there anything I can do about that?
If it helps in solving the problem, here is some history of what I did: the sequence is an animation that I rendered out with interlaced frames because of an extreme vertical camera move (if I don't interlace, it strobes like crazy on a TV). I then encoded the image sequence with TMPGEnc (BTW, it does not seem to make a difference whether I encode it with the "interlace" or "non-interlace" option in the settings), then authored in DVD-lab and burnt with Nero.
thanks!
Can you post a few frames illustrating the problem?
Extreme motion probably breaks any deinterlacing algorithm other than viewing half vertical resolution fields during that motion. That is how high end motion adaptive deinterlacers work. On a pixel block basis, they will switch to single fields during high motion. The eye doesn't see this because detail perception is reduced during motion. -
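That motion-adaptive switch can be sketched in a few lines. This is a toy illustration only, not any real deinterlacer's code: the function name, the threshold value, and the per-line (rather than per-pixel-block) motion test are all my own simplifications.

```python
# Toy motion-adaptive deinterlacer. Frames are lists of rows of luma
# values (0-255). Even rows hold the top field, odd rows the bottom field.

def motion_adaptive_deinterlace(frame, prev_frame, threshold=10):
    """Weave (keep both fields) in static areas for full vertical
    resolution; interpolate from one field where motion is detected."""
    out = [row[:] for row in frame]
    for y in range(1, len(frame) - 1, 2):           # bottom-field lines
        # crude motion estimate: change on this line since the last frame
        motion = max(abs(a - b) for a, b in zip(frame[y], prev_frame[y]))
        if motion > threshold:
            # high motion: drop this field line and interpolate from the
            # neighbouring top-field lines (half vertical resolution,
            # which the eye tolerates during motion)
            out[y] = [(frame[y - 1][x] + frame[y + 1][x]) // 2
                      for x in range(len(frame[y]))]
        # low motion: keep the woven line as-is
    return out
```

In a static area the woven line survives untouched; in a moving area it is replaced by the average of its neighbours, which is exactly the softening the eye forgives during motion.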
Also: Televisions normally use sharpening filters because they are designed to display fuzzy analog signals. Computer monitors normally receive a crystal clear signal so they don't sharpen the image.
-
This is the result of the software doing a deinterlace so that you can watch it on a progressive monitor.
A byproduct of deinterlacing is a softening of the image.
Can you post a few frames illustrating the problem?
Here is an approximation of what I am seeing. I created this using a deinterlace set to "blur"
Here is a swatch from the original interlaced frame.
Here is what that frame looks like when it is deinterlaced (for instance in Photoshop or any compositing software) using the interpolation method. This is what I want.
Again my question is not so much why this is happening, but more importantly how can I fix it?
thanks -
You can't fix it.
Originally Posted by junkmalle
How do I do that? -
Hi-
"bob" deinterlace?
How do I do that?
In my old 4.0 version of PowerDVD you right-click the video screen and go Configure->Video->Advanced->Force Bob.
However, bobbing the video (interpolation, using your term earlier) has problems of its own, including the loss of half your resolution, aliasing and sometimes horrible shimmer as a result. -
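For the record, a bob is simple enough to sketch: keep one field, throw the other away, and interpolate the missing lines. A toy version (the function name and layout are mine, not PowerDVD's; only the top-field path is handled carefully):

```python
def bob_field(frame, keep_top=True):
    """Rebuild a full frame from a single field by averaging adjacent
    field lines. The discarded field is gone for good."""
    start = 0 if keep_top else 1
    field = [frame[y] for y in range(start, len(frame), 2)]
    out = []
    for i, row in enumerate(field):
        out.append(row[:])
        nxt = field[i + 1] if i + 1 < len(field) else row
        # interpolate the line the other field used to occupy
        out.append([(a + b) // 2 for a, b in zip(row, nxt)])
    return out
```

Given a frame whose bottom-field rows are all 99, none of those values survive in the output: that is the half of the resolution you lose, and alternating between the two fields 59.94 times a second is where bob's shimmer comes from.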
Are you watching this on an LCD monitor? Could be the pixel response is too slow, so you'd end up with ghosting/motion blur.
Try same DVD with same Player software on someone else's computer. Does it do it then? (Esp. try CRT display)
Scott -
Originally Posted by Cornucopia
-
Originally Posted by manono
Any idea what the "interlace" and "non-interlace" options in TMPGEnc actually do? I couldn't see any difference... -
Originally Posted by sharktacos -
Originally Posted by edDV
Before I interlaced it, it would play correctly on a PC but not on a TV.
I would like it to play correctly on both, like any film from Blockbuster would. -
Originally Posted by sharktacos
Those movies you refer to are likely progressive video scanned from film. You don't have film, or any other progressive source. You have interlaced source off VHS or tv.
All you can do is deinterlace (or force an IVTC), which will degrade the video quality shown on the TV set. -
If you're in an NTSC country your best solution is to render your source as progressive at 23.976 fps and mark it for 3:2 pulldown on playback. That will play properly on both a computer and a TV. This is the way most commercial movie DVDs work. Adding motion blur to your renderings will help.
There is no way to convert interlaced video to progressive without introducing artifacts or blurring. -
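For anyone unclear on what "mark it for 3:2 pulldown" buys you: the player repeats fields in a 2:3 cadence so four film frames fill ten field slots (23.976 × 2.5 = 59.94). A rough sketch of the cadence follows; real encoders just set the repeat_first_field/top_field_first flags in the MPEG-2 stream rather than duplicating any data.

```python
def pulldown_32(frames):
    """Expand film frames A,B,C,D into the 2:3 field cadence
    At Ab | Bt Bb Bt | Cb Ct | Db Dt Db (10 fields per 4 frames)."""
    cadence = [2, 3]      # alternate 2 fields, then 3, per film frame
    fields = []
    for i, frame in enumerate(frames):
        for _ in range(cadence[i % 2]):
            # field parity simply alternates through the output stream
            parity = "top" if len(fields) % 2 == 0 else "bottom"
            fields.append((frame, parity))
    return fields
```

Every field still comes from a progressive picture, so a computer can play the same stream progressively at 23.976 fps while a TV sees legal 59.94 fields per second.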
Originally Posted by sharktacos
https://www.videohelp.com/forum/viewtopic.php?t=284614
If you try deinterlace filters you will compromise playback on both targets. IVTC restores original progressive video. -
[quote]Those movies you refer to are likely progressive video scanned from film. You don't have film, or any other progressive source. You have interlaced source off VHS or tv.
Actually I don't. I'm not copying something from TV or filming with a DV cam. I have an original source CGI animation I made that I can render out any way I like. At first I rendered it out as progressive scan 30 FPS, and then when it strobed on TVs I rendered that part out as interlaced.
Even if I did render it out at 24 FPS and did a pulldown for video like they do for films, I don't really see how that would solve my problem here. 24 FPS is less information than 30 FPS. -
Originally Posted by sharktacos
-
29.97 fps interlaced rendering can also be used and this will match normal NTSC TV broadcast, a DV camcorder or other NTSC sources for editing. The resulting DVD will be interlaced and it will need a deinterlacing player for computer playback. -
Rendering at 23.976 fps or 29.97 fps with motion blur is the best solution.
Maybe I need to be more specific: my film is rendered at 30fps with motion blur (progressive non-interlaced frames). This works for almost everything. But in one scene I have a horizontal camera pan which creates movement that is too slow to be affected by motion blur, but too fast for an interlaced TV. So on a TV set I get this awful strobing. I tried adding in more motion blur, but since it was an artificially high amount it simply made everything look blurry.
I then interlaced this section, and it looked perfect on a TV (remember, every other part has always looked fine and is 30fps progressive scan with motion blur). This new section is now 30fps interlaced with motion blur (and yes, I applied the motion blur before interlacing). This one section now looks blurry on a PC.
I can do a test and render out that problematic section at 24fps and then do a pulldown in the encoder to 30fps, but I predict that it will just strobe on a TV as it did before when it was 30fps progressive scan in the first place. -
Putting aside motion blur for the moment, NTSC video supports temporal resolutions of 23.976 or 59.94 time (motion) divisions per second.
23.976 progressive can be played to an interlaced TV at 59.94 fields per second using the "pull down" field repeat process giving 29.97 frames per second equivalent frame rate. However the inherent motion resolution is still 23.976.
Interlaced video acquires motion in 59.94 time increments per second with alternating fields. A 30 fps progressive cgi render will not achieve 59.94 time increments since both fields derive from the same frame time sample. So effective motion sampling will be 29.97 frames per second.
ATSC (DTV) allows more flexibility. Cameras or CGI renders can operate at 59.94 frames per second for fine motion detail. 59.94 interlace, 29.97 progressive and 23.976 progressive are also supported.
Does this help? -
Originally Posted by sharktacos
My film is rendered at 30fps with motion blur (progressive non-interlaced frames). This works for almost everything. But in one scene I have a horizontal camera pan which creates movement that is too slow to be affected by motion blur, but too fast for an interlaced TV. So on a TV set I get this awful strobing.
One place I've seen something that might be called strobing is when someone takes a telecined movie and runs a blend deinterlace. Since two of every five frames are blends of two separate pictures, you can see a strobing effect six times a second when things move at a moderate rate. But if you're rendering 30 progressive frames a second, that can't be the problem. -
Originally Posted by junkmalle
As far as imagining what I mean by "strobing": instead of a sequence playing a new picture every frame (smooth motion), imagine it showed a new picture every 10 frames (choppy motion), and you would get the strobing I mean. The images seem to flicker.
As I said, as soon as I interlace it, thus adding in more information, it goes away on a TV, since interlace means that instead of just having a frame at 30fps it makes a field at frame 1, then another field at frame 1.5, and so on, thus having effectively double the information. -
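The "double the information" point is about temporal sampling, and it's easy to make concrete. A quick sketch (nominal 30/60 Hz used for round numbers; NTSC actually runs at 29.97/59.94):

```python
def sample_times(duration_s, rate_hz):
    """Instants at which a new picture (frame or field) is captured."""
    return [round(n / rate_hz, 4) for n in range(round(duration_s * rate_hz))]

frames_30p = sample_times(1, 30)   # 30p: one new image per frame
fields_60i = sample_times(1, 60)   # 60i: one new image per field
```

An interlaced render samples motion twice as often (a new field already at t = 1/60, while the next 30p frame doesn't arrive until t = 1/30), which is why the pan stops strobing; the cost is that each of those samples carries only half the vertical resolution.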
Originally Posted by junkmalle
But maybe your DVD player just doesn't like 30 fps progressive?
Trust me, it is because of the extreme horizontal movement in the camera pan on this one shot that it strobes. Every other shot looks great with 30 fps progressive.
Try taking your progressive 30 fps source and telling the MPEG2 encoder it's interlaced. See if it likes that any better. -
Since you have a CRT it *may* be possible to display your movie interlaced. I'm not sure if it's a common feature, I have never investigated this subject before, and I've actually seen only 2 monitors like this in my life (1 of them was my very old 15'' NEC), but perhaps your monitor can display an interlaced picture at 60Hz at some resolutions?
-
Have you actually determined what the source framerate is? It doesn't sound like it to me. It seems to me that you're just flailing around in the dark. If the true framerate is really 24fps and you're converting it to 30fps by encoding it as interlaced, already telecined, then that could very well be the problem. I think that edDV had it right a long time ago. If it's really a hybrid, with a mix of 24fps and interlaced 30fps, then you have additional problems, but without seeing an unprocessed sample of the source, it's hard to tell.
I don't even know why you're messing with motion blur at all.
24 FPS is less information than 30 FPS.
Not if 24fps is the true framerate. Adding fields and frames to that to convert it to 30fps could definitely be the source of your problems.
Unfortunately, since it only appears on an interlaced TV, I'm not sure how I could do that: any movie I post that you could watch on your computer won't show the problem.
Again, if you could post a portion of the unprocessed source, we could figure out the best way to treat it. VOB or M2V would be good.