I have a DV video that I captured. I have the MainConcept DV codec on my computer, and I frameserved the video to CCE via AviSynth. I wanted to deinterlace it and change the framerate from 29.97 to 23.976. That way there would be fewer frames, so the quality of each frame would be higher. I used the following script:
AVISource("source.avi")
ConvertToYUY2()
FieldDeinterlace()
ChangeFPS(23.976)
So after doing this it should be just like regular FILM that has been IVTC'd, right? I loaded that script into CCE and checked progressive frames when I encoded it. Afterwards I performed a pulldown so it would be accepted by my DVD player. I burned it, and now it looks fine on my PC, but on my DVD player it looks jerky. I have encoded lots of DivX files to SVCD like this (except I didn't deinterlace and downsample the FPS, because they were already like that) and they all turned out fine. Does anyone know how to fix this?
Originally Posted by JIM E
You are going to have to live with 29.97fps if you are dealing with NTSC DV, period. SVCD is low bitrate, so many people prefer to just deinterlace, since that requires less bitrate and is easier for the encoder to handle. You should definitely also use some form of noise reduction. The less light you have when you film, the grainier DV becomes, and you really must use noise reduction to blend those grains out, unless your filming is VERY well lit. Other than that, there isn't much you can do.
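For example (a sketch only, with starting values you would want to tweak), a temporal smoother in the AviSynth script can blend that grain out before the encoder sees it. TemporalSoften is a built-in filter; running it between SeparateFields() and Weave() keeps it from smearing detail across the two fields of an interlaced frame:

AVISource("source.avi")
ConvertToYUY2(interlaced=true)
SeparateFields()
TemporalSoften(2, 3, 3)   # radius of 2, mild luma/chroma thresholds
Weave()

Denoising like this also frees up bitrate, since the encoder no longer wastes bits trying to reproduce the grain.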
So I understand from what you said that IVTC'd FILM is 23.976fps and is progressive (no interlacing at all). I told you that I deinterlaced the video, so after doing that it would be progressive, just like FILM. And after I turn the FPS down to 23.976, the framerate would be the same as FILM's. So I still don't see how it is any different from FILM that has been IVTC'd. It has the same FPS and it is progressive.
The difference is that the term FILM denotes how the material was shot. FILM is shot progressively at 24fps. All of the information needed to present all the motion captured is contained in these 24 frames per second. To get the 29.97fps output it is split into fields and certain fields are repeated in a specific pattern. These fields make up new frames which are extraneous. They basically fill in gaps within the original frames. These little moments in time never actually occurred, they are created to increase the fps simply because of the NTSC framerate requirement. To IVTC the material you can just remove these newly created frames and retain the original 24fps as it was shot. Again, all the information you need is still there.
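You can actually write that pulldown out as an AviSynth script, which makes the repeated fields explicit. This is just an illustration of the pattern (film.avi stands for a hypothetical 23.976fps progressive source):

AVISource("film.avi")                     # 23.976fps progressive
AssumeTFF()
SeparateFields()                          # 8 fields for every 4 film frames
SelectEvery(8, 0,1, 2,3,2, 5,4, 7,6,7)   # repeat fields in a 2:3 pattern: 10 fields
Weave()                                   # 5 video frames for every 4 film frames
AssumeFPS(30000, 1001)                    # 29.97fps

Four film frames become five video frames (23.976 x 5/4 = 29.97), and the two frames built from mismatched fields are exactly the extraneous ones an IVTC removes.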
DV is totally different. It is what is called pure NTSC, and this refers to how the footage was SHOT, not just what its fps or interlaced/progressive nature is at any given time in your editing/encoding process. As you record, you are taking 29.97 samples every second, and they happen to be broken up into two fields each time. You cannot throw out any of these samples, to do so would mean you lost that moment in time and the resulting movement would be jerky.
You could convert to FILM any number of ways. You could make it progressive and remove or add frames through various methods. But just think about it: it doesn't matter what fps it is stored at. If you are just throwing out random frames, then of course the movement is going to be jerky. Yes, your encode is now FILM, because it is 23.976fps progressive. But since your source never originated as FILM, you have thrown out needed data in getting to this format, and that's why the output is jerky.
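To see why, look at what a naive decimation of deinterlaced DV actually does (a sketch of the failure, not a recommendation):

AVISource("source.avi")
ConvertToYUY2()
FieldDeinterlace()
SelectEvery(5, 0, 1, 2, 3)   # throw away every 5th frame
AssumeFPS(24000, 1001)       # now 23.976fps progressive... and jerky

With FILM the discarded frames would have been redundant copies, but every frame of DV is a unique sample of motion, so playback visibly hitches roughly six times a second where those samples used to be.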
There is simply no way to IVTC NTSC DV footage without making the output jerky. The only option is to buy an NTSC camcorder that shoots true FILM (24fps progressive), and those are very expensive.
All right, I've accepted the fact that I cannot convert my captures to 23.976fps. So I encoded the video as it is when it first comes out of the camera. I used CCE again and checked interlaced this time. Then I burned it to an SVCD. I've run into another problem: it is fine on my PC, but the quality on my TV is really bad. Before, whenever I made SVCDs from video that was previously DivX, they looked just as good on my TV as on my PC. I thought maybe I got the field order wrong, so I switched it. When I did that it was just the same, only worse, because now there were interlace lines. Is this a side effect of pure NTSC? Is there anything I can do to fix it?
Can you be more specific about why it looks bad? If you aren't using noise reduction, then you are not going to be happy with the output quality, plain and simple. DV is always going to be noisy; you've got to counteract that. As for field order, you can adjust it in CCE or through your AviSynth script. The latter is actually the better way.
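If the problem really is field order, and assuming the fields themselves are intact and only the declared order is wrong, the script-side fix is a one-liner. DV is normally bottom field first:

AVISource("source.avi")
ConvertToYUY2(interlaced=true)
AssumeBFF()   # DV is bottom field first; use AssumeTFF() only if your source differs

Then make sure whatever field order you set in CCE matches what the script declares.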
Follow this brief guide/FAQ. It's aimed at DVD encoding, but it's pretty much the same for SVCDs. Just ignore the part about AC3 encoding, don't check the DVD-compliant box, and it should be applicable to you.
http://forum.doom9.org/showthread.php?s=&threadid=60392