OK, I use a PAL camcorder, capture the footage with PAL settings, and after editing I output to full uncompressed AVI. The problem is that the video makes me dizzy (it's not smooth). I don't know what to call it, but even slow panning shots look like this.
I'm not sure if anyone understands me, but for example: when I shoot with my camcorder and pan quickly, that alone can make viewers feel dizzy. In this case, though, it's much worse, like playing a 3D game at 25 fps. Hope you guys can help.
-
You are watching the AVI files on your computer, right? After you convert them to DVD and play them on your TV they will be OK, or at least better. I made the same mistake on most of my older DV footage; moving the cam too fast and zooming too fast is BAD, a typical beginner's error. The reason you get "dizzy" is the interlacing.
-
You might be talking about "field order". If you capture/convert using the wrong field order, the files will be jerky when you play them on your TV set, especially during panning shots.
Check which field order you have been using. Take a jerky section of video and 'reprocess' it in the opposite field order, create a DVD-RW containing the two 'labeled' clips, and try them on your TV (one clip in field order A and one in field order B). I did this test myself long ago.
In all my video programs I choose Field Order A, except when working in Ulead products, where I choose Field Order B (for some strange reason I can't figure out... lol). Do the test, and if your problem is field order you will know which one to use.
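If you want to script that test rather than re-export from the editor twice, here is a rough sketch of one way to do it with ffmpeg called from Python. This assumes ffmpeg is installed and on your PATH; the input file name is just a placeholder, HuffYUV is only picked as a convenient lossless codec so compression doesn't muddy the comparison, and note that the "A/B" naming your editor uses does not map one-to-one onto tff/bff (nomenclature varies, as mentioned later in the thread).

```python
# Sketch: make two copies of a jerky clip, one converted to top-field-first
# (tff) and one to bottom-field-first (bff), so both can be authored to a
# DVD-RW and compared on the TV. Assumes ffmpeg is on the PATH; the input
# file name is a hypothetical placeholder for your own clip.
import subprocess

SOURCE = "jerky_section.avi"  # hypothetical input clip

for order in ("tff", "bff"):
    output = f"test_{order}.avi"
    subprocess.run(
        [
            "ffmpeg", "-i", SOURCE,
            "-vf", f"fieldorder={order}",  # convert to the requested field order
            "-c:v", "huffyuv",             # lossless video so the test stays fair
            "-c:a", "copy",                # keep the original audio untouched
            output,
        ],
        check=True,
    )
    print(f"Wrote {output}")
```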
Hope this helps.
Good luck.
{EDIT}
Re-reading your post, I now see my field order theory is not what's wrong. Sorry.
Check your CPU usage (performance) while playing and capturing your video files. If your CPU usage is high, playback will jerk the way you are seeing. Either that or hard drive performance, but I would bet on the CPU being loaded up.
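If you want to watch the CPU load in numbers while the clip plays, a small Python script using the third-party psutil package (an assumption here, you would need to install it) can log it. Just a sketch:

```python
# Sketch: print overall CPU usage once a second for ~30 seconds while you
# play the AVI, to see whether playback is maxing out the processor.
# Requires the psutil package (pip install psutil).
import psutil

for _ in range(30):
    load = psutil.cpu_percent(interval=1)  # % CPU averaged over the last second
    print(f"CPU: {load:5.1f}%")
```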
Good luck.
-
Erm... when I watch the original captured video, which I captured over IEEE 1394 (FireWire), it looks very good, even in fast panning shots! But when I edit it (in Premiere) and output to AVI (even uncompressed, full frames!), this problem shows up.
By the way, I used Adobe Premiere's Standard 48 kHz PAL preset to capture. How many fps does that capture at? I'm wondering why the original captured video is so smooth but the edited one... makes me dizzy.
And yes, I hope that once I convert it to DVD and watch it on TV it will be fine. Thanks.
-
I doubt the editing program would change the frame rate, but you never know... Anyway, if you output to DV it should be able to use a stream copy, so you would not get any quality loss except in the parts where you edited or added transitions and so on. By the way, you can check the frame rate of the edited file in VirtualDub (File -> File Information).
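If you'd rather script that check than open VirtualDub, here is a small sketch using OpenCV's Python bindings to read the frame rate and size straight from the AVI; the file name is a placeholder:

```python
# Sketch: read the frame rate, frame size and frame count of an AVI with OpenCV.
# "edited_output.avi" is a hypothetical placeholder for the file exported from Premiere.
import cv2

cap = cv2.VideoCapture("edited_output.avi")
fps = cap.get(cv2.CAP_PROP_FPS)
width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
frames = int(cap.get(cv2.CAP_PROP_FRAME_COUNT))
cap.release()

print(f"{width}x{height} @ {fps:.2f} fps, {frames} frames")
# For PAL DV you would expect 720x576 at 25.00 fps.
```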
-
OK... here is the sample video:
http://www.geocities.com/jdhouseng/gongong.avi
As you can see in the video, it will make you dizzy. Try viewing it in full screen.
-
Let me be a little more explicit:
The problem you are having is that your camcorder is giving you interlaced video. Since you are running a PAL system you get 25 frames per second. But the video is interlaced, so what you really get is 50 fields per second. Each field consists of half the picture: one field has all the odd-numbered scanlines, the other has the even-numbered scanlines (i.e., every other line of the full image). The problem with a camcorder is that those two fields were not taken at the same time; the second one was taken 1/50 of a second after the first. So anything that was moving (or any camera motion) is not in the same place in the two fields.
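To make that concrete, here is a tiny numpy sketch of how one interlaced PAL frame splits into its two fields (the frame here is just a dummy array):

```python
# Sketch: split one interlaced PAL frame (720x576) into its two fields.
# Each field holds every other scanline and was exposed 1/50 s apart,
# which is why moving objects sit in different places in the two fields.
import numpy as np

frame = np.zeros((576, 720, 3), dtype=np.uint8)  # dummy full frame

top_field = frame[0::2]     # scanlines 0, 2, 4, ... (288 lines)
bottom_field = frame[1::2]  # scanlines 1, 3, 5, ..., exposed 1/50 s later

print(top_field.shape, bottom_field.shape)  # (288, 720, 3) (288, 720, 3)
```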
When you display a full frame (both fields) at full resolution on a computer monitor, you see annoying interlace artifacts: a comb-like doubling along the edges of anything that moved between the two fields.
Some programs will display only one field on the computer screen (especially while capturing) because both fields together look awful.
On a TV screen the interlaced fields aren't a problem because you don't see both fields at the same time. By the time the second field is drawn to the screen, the first field has faded away.
Your reduced size sample AVI simply took pairs of scanlines (one from each field) and averaged them together, giving something of a double exposure. Hence the fuzzy image during camera motion.
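In numpy terms, that averaging step looks roughly like this (dummy frame again), and it is exactly why motion turns into a ghosted double exposure:

```python
# Sketch: the "averaged pairs of scanlines" resize described above.
# One line from each field is blended, so anything that moved between the
# two field exposures shows up twice, half-transparent -- the blur you see.
import numpy as np

frame = np.zeros((576, 720, 3), dtype=np.uint8)  # dummy interlaced frame
blended = ((frame[0::2].astype(np.uint16) +
            frame[1::2].astype(np.uint16)) // 2).astype(np.uint8)

print(blended.shape)  # (288, 720, 3): half-height frame with both fields mixed in
```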
If your final output is going to be a full sized (720x576) image on DVD, just leave the video interlaced. It may look bad on your computer monitor but it will look fine on a TV. Unfortunately, there is no single standard for field order (which is first, which is second) in the industry. Not even in the nomenclature. Programs refer to field order A vs B, 1 vs 2, odd vs even, top vs bottom. You may have to experiment a bit to figure out which works for your combination of software. That said, "Field Order B" is usually the correct choice for DV and DVD.
If your final output is for a computer, then use a deinterlace filter. There is no PERFECT deinterlacing method. There can't be. You can't take two pictures that were taken at different times and produce a single picture with all the details of both. But there are many methods that are used with differing levels of success.
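As a very crude example of one such method, here is a sketch of the simplest approach: throw away one field and interpolate its lines from the other. Real deinterlacers (bob, blend, motion-adaptive and so on) are far more sophisticated; this is only to show the idea.

```python
# Sketch: the simplest possible deinterlace -- discard the bottom field and
# rebuild its scanlines by averaging the top-field lines above and below.
# Half the temporal information is lost, but the comb artifacts disappear.
import numpy as np

def deinterlace_discard(frame: np.ndarray) -> np.ndarray:
    out = frame.astype(np.float32)  # astype makes a working copy
    height = frame.shape[0]
    # Overwrite every bottom-field line (odd rows) with the average of the
    # top-field lines directly above and below it.
    for y in range(1, height, 2):
        above = y - 1
        below = y + 1 if y + 1 < height else y - 1  # clamp at the bottom edge
        out[y] = (out[above] + out[below]) / 2
    return out.astype(np.uint8)

frame = np.zeros((576, 720, 3), dtype=np.uint8)  # dummy interlaced frame
progressive = deinterlace_discard(frame)
print(progressive.shape)  # still (576, 720, 3), but with only one field's motion
```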