VideoHelp Forum
  1. I have a football video and would like to slow it down to 1/2 and 1/3 speed, then convert it to DVD. I used AssumeFPS, but the resulting DVD video has a lot of artifacts. I want something like this YouTube video: https://www.youtube.com/watch?v=_CSqZpNrxls

    Here is a link to the football video. https://mega.nz/#!35FABSiI!UZaXoxrkLmGkdOai4dv2dAZyADlmFKJdxUP4Wr1urqc
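
    For context, a minimal sketch of the kind of AssumeFPS-only approach described above (the filename and rates are assumptions): AssumeFPS only retimes the existing frames, so getting back to a fixed DVD frame rate means duplicating or blending frames, which is where the judder and artifacts come from.

    Code:
    # a guess at the naive approach: slow to 1/2 speed, then force a DVD-legal rate
    AviSource("Football-sample.avi")              # sample clip linked above
    AssumeFPS(FrameRate / 2.0, sync_audio=false)  # half speed: frames are only retimed, none are created
    ChangeFPS(30000, 1001)                        # back to 29.97 fps for DVD by duplicating frames -> judder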
  2. You might find some of the notes in my guide HERE useful
  3. AviSynth SmoothFPS2:

    https://forum.videohelp.com/threads/335908-How-can-I-get-the-smoothest-slow-motion-out-...=1#post2085931

    But motion interpolation techniques won't work well with large or complex motions.

    Code:
    AviSource("Football-sample.avi") 
    ConvertToYV12(interlaced=true)
    QTGMC(preset="fast")
    SRestore(frate=20000.0/1001) # remove blended and duplicate frames
    AssumeFPS(framerate/3.0) # slow to 1/3
    SmoothFPS2(20000,1001) # speed back up with interpolated frames
    There are still some duplicates after SRestore, so there are some jerks in the video. You'll have to fine-tune the script to get rid of those. But you can see the technique doesn't work well with this video. I wouldn't bother.

    Replacing SmoothFPS2(20000,1001) with ConvertFPS(framerate*3.0) will give the blending effect seen in the youtube video.
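
    Spelled out, that blending variant would be (same chain as above, only the last line changes):

    Code:
    AviSource("Football-sample.avi") 
    ConvertToYV12(interlaced=true)
    QTGMC(preset="fast")
    SRestore(frate=20000.0/1001) # remove blended and duplicate frames
    AssumeFPS(FrameRate/3.0)     # slow to 1/3
    ConvertFPS(FrameRate*3.0)    # speed back up by blending frames (the youtube look)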
    [Attached files]
    Last edited by jagabo; 30th Jan 2016 at 12:26.
  4. Both work and give similar results with some artifacts. It'll have to do. Thanks for your help.
  5. Years ago I uploaded to my YouTube channel a video comparison of three slow-motion techniques, but the one you found is better.

    I just looked at your video, and you are going to have problems creating slow motion no matter which approach you use. The reason is that the film transfer contains blended fields as well as pulldown (duplicated fields). These must all be removed before you create any slow motion, or you will end up with a huge mess: the motion estimation gets VERY confused when it tries to match a blended frame to the adjacent frames and can't find anything that really matches.

    Using TIVTC with the following code removes the blends and dupes pretty well, although only to a first approximation. SRestore might be a better tool, or at least it would be worth spending more time tuning the TFM and TDecimate settings.

    Code:
    tfm(display=false)
    source=tdecimate(cycleR=11,cycle=30)
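    If you want to tune those settings, it helps to look at the field-matched/decimated clip by itself before adding any interpolation. A sketch using only parameters that already appear in this thread (TFM's display=true overlays its matching decisions on each frame):

    Code:
    AviSource("Football-sample.avi")
    ConvertToYV12(interlaced=true)   # only needed if the source is RGB
    AssumeBFF()
    TFM(display=true)                # show the chosen match on each frame while you inspect
    TDecimate(cycle=30, cycleR=11)   # drop 11 of every 30 frames; adjust cycleR until dupes/blends are gone
    return last                      # step through this to judge the settings before adding slow motion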
    One very important addition to the MVTools2 and SmoothFPS scripts provided above (SmoothFPS is a GPU implementation of MVTools2) is that you definitely want to experiment with different block sizes. I've done a lot of slow-mo with both of these tools, and while some situations are ALWAYS going to produce artifacts, you can sometimes minimize them by choosing different block sizes. I don't remember what the default block size is with SmoothFPS, but the block size used in the MVTools2 script provided above is 8. You can change that to 4, 8, 16, or 32. You can also change the overlap. More overlap sometimes produces better results, but not always; however, more overlap always increases rendering time.

    The starting point in my various slow-mo scripts is a block size of 16 and an overlap of 4. I am in no way saying that this is "better," only that you may get slightly fewer artifacts. Try it out and see what happens. See my comparisons below.

    Finally, you are never going to get really good slow motion from this source material because it has such a low frame rate. As you will see in my scripts below, after I recovered the original frames, I set the playback speed to 15 fps, which appears to be approximately the correct frame rate for this film. With so few frames per second, though, the temporal gap between frames is HUGE (at 15 fps adjacent frames are about 67 ms apart), and as a result things move a gigantic distance between one frame and the next. Even the smartest software in the world isn't going to be able to correctly track all those pixels and figure out where they should be in the intermediate frame(s).

    Here is the code I used:

    Code:
    loadplugin("C:\Program Files\AviSynth 2.5\plugins\MVTools\mvtools2.dll") 
    loadPlugin("c:\Program Files\AviSynth 2.5\plugins\TIVTC.dll")
    
    source=AVISource("E:\fs.avi").Killaudio()
    converttoyv12(source,interlaced=true)  #Use this if incoming video is RGB
    assumebff()
    tfm(display=false)
    film=tdecimate(cycleR=11,cycle=30).assumefps(15) 
    
    super=MSuper(film,pel=2)
    backward_vec = MAnalyse(super,blksize=16, overlap=4, isb = true,  search=3 )
    forward_vec =  MAnalyse(super,blksize=16, overlap=4, isb = false, search=3 )
    MFlowFps(film,super,backward_vec, forward_vec, num=30, den=1, ml=500, thscd1=800)
    
    #return film  #Used to test pulldown removal and also judge proper film speed
    Here is that same code, modified to use the smaller blocksize that is used in the MVTools2 code linked to above:

    Code:
    loadplugin("C:\Program Files\AviSynth 2.5\plugins\MVTools\mvtools2.dll") 
    loadPlugin("c:\Program Files\AviSynth 2.5\plugins\TIVTC.dll")
    
    source=AVISource("E:\fs.avi").Killaudio()
    converttoyv12(source,interlaced=true)  #Use this if incoming video is RGB
    assumebff()
    tfm(display=false)
    film=tdecimate(cycleR=11,cycle=30).assumefps(15) 
    
    #Using smaller block size (adapted from code linked to above)
    super = MSuper(film, pel=4)
    backward_vec = MAnalyse(super, overlap=4, isb = true, blksize=8, truemotion=true, search=3)
    forward_vec = MAnalyse(super, overlap=4, isb = false, blksize=8, truemotion=true, search=3)
    MFlowFPS(film, super, backward_vec, forward_vec, num=30, den=1, ml=500, thscd1=800)
    
    #return film  #Used to test pulldown removal and also judge proper film speed
    Here is a comparison of one frame of the interpolated footage, the first one using the blocksize of 16, and the other the smaller blocksize of 8:

    Blocksize=16: [comparison frame attached]

    Blocksize=8: [comparison frame attached]
    In this particular example, the larger blocksize did indeed produce a nicer result (IMHO).
  6. By the way, the SmoothFPS2 I linked to is not GPU accelerated -- it uses MvTools. I think you're thinking of SVPFlow. But yes, playing with block sizes and other parameters can sometimes give better results.
    Originally Posted by jagabo
    By the way, the SmoothFPS2 I linked to is not GPU accelerated -- it uses MvTools. I think you're thinking of SVPFlow. But yes, playing with block sizes and other parameters can sometimes give better results.
    Yes, you are completely correct. I screwed up there, and I WAS thinking of SVPFlow. SmoothFPS2 is simply a function built around MVTools2 to give it a friendlier interface.

    In my defense, however, I do have a version of SmoothFPS2 which uses SVPFlow instead of MVTools2 (I'm not sure where I got this from):

    Code:
    function SmoothFPS2(clip source, threads) { 
      super_params="{pel:2,gpu:1}"
      analyse_params="""{
            block:{w:16,h:16}, 
    	main:{search:{coarse:{distance:-10}}},
    	refine:[{thsad:200}]
            }"""
      smoothfps_params="{rate:{num:30,den:15,abs:false},scene:{mode:0,limits:{scene:8500}},algo:21,cubic:1}"
      threads = 5
    
      super =   SVSuper(source,super_params)
      vectors = SVAnalyse(super, analyse_params)
      SVSmoothFps(source,super,  vectors,  smoothfps_params,  url="www.svp-team.com",  mt=threads)
    
    # Alternative for interlaced output
    # SVSmoothFps(source,super, vectors, smoothfps_params, url="www.svp-team.com", mt=threads).SeparateFields().SelectEvery(4, 0, 3).Weave().assumefps(29.97)
    }
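    For completeness, a usage sketch for the function above (the prep chain mirrors the TIVTC scripts earlier in the thread, and the file name is an assumption; note the function overrides the threads argument to 5 internally, and rate num:30/den:15 with abs:false means the output rate is simply double the input rate):

    Code:
    # hypothetical usage of the SVPFlow-based SmoothFPS2 defined above
    # (assumes the TIVTC and SVPflow plugins are loaded or autoloaded)
    source = AviSource("E:\fs.avi").KillAudio()
    source = source.ConvertToYV12(interlaced=true).AssumeBFF()
    film   = source.TFM(display=false).TDecimate(cycleR=11, cycle=30).AssumeFPS(15)
    SmoothFPS2(film, 5)   # 15 fps in -> 30 fps out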
    Last edited by johnmeyer; 1st Feb 2016 at 16:11. Reason: typo
  8. @johnmeyer Your script produces the fewest artifacts. Thank you.
  9. You're welcome. Please experiment with overlap, and also consider spending a little time learning about TFM and TDecimate so you can do a better job of removing duplicates and blended frames without also removing good frames. My code gets you pretty close, but there is definitely room for improvement. I just didn't have the time to go any further.
  10. johnmeyer, could you run a quick test for me? I can't use the GPU version of your SmoothFPS2, so I modified the script to use the CPU and named it SmoothFPS_SVPFlow. But it fails to recognize a lot of motion. Attached is a simple script and image that moves a ball around the screen (in a circle), then calls the modified SVPFlow function to change the frame rate from 15 fps to 60 fps. The ball gets stuck at some positions with SVPFlow. Do you get the same thing when you use the GPU? I included my version of SmoothFPS2 that uses MVTools2 for comparison.
    [Attached files]
  11. @jagabo, yes, you do get frame duplicates with those settings, GPU or CPU; but if you change the settings it will give results similar to the vanilla MVTools2 scripts. In my experience the threads setting can also cause weirdness in the results, so I typically tweak with a single thread, then check later whether threads>1 yields stable results.

    I didn't spend too much time playing with it, but for CPU-only this should work; for the GPU version just change to gpu:1 and cubic:1.

    Code:
    function SmoothFPS2_modified(clip source, threads) { 
      super_params="{pel:2,gpu:0,gpuid:12}"
      analyse_params="""{
            block:{w:16,h:16,overlap:2}, 
    	main:{search:{distance:0}},
    	refine:[{thsad:200}]
            }"""
      smoothfps_params="{rate:{num:4,den:1,abs:false},scene:{mode:0,limits:{scene:8500}},algo:21,cubic:0}"
      threads = 1
    
      super =   SVSuper(source,super_params)
      vectors = SVAnalyse(super, analyse_params)
      SVSmoothFps(source,super,  vectors,  smoothfps_params,  url="www.svp-team.com",  mt=threads)
      }
  12. I'm not at my editing computer and may not get to it for a few days.

    Many motion estimation breakdowns are caused by scene detection settings (thscd1 and thscd2 in MVTools2). Try changing those in your SVPFlow version.
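
    For reference, in the MVTools2 scripts earlier in the thread those thresholds are the thSCD1/thSCD2 arguments to MFlowFps (MVTools2's defaults are thSCD1=400 and thSCD2=130); the rough SVPFlow equivalent is the scene:{...} block inside smoothfps_params. A sketch of the MVTools2 side, with purely illustrative values:

    Code:
    # thSCD1: per-block SAD above which a block is counted as "changed"
    # thSCD2: how many blocks must change (0-255 scale, 255 = all) before the frame is treated as a scene change
    MFlowFps(film, super, backward_vec, forward_vec, num=30, den=1, ml=500, thSCD1=400, thSCD2=130)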

    Multi-threading can sometimes do strange things. MVTools2 is a pretty solid MT citizen and seldom causes a problem. I have used SVPFlow so little that I really don't know much about its limitations and issues; by contrast, I use MVTools2 almost every day. So another thing to try is to turn off all multi-threading for SVPFlow and see if that makes a difference.

    Several years ago I had a problem with MVTools2 which eventually forced me to try SVPFlow. I posted over on doom9.org, and in the discussion that followed I found out several things that may help you. If you don't have time to read everything, start on the second of the two pages. You'll find a post by me where I complain about SVPFlow appearing to be broken. If you follow from that point forward, you will see that I received some help which may also help you. Here's the thread:

    SVSmoothFps appears broken
    Originally Posted by poisondeathray
    @jagabo, yes, you do get frame duplicates with those settings, GPU or CPU; but if you change the settings it will give results similar to the vanilla MVTools2 scripts. [...]
    Thanks for that. It works well for that black background. But on a similar test video with a checkerboard background it sometimes gives really ugly errors. Artifacts around the ball are expected but note the disturbance of the checkerboard pattern below and to the left of the ball:

    [Attached image: corrupt.png]

    I guess I'll have to play around with the settings some more.
    Originally Posted by johnmeyer
    Thanks, I'll take a look at that.
  15. For slow motion:

    Code:
    AssumeFPS(FrameRate/2, true).SSRC(AudioRate)
    ChangeFPS(FrameRate*2)
  16. I haven't been able to find any combination of settings that gets rid of the type of artifacts seen in post 13. If anyone wants to try:

    Code:
    function circular(clip bg, clip ovr, clip bmask, int ang)
    {
    	x = int(cos(ang*1*3.14159/360.0) * bg.height/4)
    	y = int(sin(ang*1*3.14159/360.0) * bg.height/4)
    
    	Overlay(bg, ovr, bg.width/2-ovr.width/2+x, bg.height/2-ovr.height/2+y, mask=bmask)
    }
    
    
    bg = BlankClip(width=64, height=64, fps=15, pixel_type="YV12", length=73)
    bg = StackVertical(StackHorizontal(bg, bg.Invert()), StackHorizontal(bg.Invert(), bg))
    bg = StackVertical(StackHorizontal(bg, bg), StackHorizontal(bg,bg))
    bg = StackVertical(StackHorizontal(bg, bg), StackHorizontal(bg,bg))
    bg = StackVertical(StackHorizontal(bg, bg), StackHorizontal(bg,bg))
    ovr = ImageSource("ball.png", start=0, end=72, fps=15).ConvertToYV12(matrix="PC.601")
    bmask = ovr.ColorYUV(off_y=-1).ColorYUV(gain_y=100000)
    Animate(0,360, "circular", bg,ovr,bmask,0, bg,ovr,bmask,7200)
    
    SmoothFPS2_modified(last, 1) # or whatever you want
    Oh, I modified the ball image a bit to get rid of black pixels within the ball (so only the background is transparent in the generated mask). New image:

    [Attached image: ball.png]
  17. 1. I duplicated the problems with SVPFlow that you posted in your original example (black background). The MVTools2 version worked correctly.

    2. I then copy/pasted poison's code and used it, and it worked.

    3. I went back to your SVPFlow code, and suddenly it too was working. I was so puzzled by this that I started over from scratch, extracting the files from your original zip, and couldn't get it to fail again. It appears as though something is being retained in memory.

    As for your checkerboard pattern, I reduced your example and tried it not only with the various SVPFlow and MVTools2 scripts you posted, but also with a much simpler one that is basically straight from the MVTools2 documentation.

    My conclusion: you have created a "pathological case," something that would almost never happen in the real world. Motion estimation has a huge problem with background objects which get revealed as something in the foreground moves in front of them. The higher the contrast between the moving foreground object and the background, the worse the problem. By using a pure white/black checkerboard, and then moving the ball "in front" of it, you have created a true torture test that, I think, ANY motion estimation software would fail. You could try it on AE, Twixtor, Motionperfect, etc. and I think they'll all show the same thing. I doubt very much you'll find any combination of settings that will make things any better.

    Of course you could take the results from your "ball on black background" and then overlay that onto the checkerboard and get a perfect result, but that would be cheating.

    [edit] I was not able to duplicate that strange distortion in the PNG you posted, where the checkerboard was distorted a long way from the ball.
    Last edited by johnmeyer; 2nd Feb 2016 at 15:49. Reason: added last sentence
  18. I think jagabo is specifically referring to the large displaced artifacts way way out of the motion path. You don't get those with the other MVTools2 variants, only the typical localized "edge morphing" artifacts. I can't figure it out either, and even went so far as to try a few older SVPFlow versions -- they all seem to have that problem, despite a variety of settings. And if you reduce the artifacts with various settings, it starts to skip frames (duplicates) like the initial problem. Even plugging SVPFlow's motion vectors into vanilla MVTools2 demonstrates the problem, so the problem is likely with SVPFlow's MVs.

    All motion estimation algorithms do fail to an extent (at least with the localized edge "morphing", not the large displaced artifacts in SVPFlow) if run in "automatic" mode. But the "pro" versions typically require additional inputs, like motion tracking or mattes, to guide the motion estimation and improve the results. And they do work quite cleanly for things like this, because it's one object, no axial rotation, no deformation (a real ball would deform as it bounces, and spin), no motion blur, no noise, no compression artifacts, etc. It's really a textbook case.
    Originally Posted by poisondeathray
    I think jagabo is specifically referring to the large displaced artifacts way way out of the motion path.
    Yes, I expect to see distortions right around the moving ball. It's the distortions hundreds of pixels away that I'm referring to. With different settings I get them on different frames and in different locations, but nothing I tried was able to eliminate them completely. Apparently the filter is occasionally mismatching the repetitive background and interpreting that as motion. When you step through it frame by frame, it's obvious that that area of pixels is moving to the right and down by one background block (in the image shown it's moved halfway). On a background without a repetitive pattern (for example, using an enlarged version of the ball as the background) it didn't have the same problem.

    And yes, these are edge cases. I often use simple tests/patterns like these to get a feel for how filters work and what effect the parameters have. The ball on the black background was originally used to see how far the ball could move between frames and still be seen as movement by mvtools2. The black background was used so there would be no background details to confuse the filter.


