VideoHelp Forum




  1. Hi, I need help with stabilizing a video. I already tried DaVinci and Cuvista (GitHub software), and neither did the job well enough: either the image is not stabilized enough, or artifacts appear, or the image gets cropped too much. So I wanted to ask you if you have any advice. The video is 4K 120 fps, so the tool has to cope with that. I have an RTX 5080, an Intel 13900K and 32 GB of RAM, so I can't use one of those research models that need 100 GB for 480p video. Gyroflow does not work well (artifacts); I guess the reason is that I want to use my smartphone's camera app and can't disable OIS. But I need the camera app, because Blackmagic or MotionCam can't use all my camera's features. And I am using a scope on my phone with a special case where you can mount the scope directly over the lens (it does not wobble or anything; it is very stable). Please no discussion about gimbals or the like; this is about stabilization afterwards.

    Does anyone have any idea?
    The video is uploaded.

    Edit: I also tried vid-stab-gui, but all these "classical" approaches are not good enough. I tried "Stabilization Camera" (an Android app), and it did the best job so far (basically no artifacts, but the stabilization effect could be a bit stronger). Problem: it needs about half an hour on a high-end smartphone, and on a PC I can't use it with the PC's full performance (I even tried Android on Windows, emulators and the like, but that does not work well).
    Image Attached Files
    Last edited by Platos; 19th Mar 2026 at 12:48.
  2. StabilizeIT (AviSynth/VapourSynth)
    Code:
    clip = fromDoom9.StabilizeIT(clip, div=1.00, rotMax=0.00, thSCD1=8000, thSCD2=2000, cutOff=0.01, anaError=100.00)
    clip = core.std.Crop(clip, left=112, right=112, top=88, bottom=80) # cropping to 3616x1992 too lazy to check the minimal crop values ;)
    seems fine,... (I used VapourSynth, but the same can be done with AviSynth)

    Cu Selur
    Image Attached Files
    users currently on my ignore list: deadrats, Stears555, marcorocchini
  3. Member (joined May 2005, Australia-PAL Land)
    Pretty slick there, Selur.
  4. I wanted to try this, but it crashed in Hybrid. I used Hybrid 2026.03.16.1 with the newest addons.

    It crashes, and the preview does not open either.

    Code:
    # Imports
    import vapoursynth as vs
    # getting Vapoursynth core
    import sys
    import os
    core = vs.core
    # Import scripts folder
    scriptPath = 'C:/Program Files/Hybrid/64bit/vsscripts'
    sys.path.insert(0, os.path.abspath(scriptPath))
    # loading plugins
    core.std.LoadPlugin(path="C:/Program Files/Hybrid/64bit/vsfilters/Support/libmvtools.dll")
    core.std.LoadPlugin(path="C:/Program Files/Hybrid/64bit/vsfilters/DenoiseFilter/ZSmooth/zsmooth.dll")
    core.std.LoadPlugin(path="C:/Program Files/Hybrid/64bit/vsfilters/SourceFilter/LSmashSource/LSMASHSource.dll")
    # Import scripts
    import fromDoom9
    import validate
    # Source: 'C:\Users\Gaming-Tower\Pictures\LocalSend\Stabilisation-Tests\Hybrid-Test\VID_20251101_135145.mp4'
    # Current color space: YUV420P8, bit depth: 8, resolution: 3840x2160, frame rate: 120.043fps, scanorder: progressive, yuv luminance scale: limited, matrix: 709, transfer: bt.709, primaries: bt.709, format: HEVC
    # Loading 'C:\Users\Gaming-Tower\Pictures\LocalSend\Stabilisation-Tests\Hybrid-Test\VID_20251101_135145.mp4' using LibavSMASHSource
    clip = core.lsmas.LibavSMASHSource("C:/Users/Gaming-Tower/Pictures/LocalSend/Stabilisation-Tests/Hybrid-Test/VID_20251101_135145.mp4")
    frame = clip.get_frame(0)
    # setting color matrix to 709.
    clip = core.std.SetFrameProps(clip, _Matrix=vs.MATRIX_BT709)
    # setting color transfer (vs.TRANSFER_BT709), if it is not set.
    if validate.transferIsInvalid(clip):
      clip = core.std.SetFrameProps(clip=clip, _Transfer=vs.TRANSFER_BT709)
    # setting color primaries info (to vs.PRIMARIES_BT709), if it is not set.
    if validate.primariesIsInvalid(clip):
      clip = core.std.SetFrameProps(clip=clip, _Primaries=vs.PRIMARIES_BT709)
    # setting color range to TV (limited) range.
    clip = core.std.SetFrameProps(clip=clip, _ColorRange=vs.RANGE_LIMITED)
    # making sure frame rate is set to 120.043fps
    clip = core.std.AssumeFPS(clip=clip, fpsnum=120043, fpsden=1000)
    # making sure the detected scan type is set (detected: progressive)
    clip = core.std.SetFrameProps(clip=clip, _FieldBased=vs.FIELD_PROGRESSIVE) # scan type: progressive
    # stabilizing using StabilizeIT
    clip = fromDoom9.StabilizeIT(clip, thSCD2=140)
    # adjusting output color from YUV420P8 to YUV420P10 for x265Model
    clip = core.resize.Bicubic(clip=clip, format=vs.YUV420P10, dither_type="error_diffusion")
    # output
    clip.set_output()
    # script was created by Hybrid 2026.03.16.1
    Image Attached Files
  5. The Vapoursynth script seems fine to me.
    Does the Vapoursynth Preview report an error? If there is no error, does replacing the "C:/Program Files/Hybrid/64bit/vsfilters/Support/libmvtools.dll" with "C:/Program Files/Hybrid/64bit/vsfilters/Support/libmvtools_old.dll" fix it for you?
    If there is an error message, what does it say?

    Cu Selur

    Ps.: Also try latest dev.
    Last edited by Selur; 20th Mar 2026 at 13:53.
  6. Yes, that fixed it. (I had renamed the old one to the other filename and deleted the original beforehand.) It works now.

    But it is very slow: CPU usage is poor, only some cores are under high load. 0.6 fps.

    Which settings did you use? Did you also get such slow performance?
  7. Like I mentioned, I used:
    Code:
    clip = fromDoom9.StabilizeIT(clip, div=1.00, rotMax=0.00, thSCD1=8000, thSCD2=2000, cutOff=0.01, anaError=100.00)

    Divisor 1 and cutOff=0.01 might slow things down.
    You might get away with different settings, but you will have to test that.
    0.6 fps seems too slow, I got:
    Code:
    encoded 570 frames, 82.48 fps, 42049.44 kbps, 23.85 MB
    (encoding with NVEnc+AV1)
    The whole script I used:
    Code:
    # Imports
    import vapoursynth as vs
    # getting Vapoursynth core
    import sys
    import os
    core = vs.core
    # Limit frame cache to 48449MB
    core.max_cache_size = 48449
    # Import scripts folder
    scriptPath = 'F:/Hybrid/64bit/vsscripts'
    sys.path.insert(0, os.path.abspath(scriptPath))
    # loading plugins
    core.std.LoadPlugin(path="F:/Hybrid/64bit/vsfilters/Support/libmvtools.dll")
    core.std.LoadPlugin(path="F:/Hybrid/64bit/vsfilters/DenoiseFilter/ZSmooth/zsmooth.dll")
    core.std.LoadPlugin(path="F:/Hybrid/64bit/vsfilters/SourceFilter/DGDecNV/DGDecodeNV_AVX2.dll")
    # Import scripts
    import fromDoom9
    import validate
    # Source: 'G:\clips\shaking\VID_20251101_135145_noSound.mkv'
    # Current color space: YUV420P8, bit depth: 8, resolution: 3840x2160, frame rate: 119.823fps, scanorder: progressive, yuv luminance scale: limited, matrix: 709, transfer: bt.709, primaries: bt.709, format: HEVC
    # Loading 'G:\clips\shaking\VID_20251101_135145_noSound.mkv' using DGSource
    clip = core.dgdecodenv.DGSource("J:/tmp/mkv_ea1036fe23cb8bdcd92cbe607efd44b1_853323747.dgi") # 119.823 fps, scanorder: progressive
    frame = clip.get_frame(0)
    # setting color matrix to 709.
    clip = core.std.SetFrameProps(clip, _Matrix=vs.MATRIX_BT709)
    # setting color transfer (vs.TRANSFER_BT709), if it is not set.
    if validate.transferIsInvalid(clip):
      clip = core.std.SetFrameProps(clip=clip, _Transfer=vs.TRANSFER_BT709)
    # setting color primaries info (to vs.PRIMARIES_BT709), if it is not set.
    if validate.primariesIsInvalid(clip):
      clip = core.std.SetFrameProps(clip=clip, _Primaries=vs.PRIMARIES_BT709)
    # setting color range to TV (limited) range.
    clip = core.std.SetFrameProps(clip=clip, _ColorRange=vs.RANGE_LIMITED)
    # making sure frame rate is set to 119.823fps
    clip = core.std.AssumeFPS(clip=clip, fpsnum=119823, fpsden=1000)
    # making sure the detected scan type is set (detected: progressive)
    clip = core.std.SetFrameProps(clip=clip, _FieldBased=vs.FIELD_PROGRESSIVE) # scan type: progressive
    # stabilizing using StabilizeIT
    clip = fromDoom9.StabilizeIT(clip, div=1.00, rotMax=0.00, thSCD1=8000, thSCD2=2000, cutOff=0.01, anaError=100.00)
    clip = core.std.Crop(clip, left=112, right=112, top=88, bottom=80) # cropping to 3616x1992
    # adjusting output color from YUV420P8 to YUV420P10 for NVEncModel
    clip = core.resize.Bicubic(clip=clip, format=vs.YUV420P10, dither_type="error_diffusion")
    # output
    clip.set_output()
    # script was created by Hybrid 2026.03.20.1
    At the moment you are using:
    Code:
    clip = core.lsmas.LibavSMASHSource("C:/Users/Gaming-Tower/Pictures/LocalSend/Stabilisation-Tests/Hybrid-Test/VID_20251101_135145.mp4")
    You might want to enable hardware decoding, by either:
    a. using DGDecNV
    b. using LWLibAVSource with libav hardware decoding mode 1
    c. using BestSource with e.g. OpenCL or CUDA
    at the moment you are decoding the source in software.
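    For reference, a sketch of the three options as VapourSynth source calls. This is a fragment, not a full script: the paths follow Hybrid's default install layout, and `prefer_hw` and `hwdevice` are the parameter names as I recall them for current L-SMASH Works and BestSource builds, so verify them against your plugin versions.
    Code:
    ```python
    # Sketch: pick ONE of these source filters; adapt paths to your setup.
    import vapoursynth as vs
    core = vs.core

    src = "C:/Users/Gaming-Tower/Pictures/LocalSend/Stabilisation-Tests/Hybrid-Test/VID_20251101_135145.mp4"

    # a. DGDecNV: index the file with DGIndexNV first, then open the .dgi
    # clip = core.dgdecodenv.DGSource("VID_20251101_135145.dgi")

    # b. L-SMASH Works with libav hardware decoding mode 1
    clip = core.lsmas.LWLibAVSource(src, prefer_hw=1)

    # c. BestSource with a hardware decoder device
    # clip = core.bs.VideoSource(src, hwdevice="cuda")
    ```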

    Cu Selur

    Ps.: also test the latest dev, it probably has another libmvtools.dll (which works for me and is faster)
    Last edited by Selur; 20th Mar 2026 at 14:52.
  8. Ah, I did not know I can encode with the CPU but decode with the GPU. I will try that and the new dev, though I will probably need some time to get it working. Thanks for the tips.
  9. Nice, the encode ran 3-3.5x faster (still with the CPU). I could of course use NVEnc, but it's nice anyway. Thank you for that; I will always use it now.

    Anyway, about AV1: do you plan to implement https://github.com/juliobbv-p/svt-av1-hdr in Hybrid? With it you can use the parameter Tune=5, which finally makes AV1 not look washed out; with that it looks as sharp as x265 on the CPU. I'm not sure whether you can use it with NVEnc, but on the CPU it does a good job. I just remembered this because I wanted to ask you a few days ago.

    Edit: The new dev works with StabilizeIT. Thanks.
  10. So, is the result with StabilizeIT satisfying? Years ago, when I wasn't satisfied with the stabilizer features in Magix Video Deluxe (the basic one usually worked fine but left artifacts I couldn't get rid of, and I found the stabilization of the supposedly more advanced Mercalli plugin inferior), I tried the Avisynth function DePanStabilize, which produced a poor, very unnatural-looking result, and finally opted for Deshaker (a VirtualDub plugin), which did a fine job (although it complicated the workflow quite a bit).
  11. @Platos: Happy to hear the dll in the dev worked.

    About the SVT-AV1-HDR development branch:
    No, I plan on waiting for the changes from there to hit the mainline. That said, if you build a binary from that development branch yourself, you can replace the existing one with it and add the new options via 'SVT-AV1=>Advanced=>Custom addition'.

    About NVEncC:
    NVEncC has none of the tune options that svt-av1 has, since those are:
    a. not defined in the standard
    b. not offered by the encoder chip
    You can see the available options of NVEncC over at: https://github.com/rigaya/NVEnc/blob/master/NVEncC_Options.en.md

    Cu Selur
  12. Ah ok, thanks.

    When I use StabilizeIT I get black bars wobbling around. Normally when stabilizing I can do a static/dynamic crop. How can I do an automatic static crop? It should automatically crop just as far as needed so those black bars no longer wobble around.
  13. StabilizeIT does not offer automatic cropping.
    Like I did in my example, you will have to crop manually.

    side note: There are autocrop functions for both AviSynth and VapourSynth, but afaik those will not help; their goal is usually to remove black bars only as far as possible without removing active content.
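    If your stabilizer can export its per-frame correction offsets (StabilizeIT itself does not, as far as I know, so treat this as a generic sketch with an assumed offset convention), the minimal static crop is just the maximum displacement per side, rounded up to the subsampling modulus:
    Code:
    ```python
    import math

    def static_crop(offsets, mod=4):
        """Given per-frame (dx, dy) corrections in pixels, return the
        (left, right, top, bottom) crop that hides all black borders.
        Convention (an assumption): positive dx shifts the frame right,
        exposing a border on the left, and analogously for dy/top.
        Values are rounded up to a multiple of `mod` (chroma subsampling)."""
        left = max((dx for dx, _ in offsets if dx > 0), default=0)
        right = max((-dx for dx, _ in offsets if dx < 0), default=0)
        top = max((dy for _, dy in offsets if dy > 0), default=0)
        bottom = max((-dy for _, dy in offsets if dy < 0), default=0)
        up = lambda v: int(math.ceil(v / mod) * mod)
        return up(left), up(right), up(top), up(bottom)

    print(static_crop([(3.0, -1.5), (-6.2, 2.0), (1.0, 5.9)]))  # (4, 8, 8, 4)
    ```
    The four values then go straight into core.std.Crop.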

    Cu Selur
  14. Member (joined Aug 2018, Wrocław)
    Premiere - Warp Stabilizer
    Mercalli - for Premiere, Vegas, Resolve

    Warp is better overall.
  15. Out of curiosity: can others post examples of what other tools produce from this clip with regard to stabilization?
  16. Member (joined Aug 2018, Wrocław)
    Mercalli, Warp

    It would be better to apply less stabilization; you can see how it turned out.
    Additionally, at 60fps the image would be sharper and easier to stabilize.
    Image Attached Files
    Last edited by rgr; 23rd Mar 2026 at 17:17.
  17. @rgr: thanks
  18. I'm not interested in paid stuff. I'm now doing (or will in the next days) some tests with different free stabilizers.

    Originally Posted by Selur View Post
    StabilizeIT does not offer automatic cropping.
    Like I did in my example, you will have to crop manually.

    side note: There are autocrop functions for both AviSynth and VapourSynth, but afaik those will not help. Their goal usually is, to remove black bars only as far as possible while not to remove active content.

    Cu Selur
    Ah ok, yeah, then I guess I can't use it; too much time wasted. But thanks for your help.
  19. Have you tried Deshaker? The only drawback, if I remember correctly, is that it requires a (lossy) colorspace conversion, since it accepts only RGB input (that could have changed since, but it's unlikely). Since most NLEs work in RGB internally, that's not too much of an issue if you proceed as follows: use a simple Avisynth script to do the YUV => RGB conversion, process the clips (frameserved from Avisynth) with Deshaker, export them in RGB as lossless intermediate clips (years ago the most recommended option for editing purposes was UT Video), do the editing, then render in YUV as the intended delivery format, most likely HEVC. (Or render as lossless YUV and then encode with x265 to have better control over the outcome of the compression – years ago the consensus was that x264 performed better than the MainConcept AVC encoder featured in most NLEs; I don't know whether that's still true today with x265 versus whatever prominent NLEs use to encode HEVC.) But lossless intermediate clips in 4K at 120 fps are going to be huge...
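    To put a rough number on "huge": uncompressed 8-bit 4:2:0 video stores 12 bits per pixel on average, so the raw data rate at 3840x2160 and 120 fps works out as follows (a back-of-the-envelope figure, before any lossless-compression gains):
    Code:
    ```python
    def raw_rate_bytes_per_s(width, height, fps, bits_per_pixel=12):
        # 8-bit YUV 4:2:0 averages 12 bits (1.5 bytes) per pixel
        return width * height * bits_per_pixel / 8 * fps

    rate = raw_rate_bytes_per_s(3840, 2160, 120)
    print(f"{rate / 1e9:.2f} GB/s, about {rate * 60 / 1e9:.0f} GB per minute")
    # -> 1.49 GB/s, about 90 GB per minute
    ```
    Even assuming a lossless codec like UT Video reaches roughly 2:1, that still leaves on the order of 45 GB per minute of footage.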

    Side question:
    Those videos are playing with very jerky motion (jerky as in: the opposite of smooth, as in I can barely see a vague image of a horse during the 5 seconds it lasts) on my computer, based on an Intel i7 6700K CPU, with no dedicated GPU, using VLC Media Player (I get CPU usage of ~55% during playback). Is that normal? If so, what would be the minimal requirements to play that kind of video smoothly?
    When I assembled that computer I thought that I wouldn't have to worry about performance issues for a long, long time (before that I used to settle for mid-range components, praised for their bang-for-the-buck factor, like the Athlon XP1600, then the Pentium E5200), and yet I started experiencing some hiccups not too long afterwards... and now it's struggling to merely play a video properly... (I normally can play 4K HEVC smoothly, but I had never tried at 120 FPS.)
  20. I tried it. I compared it to other stabilization software at the same crop level, and it was far shakier than the others. That's why I don't use it.

    About the player: do you mean after stabilizing, or in general? In VLC Media Player it works fine for me; in MPC-HC it lags heavily. I guess MPC-HC does more post-processing, or VLC does not really show all frames on my side, I don't know. But the video is 4K 120p, so I guess it is "normal" that you need a fairly good CPU. In any case, your iGPU is quite old, so maybe that's the reason; they weren't made for 4K 120 fps back then. You can be glad if you can watch 4K60 on one of those.
  21. Yes, I meant just playing those files as-is (with no stabilization); I can't get smooth playback. And yeah, generally speaking, perhaps relying on the integrated GPU for both high-resolution and high-framerate footage is pushing it (even more so with one that's about 10 years old).

    As for Deshaker, not sure why you got a poor result. Perhaps it's not optimized for high resolution footage. Perhaps changing a few settings would produce a much improved outcome, but it's been a long time since I've used it, so I couldn't recommend specific tweaks.
    Perhaps you'll get more in-depth guidance on that topic at Doom9 (don't know if it's still as active as it used to be).
  22. Member (joined Aug 2018, Wrocław)
    Originally Posted by Platos View Post
    Im not interested in paid-stuff. I'm doing now (or the next days) some test with different free stabilizer
    The biggest problem with stabilization is rolling shutter compensation (Google). Very few stabilizers have this, and even fewer do it well.
  23. Originally Posted by rgr View Post
    Originally Posted by Platos View Post
    Im not interested in paid-stuff. I'm doing now (or the next days) some test with different free stabilizer
    The biggest problem with stabilization is rolling shutter compensation (Google). Very few stabilizers have this, and even fewer do it well.
    Ah ok, and which free software can do that (and which of them do it well)? Or another question: which methods of rolling-shutter correction are good and which are bad? Are there actually different methods to fix rolling shutter?
  24. Member (joined Aug 2018, Wrocław)
    Free? I don't know of any (besides Deshaker for VirtualDub). If you have a GoPro, you can get Hyper-something from the Windows Store.
    You can try enabling it in Deshaker; it should be better.
  25. Ok, I will try, but I see the option there is given in percent (%) and not in milliseconds. How should I understand that? Normally I have to work with milliseconds, for example 14.2 milliseconds of rolling shutter. I don't understand the percentage value.

    I'm trying to build my own program with it. Maybe it works, and then I can compare it to Deshaker 3.1 in VirtualDub2.
    Last edited by Platos; 25th Mar 2026 at 14:04.
  26. Member (joined Aug 2018, Wrocław)
    This is what it looks like in Deshaker.
    I thought these artifacts were the result of stabilization errors, but they're already present in the video.
    If you want to record shots like this and stabilize them, buy a real camera. Even a compact one would be fine, just not a smartphone, whose "AI" causes this.
    Besides, smartphones record in VFR, which makes editing even more difficult (dropped/duplicate frames, sync issues).


    Calculate using the following formula:
    (Sensor readout time in ms) / (1000 / fps) * 100 (%)
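    As a worked example of the formula above, using the 14.2 ms readout figure mentioned in the previous post (the fps values are just illustrations):
    Code:
    ```python
    def readout_percent(readout_ms, fps):
        # sensor readout time expressed as a percentage of one frame interval
        frame_interval_ms = 1000.0 / fps
        return readout_ms / frame_interval_ms * 100.0

    print(round(readout_percent(14.2, 60), 1))   # 14.2 ms at 60 fps  -> 85.2
    print(round(readout_percent(14.2, 120), 1))  # 14.2 ms at 120 fps -> 170.4
    ```
    A value above 100% would mean the readout takes longer than one frame interval, which is impossible for consecutive frames, so a 14.2 ms readout is only plausible at lower frame rates.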
    Image Attached Files
  27. The problem is not the smartphone or AI or whatever. It is just a worst-case video: I used a 16x scope and did not hold it steady, which is why it is so shaky. In Pro mode there is no AI. Maybe I did not use it on this shot; maybe I forgot to use Pro mode, that could be possible, but it should not actually be the case.

    About rolling shutter: the question would be how I can determine the sensor readout time.

    And no, I am not buying a "real" camera.

    Maybe I'll upload another video that is not such a worst case.

    Edit: I uploaded a video stabilized with VidStabGui (11 frames, the optimal static zoom value, and "slower" for encoding; the rest are default settings). Can anyone get a better result with free software (at the same crop level)?
    Image Attached Files
    Last edited by Platos; 26th Mar 2026 at 12:18.
  28. Member (joined Aug 2018, Wrocław)
    This is a problem, as part of the frame has errors typical of poor processing. The readout time should be in the sensor's specifications, it can be measured using image analysis (relatively accurately), or it has to be guessed. You can always buy a Sony Alpha III-9 and have this problem solved.
  29. I'm not here to be told "buy xyz, it's better".

    And it really has nothing to do with bad processing (in the second video). The second video does not have these flickers; it has motion blur in many frames, because I wiggled a lot (with a 16x scope it is not easy to hold steady).

    Do you have a better result for the video I posted before? Maybe the first video is just too bad a case. Maybe you are right and in the first video I forgot to use Pro mode or so, because normally it does not produce this flicker.


