Using:

Code:
clip = vsrvrt.Deblur(clip, device="cuda", preview_mode=True)

Vapoursynth Preview works fine here.
During encoding, VRAM usage was at 10.5GB and it ran at 1.42 fps.
Encode stats from the script Hybrid created:

Code:
encoded 149 frames, 1.42 fps, 8498.35 kbps, 5.02 MB
encode time 0:01:44, CPU: 0.1, GPU: 51.7, GPUClock: 2775MHz, VEClock: 2160MHz
frame type IDR 1
frame type I 1, total size 0.08 MB
frame type P 37, total size 0.00 MB
frame type B 111, total size 4.95 MB
So it's not a general problem.

Code:
# Imports
import vapoursynth as vs
# getting Vapoursynth core
import sys
import os
core = vs.core
# Limit frame cache to 48449MB
core.max_cache_size = 48449
# Import scripts folder
scriptPath = 'F:/Hybrid/64bit/vsscripts'
sys.path.insert(0, os.path.abspath(scriptPath))
# loading plugins
core.std.LoadPlugin(path="F:/Hybrid/64bit/vsfilters/SourceFilter/DGDecNV/DGDecodeNV_AVX2.dll")
# Import scripts
import validate
# Source: 'C:\Users\Selur\Desktop\VID_20251026_113323.mp4'
# Current color space: YUV420P8, bit depth: 8, resolution: 1920x1080, frame rate: 30.048fps, scanorder: progressive, yuv luminance scale: limited, matrix: 709, transfer: bt.709, primaries: bt.709, format: HEVC
# Loading 'C:\Users\Selur\Desktop\VID_20251026_113323.mp4' using DGSource
clip = core.dgdecodenv.DGSource("J:/tmp/mp4_6694054a41f976fc6e415da2ab1c7fab_853323747.dgi") # 30.048 fps, scanorder: progressive
frame = clip.get_frame(0)
# setting color matrix to 709.
clip = core.std.SetFrameProps(clip, _Matrix=vs.MATRIX_BT709)
# setting color transfer (vs.TRANSFER_BT709), if it is not set.
if validate.transferIsInvalid(clip):
    clip = core.std.SetFrameProps(clip=clip, _Transfer=vs.TRANSFER_BT709)
# setting color primaries info (to vs.PRIMARIES_BT709), if it is not set.
if validate.primariesIsInvalid(clip):
    clip = core.std.SetFrameProps(clip=clip, _Primaries=vs.PRIMARIES_BT709)
# setting color range to TV (limited) range.
clip = core.std.SetFrameProps(clip=clip, _ColorRange=vs.RANGE_LIMITED)
# making sure frame rate is set to 30.048fps
clip = core.std.AssumeFPS(clip=clip, fpsnum=3756, fpsden=125)
# making sure the detected scan type is set (detected: progressive)
clip = core.std.SetFrameProps(clip=clip, _FieldBased=vs.FIELD_PROGRESSIVE) # scan type: progressive
# adjusting color space from YUV420P8 to RGB24 for vsRVRTFilter
clip = core.resize.Bicubic(clip=clip, format=vs.RGB24, matrix_in_s="709", range_in_s="limited", range_s="full")
# Debluring using RVRT
import vsrvrt
clip = vsrvrt.Deblur(clip, device="cuda", preview_mode=True)
# adjusting output color from RGB24 to YUV420P10 for NVEncModel
clip = core.resize.Bicubic(clip=clip, format=vs.YUV420P10, matrix_s="709", range_in_s="full", range_s="limited", dither_type="error_diffusion") # additional resize to match target color sampling
# output
clip.set_output()
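Side note on the odd-looking AssumeFPS arguments: fpsnum=3756, fpsden=125 is simply 30.048 fps written as a reduced fraction. A quick standalone check with Python's fractions module (illustrative snippet, not part of the Hybrid script):

```python
from fractions import Fraction

# 30.048 fps as an exact rational: 30048/1000, reduced to lowest terms
fps = Fraction(30048, 1000)
print(fps)         # 3756/125 -- the fpsnum/fpsden pair Hybrid emits
print(float(fps))  # 30.048   -- back to the decimal frame rate
```

Using an exact rational instead of a rounded float keeps frame timestamps drift-free over long clips.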
Cu Selur
Ps.: going to bed now, n8.
users currently on my ignore list: deadrats, Stears555, marcorocchini
@selur: I could now try the newest dev stuff from yesterday, and it still did not work, same issue as posted above. The VapourSynth code is below.
But: thanks for your video. I could compare frame by frame, and sadly RVRT is AI-shit compared to the original camera footage; it did not look good in my opinion. The denoise video I saw on Discord did give a good result (I have to try that another time), but the deblur is something I do not like. I prefer blurry over the "AI-sharp" look: the blurry frames did not look good after deblurring, and the actually sharp frames looked even worse afterwards. But that's my opinion.

Code:
# Imports
import vapoursynth as vs
# getting Vapoursynth core
import sys
import os
core = vs.core
# Import scripts folder
scriptPath = 'C:/Program Files/Hybrid/64bit/vsscripts'
sys.path.insert(0, os.path.abspath(scriptPath))
# loading plugins
core.std.LoadPlugin(path="C:/Program Files/Hybrid/64bit/vsfilters/SourceFilter/LSmashSource/LSMASHSource.dll")
# Import scripts
import validate
# Source: 'C:\Users\Gaming-Tower\Pictures\LocalSend\Stabilisation-Tests\Verschiedene-Stabilizer-Test\Video_3\VID_20251026_113323_VidStabGui-11frames_5degree.mp4'
# Current color space: YUV420P8, bit depth: 8, resolution: 1920x1080, frame rate: 30fps, scanorder: progressive, yuv luminance scale: limited, matrix: 709, transfer: bt.709, primaries: bt.709, format: AVC
# Loading 'C:\Users\Gaming-Tower\Pictures\LocalSend\Stabilisation-Tests\Verschiedene-Stabilizer-Test\Video_3\VID_20251026_113323_VidStabGui-11frames_5degree.mp4' using LibavSMASHSource
clip = core.lsmas.LibavSMASHSource("C:/Users/Gaming-Tower/Pictures/LocalSend/Stabilisation-Tests/Verschiedene-Stabilizer-Test/Video_3/VID_20251026_113323_VidStabGui-11frames_5degree.mp4")
frame = clip.get_frame(0)
# setting color matrix to 709.
clip = core.std.SetFrameProps(clip, _Matrix=vs.MATRIX_BT709)
# setting color transfer (vs.TRANSFER_BT709), if it is not set.
if validate.transferIsInvalid(clip):
    clip = core.std.SetFrameProps(clip=clip, _Transfer=vs.TRANSFER_BT709)
# setting color primaries info (to vs.PRIMARIES_BT709), if it is not set.
if validate.primariesIsInvalid(clip):
    clip = core.std.SetFrameProps(clip=clip, _Primaries=vs.PRIMARIES_BT709)
# setting color range to TV (limited) range.
clip = core.std.SetFrameProps(clip=clip, _ColorRange=vs.RANGE_LIMITED)
# making sure frame rate is set to 30fps
clip = core.std.AssumeFPS(clip=clip, fpsnum=30, fpsden=1)
# making sure the detected scan type is set (detected: progressive)
clip = core.std.SetFrameProps(clip=clip, _FieldBased=vs.FIELD_PROGRESSIVE) # scan type: progressive
# adjusting color space from YUV420P8 to RGB24 for vsRVRTFilter
clip = core.resize.Bicubic(clip=clip, format=vs.RGB24, matrix_in_s="709", range_in_s="limited", range_s="full")
# Debluring using RVRT
import vsrvrt
clip = vsrvrt.Deblur(clip, device="cuda", preview_mode=True)
# adjusting output color from RGB24 to YUV420P10 for x265Model
clip = core.resize.Bicubic(clip=clip, format=vs.YUV420P10, matrix_s="709", range_in_s="full", range_s="limited", dither_type="error_diffusion") # additional resize to match target color sampling
# set output frame rate to 30fps (progressive)
clip = core.std.AssumeFPS(clip=clip, fpsnum=30, fpsden=1)
# output
clip.set_output()
# script was created by Hybrid 2026.03.27.1
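For reference, the range_in_s="limited", range_s="full" arguments in the script above make the resizer expand TV-range luma (16-235 in 8 bit) to full range (0-255) before the RGB model sees the frames. The 8-bit luma mapping is a plain linear rescale; a small helper just to illustrate the math (hypothetical, not part of Hybrid or vsrvrt):

```python
def limited_to_full_luma(y: int) -> int:
    """Expand 8-bit limited-range luma (16..235) to full range (0..255)."""
    # 16 maps to 0, 235 maps to 255; 219 is the limited-range luma span
    return round((y - 16) * 255 / 219)

print(limited_to_full_luma(16))   # 0   (limited black -> full black)
print(limited_to_full_luma(235))  # 255 (limited white -> full white)
```

The reverse mapping (range_in_s="full", range_s="limited") is applied on the way back to YUV420P10 for the encoder.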
These papers on GitHub always use such low-resolution examples that you can't see any artifacts.
(I thought I could fix motion blur from camera shaking. After stabilizing, some frames are blurry because of the motion blur from the shaky handheld footage. Do you know something to fix that, or at least improve it, without making it look AI-processed? I think I will open a question in another thread.)
GRLIR and BasicVSR++ both have deblur modes, but you probably won't be happy with them either. The only other solutions that come to mind are diffusion-based ones like SeedVR2 or Topaz Starlight. (I don't know about Starlight, but SeedVR2 is really resource hungry, so if RVRT doesn't work on your system, they might not work either.)
Usually when stuff doesn't work and one gets strange errors, it's down to VRAM or drivers.
Does RVRT work if you call it on SD resolution content? (just to know whether it's a general problem)
Cu Selur
Ok, I resized the video with Hybrid to 480x270 pixels (and then used that video in a second encode), and it still crashed. If it uses 10GB VRAM at 1080p, it should not run out of VRAM at this resolution on an RTX 5080 with 16GB. It is still the same error:
And here is the error:

Code:
2026-03-28 14:48:23.731 C:\Program Files\Hybrid\64bit\Vapoursynth\Lib\site-packages\torch\functional.py:554: UserWarning: torch.meshgrid: in an upcoming release, it will be required to pass the indexing argument. (Triggered internally at C:\actions-runner\_work\pytorch\pytorch\pytorch\aten\src\ATen\native\TensorShape.cpp:4316.)
  return _VF.meshgrid(tensors, **kwargs)  # type: ignore[attr-defined]
2026-03-28 14:48:23.806 Failed to evaluate the script:
Python exception: Ran out of input
Traceback (most recent call last):
  File "src/cython/vapoursynth.pyx", line 3393, in vapoursynth._vpy_evaluate
  File "src/cython/vapoursynth.pyx", line 3394, in vapoursynth._vpy_evaluate
  File "C:\Users\Gaming-Tower\Documents\Hybrid\Tempfolder\tempPreviewVapoursynthFile14_48_21_643.vpy", line 38, in <module>
    clip = vsrvrt.Deblur(clip, device="cuda", preview_mode=True)
  File "C:\Program Files\Hybrid\64bit\Vapoursynth\Lib\site-packages\vsrvrt\rvrt_filter.py", line 857, in Deblur
    return _create_filter_wrapper(
  File "C:\Program Files\Hybrid\64bit\Vapoursynth\Lib\site-packages\vsrvrt\rvrt_filter.py", line 614, in _create_filter_wrapper
    return _create_filter_wrapper_preview(
  File "C:\Program Files\Hybrid\64bit\Vapoursynth\Lib\site-packages\vsrvrt\rvrt_filter.py", line 429, in _create_filter_wrapper_preview
    inference = RVRTInference(config, use_fp16=use_fp16, device=device_obj)
  File "C:\Program Files\Hybrid\64bit\Vapoursynth\Lib\site-packages\vsrvrt\rvrt_core.py", line 53, in __init__
    self.model = self._load_model()
  File "C:\Program Files\Hybrid\64bit\Vapoursynth\Lib\site-packages\vsrvrt\rvrt_core.py", line 143, in _load_model
    checkpoint = torch.load(model_path, map_location="cpu", weights_only=False)
  File "C:\Program Files\Hybrid\64bit\Vapoursynth\Lib\site-packages\torch\serialization.py", line 1549, in load
    return _legacy_load(
  File "C:\Program Files\Hybrid\64bit\Vapoursynth\Lib\site-packages\torch\serialization.py", line 1797, in _legacy_load
    magic_number = pickle_module.load(f, **pickle_load_args)
EOFError: Ran out of input
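Worth noting: "EOFError: Ran out of input" comes from pickle, not from CUDA or VRAM. torch.load hit the end of the checkpoint file before reading even the pickle magic number, which is what happens when a .pth file is zero bytes or only partially downloaded, so re-downloading the RVRT model weights may be the actual fix here (assuming the file really is truncated). A minimal illustration with the stdlib pickle module (file name made up):

```python
import os
import pickle
import tempfile

# Simulate a truncated/empty checkpoint file
path = os.path.join(tempfile.mkdtemp(), "broken_model.pth")
open(path, "wb").close()  # zero-byte file

try:
    with open(path, "rb") as f:
        pickle.load(f)  # torch's legacy loader starts with a pickle read like this
except EOFError as e:
    print(e)  # Ran out of input
```

Checking the size of the RVRT .pth file on disk against the expected download size would confirm or rule this out.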
And I also uploaded the error picture. Do you also need a debug? The Hybrid GUI says Hybrid 2026.03.27.1, and yesterday I downloaded the VapourSynth and vsmlrt experimental versions from your server and installed everything (overwrite-copy).

Code:
# Imports
import vapoursynth as vs
# getting Vapoursynth core
import sys
import os
core = vs.core
# Import scripts folder
scriptPath = 'C:/Program Files/Hybrid/64bit/vsscripts'
sys.path.insert(0, os.path.abspath(scriptPath))
# loading plugins
core.std.LoadPlugin(path="C:/Program Files/Hybrid/64bit/vsfilters/SourceFilter/LSmashSource/LSMASHSource.dll")
# Import scripts
import validate
# Source: 'C:\Users\Gaming-Tower\Pictures\LocalSend\Stabilisation-Tests\Verschiedene-Stabilizer-Test\Video_3\Eigene\VID_20251026_113323_resizetest480.mkv'
# Current color space: YUV420P10, bit depth: 10, resolution: 480x270, frame rate: 30.046fps, scanorder: progressive, yuv luminance scale: limited, matrix: 709, transfer: bt.709, primaries: bt.709, format: HEVC
# Loading 'C:\Users\Gaming-Tower\Pictures\LocalSend\Stabilisation-Tests\Verschiedene-Stabilizer-Test\Video_3\Eigene\VID_20251026_113323_resizetest480.mkv' using LWLibavSource
clip = core.lsmas.LWLibavSource(source="C:/Users/Gaming-Tower/Pictures/LocalSend/Stabilisation-Tests/Verschiedene-Stabilizer-Test/Video_3/Eigene/VID_20251026_113323_resizetest480.mkv", format="YUV420P10", stream_index=0, cache=0, repeat=True, prefer_hw=0)
frame = clip.get_frame(0)
# setting color matrix to 709.
clip = core.std.SetFrameProps(clip, _Matrix=vs.MATRIX_BT709)
# setting color transfer (vs.TRANSFER_BT709), if it is not set.
if validate.transferIsInvalid(clip):
    clip = core.std.SetFrameProps(clip=clip, _Transfer=vs.TRANSFER_BT709)
# setting color primaries info (to vs.PRIMARIES_BT709), if it is not set.
if validate.primariesIsInvalid(clip):
    clip = core.std.SetFrameProps(clip=clip, _Primaries=vs.PRIMARIES_BT709)
# setting color range to TV (limited) range.
clip = core.std.SetFrameProps(clip=clip, _ColorRange=vs.RANGE_LIMITED)
# making sure frame rate is set to 30.046fps
clip = core.std.AssumeFPS(clip=clip, fpsnum=15023, fpsden=500)
# making sure the detected scan type is set (detected: progressive)
clip = core.std.SetFrameProps(clip=clip, _FieldBased=vs.FIELD_PROGRESSIVE) # scan type: progressive
clip = core.std.AddBorders(clip=clip, left=0, right=0, top=0, bottom=2) # add borders to achieve mod 8 (vsRVRTFilter) - 480x272
# adjusting color space from YUV420P10 to RGBS for vsRVRTFilter
clip = core.resize.Bicubic(clip=clip, format=vs.RGBS, matrix_in_s="709", range_in_s="limited", range_s="full", dither_type="error_diffusion")
# Debluring using RVRT
import vsrvrt
clip = vsrvrt.Deblur(clip, device="cuda", preview_mode=True)
clip = core.std.Crop(clip=clip, left=0, right=0, top=0, bottom=2) # removing added borders from mod requirement (vsRVRTFilter) - 480x270
# adjusting output color from RGBS to YUV420P10 for x265Model
clip = core.resize.Bicubic(clip=clip, format=vs.YUV420P10, matrix_s="709", range_in_s="full", range_s="limited") # additional resize to match target color sampling
# output
clip.set_output()
# script was created by Hybrid 2026.03.27.1
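The AddBorders/Crop pair in the script above exists because RVRT wants mod-8 frame dimensions: 270 is not divisible by 8, so Hybrid pads the height to 272 and crops the two rows back off afterwards. The padding amount can be computed like this (hypothetical helper, not Hybrid code):

```python
def mod_padding(size: int, mod: int = 8) -> int:
    """Extra pixels needed to reach the next multiple of `mod` (0 if already aligned)."""
    return (-size) % mod

print(mod_padding(270))   # 2 -> 270 + 2 = 272, hence bottom=2 in AddBorders/Crop
print(mod_padding(1080))  # 0 -> 1080 already satisfies mod 8, so no borders at 1080p
```

This also explains why the 1080p scripts earlier in the thread have no AddBorders step.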