-
sorry, I just typed it, without checking
Code:
rgb = core.resize.Bicubic(clip, matrix_in_s='709', format=vs.RGB24, range_in_s='limited')
The '_s' at the end means the argument value is a string; the matrix, transfer and color-primaries string values you can get from the VapourSynth resize docs.
Without the '_s', the argument is an integer, from the specs, around page 405.
'_in' is for the input; without it, the argument refers to the output. -
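(To make the string-versus-integer point concrete, here is a small plain-Python illustration. The integer codes are the ITU-T H.273 values the spec text refers to, and the string names are the ones the resize docs accept; treat the exact table as an assumption and check the docs.)

```python
# Hypothetical illustration: the string names ('_s' arguments) map to
# H.273 integer codes (the non-'_s' arguments). Partial table only.
MATRIX_CODES = {
    "rgb": 0,      # identity
    "709": 1,      # BT.709
    "470bg": 5,    # BT.601 (PAL)
    "170m": 6,     # BT.601 (NTSC / SMPTE 170M)
    "2020ncl": 9,  # BT.2020 non-constant luminance
}

# So these two calls would be equivalent:
#   core.resize.Bicubic(clip, matrix_in_s='709', ...)
#   core.resize.Bicubic(clip, matrix_in=1, ...)
print(MATRIX_CODES["709"])  # 1
```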
sorry, I just typed it, without checking
When I post code, unless I'm demonstrating a problem, it's been tested and works, guaranteed.
Code:
>>>
Traceback (most recent call last):
  File "C:\Archive\Python\test2.py", line 1, in <module>
    from vapoursynth import core
ImportError: No module named 'vapoursynth'
>>>
LOL easy, eh? -
There are always problems; you should know that. Read the manual on how to install everything: http://www.vapoursynth.com/doc/gettingstarted.html Take it slow.
-
If we're going to wind up with ffmpeg doing the actual video manipulation, why are we even bothering with Python and Vapoursynth? Why not just write an ffmpeg script?
You have to have patience in dealing with ffmpeg's quirks (to put it politely) and there is a lot of back-and-forth to get it just right, but it's doable and will handle the audio automatically.
Again, I am trying to conform to a standard designed for YRGB (EBU r 103) but a YUV deliverable is expected. Not easy. -
I thought this topic was about creating some GUI for graphics and manipulations? So the values in the GUI (sliders, dials, whatever) would correspond to (be entered into) the vapoursynth code and then be piped to ffmpeg or something else
You can use whatever you want to do the actual video manipulation, and in any permutation. E.g. you can use vapoursynth to perform the manipulations and ffmpeg to write the output. Or even something else to write and skip ffmpeg (the binary) entirely, or do parts in some and parts in others.
Some of the filters actually share the same code with avisynth. Many are ported from avisynth, because many of the filters were "born" there first.
There are ways to get things into other programs, even ones like Premiere, Resolve, etc., through virtual files and avfs. Basically it enables you to link scripts to other software without encoding large intermediate files. It's like piping between programs that usually don't accept pipes.
You have to have patience in dealing with ffmpeg's quirks (to put it politely) and there is a lot of back-and-forth to get it just right, but it's doable and will handle the audio automatically. -
Once again, Vapoursynth, avisynth and Python use ffmpeg to manipulate the video, do they not?
-
Visual Studio -> Visual Basic, C, C++, C#. Link to Linux? Compiles to an EXE.
GUI creation:
I can't arrange or design a shower curtain to look nice, so the layout sucks, I know. But it's a GUI, it calls any external program you want to use without actually compiling the external utility, and it was created with Visual Basic.
[Attachment 47398] -
My original idea is crashing and burning.
On my big computer at home it works OK except for that pesky flicker that makes the output video unusable.
I'm on the road now and my little tablet is having trouble setting up pipes and thus cannot read pixels out of a video file. Both the tablet and my home computer run Windows 10 so you would think if it runs on one it would run on the other, but no such luck.
It would have been nice to be able to adjust video levels interactively, and that part works well on my home computer until it comes time to export it. The interactive part is nice and responsive considering the large amount of computation it has to do per frame. But then we pick up that flicker when exporting the video that I can't get rid of, despite trying all sorts of fixes and workarounds.
It might be possible to use the interactive portion to set values to plug into a generic ffmpeg script. As I am manipulating pixel values directly in my program, one wonders if those numbers would give the same results when plugged into an ffmpeg script.
The scope part of my program works well. It reports the maximum and minimum values for Y, R, G and B so the user can tell if his video is in compliance. It also hard clips at the r 103 limits. Yes, I know there might be recoverable detail above 235, a lot of which is ringing artifacts. The user can either reduce the gain to bring this detail under 236 or use the camera iris or control the subject matter better. -
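(As an aside, the scope-and-clip idea described above can be sketched in a few lines of plain Python. The 16..235 limits below are an assumed stand-in for the real EBU R 103 tolerance band, and `scope_and_clip` is a made-up name, not anyone's actual program.)

```python
# Sketch: report per-channel min/max, then hard-clip to assumed legal
# limits. 16..235 is a placeholder; check EBU R 103 for the real band.
LO, HI = 16, 235

def scope_and_clip(pixels):
    """pixels: list of (r, g, b) tuples of 8-bit values."""
    for name, chan in zip("RGB", zip(*pixels)):
        print(f"{name}: min={min(chan)} max={max(chan)}")
    return [tuple(min(max(c, LO), HI) for c in px) for px in pixels]

print(scope_and_clip([(10, 128, 250), (16, 235, 100)]))
# -> [(16, 128, 235), (16, 235, 100)]
```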
Can your Visual Basic program open pipes and communicate with ffmpeg that way, hopefully without flicker?
Here again is how Ted Burke does it in C under Linux:
https://batchloaf.wordpress.com/2017/02/12/a-simple-way-to-read-and-write-audio-and-vi...-part-2-video/ -
Not necessarily.
If you are using vapoursynth filters or avisynth filters, those manipulations are done in vapoursynth or avisynth respectively. That altered data is then sent somewhere through a pipe, if you want, to ffmpeg.exe, or something else like x264.exe. You need some way to write out the data, encoding it into a physical video (usually).
Some filters use ffmpeg libraries. E.g. some input filters can use ffmpeg to decode the video to uncompressed data. But ffmpeg.exe doesn't have to be used. -
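(For what it's worth, the "script filters, then pipe to a writer" model described above usually looks like the sketch below. The command names and flags are assumptions that depend on your vspipe/x264 versions; the line is built as data here, not executed.)

```python
# Sketch of the pipeline model: vspipe renders the .vpy script to
# uncompressed y4m on stdout, and the encoder reads it on stdin.
# Flag spellings are assumptions; check your tool versions.
pipe_cmd = ["vspipe", "--y4m", "script.vpy", "-"]                # producer
writer_cmd = ["x264", "--demuxer", "y4m", "-o", "out.264", "-"]  # consumer

# The equivalent shell line (built as a string, not executed here):
shell_line = " ".join(pipe_cmd) + " | " + " ".join(writer_cmd)
print(shell_line)
```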
But you're checking this on decoded YUV, converted to RGB as an intermediate step, right? Not on the final submission?
An additional problem that you probably don't want to hear about is when you convert to your final YUV 4:2:2 submission format: you will generate non-r103-friendly values on certain types of material even though your scopes look OK, your min/max values are OK, and you "think" it's compliant. Certain test patterns. Text. Graphics. Broadcast overlays. The subsampling step creates problems, and it also depends on which upsampling algorithm the validation method uses to check the RGB. -
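(The subsampling problem is easy to demonstrate with nothing but the BT.709 matrix in plain Python: average the chroma of two perfectly legal, saturated neighbours, the way a 4:2:2 downsample does, and the reconstructed RGB leaves the legal range.)

```python
# Demonstration (pure Python, BT.709 maths): sharing one chroma sample
# between a red and a green pixel, as 4:2:2 subsampling does, yields a
# reconstructed blue component below 0, i.e. out of gamut.
KR, KB = 0.2126, 0.0722  # BT.709 luma coefficients

def rgb_to_ypbpr(r, g, b):
    y = KR * r + (1 - KR - KB) * g + KB * b
    return y, (b - y) / (2 * (1 - KB)), (r - y) / (2 * (1 - KR))

def ypbpr_to_rgb(y, pb, pr):
    r = y + 2 * (1 - KR) * pr
    b = y + 2 * (1 - KB) * pb
    g = (y - KR * r - KB * b) / (1 - KR - KB)
    return r, g, b

red, green = rgb_to_ypbpr(1, 0, 0), rgb_to_ypbpr(0, 1, 0)
pb = (red[1] + green[1]) / 2   # 4:2:2: two pixels share one Cb sample
pr = (red[2] + green[2]) / 2   # ...and one Cr sample
r, g, b = ypbpr_to_rgb(red[0], pb, pr)
print(round(b, 3))  # negative = below the legal RGB range
```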
The final 4:2:2 submission has to be checked. It has to be decoded to RGB to check for r 103 compliance. That's part of the user's workflow. No further processing is done after the final check.
-
I think what you are trying to say is that they use ffmpeg to read and write the video data, same as I and Ted Burke are doing.
Why are people pushing vapoursynth and Python on me? What is the big benefit of using them instead of ffmpeg and its filters or my own code? Cripes, I can't even get a functional example in "LOL easy" Python. All they're doing is muddying the waters. -
No. You can use ffmpeg if you want to, but it's not necessary to use ffmpeg for anything when using avisynth or vapoursynth.
avisynth/vapoursynth is just used as a framework for the intermediate steps. You can use it for the video manipulations. Since vapoursynth is Python based, it lends itself to many possibilities for your graphics and GUI. The existing GUIs are somewhat lacking, but maybe you can contribute and help develop one.
Why are people pushing vapoursynth and Python on me? What is the big benefit of using them instead of ffmpeg and its filters or my own code? Cripes, I can't even get a functional example in "LOL easy" Python. All they're doing is muddying the waters.
The benefit is more flexibility and more filters available than ffmpeg or ffmpeg's filters. (E.g. ffmpeg doesn't even have a simple levels filter, for some reason.) You can do anything ffmpeg can, plus a lot more. It's easier to filter sections and edit (you can specify different settings over different ranges of the video).
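(A levels filter is just a remap; vapoursynth's std.Levels exposes roughly these parameters. Below is the underlying math in plain Python as a sketch, not vapoursynth's actual implementation.)

```python
# Sketch of levels math: map [min_in..max_in] to [min_out..max_out]
# with an optional gamma, clamping outside the input range. An
# illustration only, not vapoursynth's actual std.Levels code.
def levels(x, min_in=16, max_in=235, gamma=1.0, min_out=16, max_out=235):
    t = (x - min_in) / (max_in - min_in)
    t = max(0.0, min(1.0, t)) ** (1.0 / gamma)
    return round(min_out + t * (max_out - min_out))

print(levels(128))                        # identity with the defaults
print(levels(255, min_in=0, max_in=255))  # full range squeezed to 235
```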
You were looking for something that works, without flicker. I don't know why you were only getting some frames processed, but these are workflows proven to work. Even ffmpeg alone can sometimes have issues. In avs/vpy you can index source videos, which provides a higher level of frame accuracy, especially for seeking. With long-GOP video (I, B, P frames, not intra-frame encoding), seeking can cause frame mismatches and wrong frames out of order. That might be partially contributing to your flicker issue.
If vapoursynth doesn't meet your needs, just carry on with what you're doing and try to debug it. There has to be a reason why it's not working.
The final 4:2:2 submission has to be checked. It has to be decoded to RGB to check for r 103 compliance. That's part of the user's workflow. No further processing is done after the final check.
Last edited by poisondeathray; 7th Dec 2018 at 09:41.
-
So, ruling out APIs. Sticking with: load video, something simple like set sliders, press run to run a subprocess, get a raw image, put it on screen. But you need a filter for that, or a custom-made ffmpeg filter, like -filter_complex on the command line with your math. Is it doable? Ask direct questions like that on the ffmpeg forums or stackoverflow.com, reveal your math, and you might get lucky like that guy.
What about a different setup, for example using opencv to fetch and fix values, with the math done in your language, handling the pixel values yourself? That is, if you insist that Vapoursynth is not for you and too difficult. Is there an opencv module and code to handle opencv in pure Basic? You do not have to use opencv as a GUI, just for the RGB pixel handling.
I found it here; it has 54 pages, so perhaps it is doable.
Last edited by _Al_; 7th Dec 2018 at 13:39.
-
I downloaded and installed vapoursynth and Python 3.7.1. This code:
Code:
from vapoursynth import core
import vapoursynth as vs

file = r'C:\Archive\Python\C0058.mp4'
clip = core.lsmas.LWLibavSource(file)

# checking what color space the file is, if you do not know
if clip.format.color_family == vs.ColorFamily.YUV:
    space = 'YUV'  # there is also RGB, GRAY, YCOCG and COMPAT

value = 10
Y_expr = 'x {value} -'.format(value=value)
U_expr = ''
V_expr = ''
clip = core.std.Expr(clip, expr=[Y_expr, U_expr, V_expr])

# clip is YUV, so you need to get it on screen as RGB 8 bit, assuming the input is BT.709:
if space == 'YUV':
    rgb = core.resize.Bicubic(clip, matrix_in_s='709', format=vs.RGB24, range_in_s='limited')

# to get a frame from your rgb, for example the first frame (0; Python indexes from 0), but it could be any frame:
rgbFrame = rgb.get_frame(0)

# now you have an RGB24 image that most GUIs can put on screen; you just need to know
# what array layout the GUI expects, so from here on it is GUI specific

# to get it out through vspipe.exe (encoding, piping it somewhere) you specify an output:
clip.set_output()
==================== RESTART: C:\Archive\Python\test2.py ====================
Traceback (most recent call last):
File "C:\Archive\Python\test2.py", line 1, in <module>
from vapoursynth import core
ImportError: cannot import name 'core' from 'vapoursynth' (unknown location)
>>>
More filters and gewgaws are all well and fine, but if vapoursynth et al. are such hot shit then why don't they have built-in legalizers already?
That workflow where you convert to RGB, do the filtering in RGB, look at scopes, hard clip, then export YUV 4:2:2 is potentially problematic. -
You don't have to use it if you don't want to.
Fair warning: expect many more error messages and hair pulling with debugging, trying to get it to work. It's seriously painful at first. It's the same when someone first uses anything, ffmpeg included. I had to revisit avisynth a few times when first learning it; it wasn't a pleasant experience. Vapoursynth becomes slightly easier if you know avisynth. At least you have some programming background, so it shouldn't be difficult to get it working. Then there is the hassle of collecting plugins, DLLs and scripts, which is not fun if you're new to it. But in the end, it's worth the hassle x10. Very powerful video manipulation stuff that you can't do with other programs.
If you still want to pursue this, did you follow the getting-started instructions and install the correct vapoursynth version with the matching Python version?
Did you try typing this on the Python command line, as per the instructions? And what messages did it print? Was it the same "unknown location"?
http://www.vapoursynth.com/doc/installation.html
Code:
from vapoursynth import core
print(core.version())
More filters and gewgaws are all well and fine, but if vapoursynth et al. are such hot shit then why don't they have built-in legalizers already?
And obviously, there are many other types of AV manipulations besides "legalization".
Good luck with the flicker and GUI, I hope it works out.
Last edited by poisondeathray; 7th Dec 2018 at 15:41.
-
You did not install Python and Vapoursynth properly. You don't even say what you did: portable version, or installation? That's a very important step, whether Python works at all, or whether it's just vapoursynth. You said nothing, just copy/pasted the error. Some complaining: give me, give me. What do you want to hear? Are you serious? You never even got to my script at all, because you do not have the setup right in the first place, yet you're giving lectures about the correct approach in discussions?
-
That problem has finally come home to roost. I just processed some footage and it is r 103 compliant in RGB24/4:4:4, but when transcoding to the distribution format, 4:2:2, it is no longer compliant. Any ideas?
I am doing one pass to one file at 4:2:0 -> RGB, then a second pass at RGB -> 4:2:2.