VideoHelp Forum
  1. I have written a C-language program which uses ffmpeg to read rgb video frames, modify the rgb values of those frames, and write them back to a file, also using ffmpeg. The program works but I am not getting the results I want.

    It seems that ffmpeg always forces the video into the range 0 - 255 or 16 - 235, no matter what values I write in C.

    Anybody know how to stop ffmpeg from forcing the RGB levels to 0 - 255 or 16 - 235?

    Rather than out_range=full or out_range=tv, I want an "out_range=unity" that doesn't alter the levels I write in C.
  2. Originally Posted by chris319 View Post
    I have written a C-language program which uses ffmpeg to read rgb video frames, modify the rgb values of those frames, and write them back to a file, also using ffmpeg. The program works but I am not getting the results I want.

    It seems that ffmpeg always forces the video into the range 0 - 255 or 16 - 235, no matter what values I write in C.

    Anybody know how to stop ffmpeg from forcing the RGB levels to 0 - 255 or 16 - 235?

    Rather than out_range=full or out_range=tv, I want an "out_range=unity" that doesn't alter the levels I write in C.



    If you're starting with RGB and staying in RGB - nothing is modified, nothing is "forced" by ffmpeg.

    Input = Output

    The only thing being modified is whatever your modifications did.



    But if you're using YUV steps in between - then use full range in/out. That keeps your "unity". 0-255 in YUV gets "mapped" to 0-255 in RGB. It's essentially 1:1 for the Y range. e.g. Y=16,U=128,V=128 will be mapped to RGB 16,16,16 instead of RGB 0,0,0.

    So if you started with Y 16-235, that will be converted to RGB 16-235. And that will be converted back to YUV 16-235 if you used full range.

    If you started with YUV 32,128,128 then you get RGB 32,32,32. Nothing is forced, range unchanged.

    Note the "full range" here refers to the equations being used, not the actual values.
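
    To make the difference concrete, here is a rough C sketch (the usual rounded BT.709 constants; just an illustration, only the R channel shown for brevity):

    Code:
    // Sketch: the same YUV pixel through limited-range vs full-range
    // BT.709 equations (rounded constants); R channel only.
    #include <stdio.h>

    // Limited ("tv") equations: Y 16-235 gets stretched to RGB 0-255
    double r_limited(double y, double v) { return 1.1644*(y - 16) + 1.7927*(v - 128); }

    // Full ("pc") equations: Y maps 1:1, so Y=16 stays at 16
    double r_full(double y, double v) { return y + 1.5748*(v - 128); }

    int main(void)
    {
        printf("limited: R = %.1f\n", r_limited(16, 128)); // prints 0.0
        printf("full:    R = %.1f\n", r_full(16, 128));    // prints 16.0
        return 0;
    }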
  3. I'm starting with YUV from a camcorder and wish to do a simple hard clip at RGB 5-5-5 and RGB 246-246-246.

    Later I will post the code I have and we can see what's wrong with it.

    I know hard clipping is unpopular but humor me for now.
  4. How can you control how the YUV-to-RGB conversion is done by the people you deliver to? Did you already provide some samples to them and they returned them? They might not even know why; they're just following their hardware (possibly obsolete).

    imho, RGB is a ghost here. How can you follow an RGB "standard" if you're delivering YUV?

    You fix your YUV: give it legal values, give it some room, make sure you have the right color space, and set the flags the way they want them (color space, matrix, transfer, range) - and that is all you can do. You cannot control how they change it to RGB with whatever hardware etc. There are many things involved, and as was said before, one sloppy chroma resize, a non-optimal chroma resize, or wrong chroma placement might introduce an illegal value right there. Catch-22.
  5. How can you control how the YUV-to-RGB conversion is done by the people you deliver to?
    Here is the standard the deliverable must conform to:

    https://tech.ebu.ch/docs/r/r103.pdf
  6. I should add that I have a 256-step LUT or "curve" in the C program which restricts the (8-bit) values to 5 - 246 and which I am happy with. That's the easy part.

    The hard part is getting ffmpeg not to mess with the RGB values my C-language program is writing.

    It's entirely possible that I'm doing it all wrong, but the end goal remains EBU R-103.
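
    Conceptually the LUT is just a hard clip - something like this sketch (illustration only):

    Code:
    // Sketch of a 256-entry hard-clip LUT (illustration only)
    unsigned char lut[256];

    void build_lut(void)
    {
        int i;
        for (i = 0; i < 256; i++)
            lut[i] = (i < 5) ? 5 : (i > 246) ? 246 : (unsigned char)i;
    }

    // applied per component, e.g. frame[y][x][0] = lut[frame[y][x][0]];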
  7. Here is the code I am using to read and write the video frames. (I have added my own C-language code to modify the video levels).

    https://batchloaf.wordpress.com/2017/02/12/a-simple-way-to-read-and-write-audio-and-vi...-part-2-video/

    Code:
    //
    // Video processing example using FFmpeg
    // Written by Ted Burke - last updated 12-2-2017
    //
     
    #include <stdio.h>
     
    // Video resolution
    #define W 1280
    #define H 720
     
    // Allocate a buffer to store one frame
    unsigned char frame[H][W][3] = {0};
     
    int main(void)
    {
        int x, y, count;
         
        // Open an input pipe from ffmpeg and an output pipe to a second instance of ffmpeg
        FILE *pipein = popen("ffmpeg -i teapot.mp4 -f image2pipe -vcodec rawvideo -pix_fmt rgb24 -", "r");
        FILE *pipeout = popen("ffmpeg -y -f rawvideo -vcodec rawvideo -pix_fmt rgb24 -s 1280x720 -r 25 -i - -f mp4 -q:v 5 -an -vcodec mpeg4 output.mp4", "w");
         
        // Process video frames
        while(1)
        {
            // Read a frame from the input pipe into the buffer
            count = fread(frame, 1, H*W*3, pipein);
             
            // If we didn't get a frame of video, we're probably at the end
            if (count != H*W*3) break;
             
            // Process this frame
            for (y=0 ; y<H ; ++y) for (x=0 ; x<W ; ++x)
            {
                // Invert each colour component in every pixel
                frame[y][x][0] = 255 - frame[y][x][0]; // red
                frame[y][x][1] = 255 - frame[y][x][1]; // green
                frame[y][x][2] = 255 - frame[y][x][2]; // blue
            }
             
            // Write this frame to the output pipe
            fwrite(frame, 1, H*W*3, pipeout);
        }
         
        // Flush and close input and output pipes
        fflush(pipein);
        pclose(pipein);
        fflush(pipeout);
        pclose(pipeout);

        return 0;
    }
    Am I correct in thinking this code, given a YUV input video, would convert it to RGB in the frame buffer?
  8. Originally Posted by chris319 View Post

    Am I correct in thinking this code, given a YUV input video, would convert it to RGB in the frame buffer?
    Code:
     FILE *pipein = popen("ffmpeg -i teapot.mp4 -f image2pipe -vcodec rawvideo -pix_fmt rgb24 -", "r");
    This is converting teapot.mp4 to RGB24, a limited range 601 conversion, prior to the input pipe.

    -vcodec rawvideo would give you raw YUV if the input video was YUV.

    -pix_fmt rgb24 converts to RGB24 using limited range 601.


    ("limited range" here in this context refers to the equations used; so if you had a pixel in the input video of Y=16,U=128,V=128, that would give RGB 0,0,0)
  9. pix_fmt rgb24 converts to RGB24 using limited range 601
    Aren't 601 and 709 YUV by definition? How can there be 601 or 709 RGB?

    I understand what you mean by conversion to limited-range, but then we have ffmpeg altering the levels by converting to 16 - 235.
  10. Your input video is limited YUV. If you don't specify otherwise (within ffmpeg somehow), you convert from limited YUV to full RGB. That's what you are trying to fix here, I guess.
    Then, getting YUV back, you get the original limited 16-235 YUV. If you wanted to get 0-255, you'd be changing your video.

    Then you might ask: which interpretation should I fix, limited or full?
    Think of RGB as a ghost, because it is not your video; it is a ghost on screen, where what you see can be tweaked as you wish (by setting range, matrix and other conversion parameters, and perhaps conversion algorithms as well).

    There are filters and apps out there that use RGB for calculations, but then one needs to bring it back to YUV. And that causes problems. Not sure why you'd need to go through that just to get the video into legal values.
  11. Originally Posted by chris319 View Post
    pix_fmt rgb24 converts to RGB24 using limited range 601
    Aren't 601 and 709 YUV by definition? How can there be 601 or 709 RGB?

    I understand what you mean by conversion to limited-range, but then we have ffmpeg altering the levels by converting to 16 - 235.

    FFmpeg does whatever you tell it to do


    In this context 601 vs. 709 refers to the matrix used for YUV<=>RGB conversion

    You can apply a 601 or 709 matrix, full or limited variants, in either direction. So that's 4 combinations for each direction (601 full and limited, 709 full and limited). In this context they refer to the equations, not the actual values that you have. e.g. if you have limited range data, you can apply any of the 4 to it.
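
    For reference, the luma weights behind the two matrices (standard published values):

    Code:
    /* Luma coefficients of the two matrices */
    const double kr601 = 0.299,  kg601 = 0.587,  kb601 = 0.114;
    const double kr709 = 0.2126, kg709 = 0.7152, kb709 = 0.0722;
    /* limited variants: Y = 16 + (219.0/255.0)*(kr*R + kg*G + kb*B)
       full variants:    Y = kr*R + kg*G + kb*B       (R,G,B in 0-255) */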

    (There are many more combinations and variations in Europe; but with ATSC 3.0 they are finally, supposedly, upgrading this year for Rec2020, UHD, HLG, PQ, tone mapping - not just a few test stations. Everything is way more complex; the USA is far behind in this area.)
  12. Your input video is limited YUV. If you don't specify otherwise (within ffmpeg somehow), you convert from limited YUV to full RGB. That's what you are trying to fix here, I guess.
    Then, getting YUV back, you get the original limited 16-235 YUV. If you wanted to get 0-255, you'd be changing your video.
    The first thing I need to do is to check the YUV levels coming out of the camcorder. I THINK they are 0 - 255 but they need to be checked without conversion to RGB, then checked after conversion to RGB.

    If you wanted to get 0-255, you'd be changing your video.
    I don't know where you get that notion. The range I'm after is RGB 5 - 246 per EBU R103, which I posted. Did you read the spec?
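
    Here is a sketch of how I plan to check the raw Y levels with no RGB conversion involved - pipe planar YUV straight out of ffmpeg and scan the Y plane ("camcorder.mp4" is a placeholder; assumes 8-bit 4:2:0 at my frame size):

    Code:
    // Sketch: min/max of the Y plane, no RGB conversion anywhere.
    // Assumes yuv420p: Y plane is W*H bytes, then quarter-size U and V.
    #include <stdio.h>

    #define W 1280
    #define H 720

    int main(void)
    {
        static unsigned char buf[W*H*3/2];   // one yuv420p frame
        FILE *p = popen("ffmpeg -i camcorder.mp4 -f rawvideo -pix_fmt yuv420p -", "r");
        int ymin = 255, ymax = 0;
        long i;

        while (fread(buf, 1, sizeof buf, p) == sizeof buf)
        {
            for (i = 0; i < (long)W*H; i++)   // luma samples only
            {
                if (buf[i] < ymin) ymin = buf[i];
                if (buf[i] > ymax) ymax = buf[i];
            }
        }
        pclose(p);
        printf("Y min = %d, Y max = %d\n", ymin, ymax);
        return 0;
    }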
  13. I think your video is most likely YUV with illegal values. If you see a value under 16 or above 235 (or 245), that does not mean the video is 0-255.
  14. Originally Posted by _Al_ View Post
    I think your video is most likely YUV with illegal values. If you see a value under 16 or above 235 (or 245), that does not mean the video is 0-255.
    When I examine the raw video, that will give me a definite answer, not what anyone "thinks" it is.
  15. Oh I see,
    good luck
  16. The levels I'm getting directly out of the camcorder are:

    Maximum lum (YUV) = 255
    Minimum lum (YUV) = 3

    Maximum R, G, B = 255
    Minimum R, G, B = 0

    However, MediaInfo reports "Color Range: Limited"
  17. I dusted off some old code which seems to do what I want, so far:

    Code:
    D:\Programs\ffmpeg\BroadcastVideo\ffmpeg -y  -i "short.mp4"  -c:v mpeg2video  -pix_fmt yuv422p  -vb 50M  -minrate 50M  -maxrate 50M  -vf colorlevels=romin=0.09:gomin=0.09:bomin=0.09:romax=0.83:gomax=0.83:bomax=0.83,scale=out_color_matrix=bt709:out_range=limited  -color_primaries bt709  -color_trc bt709  -colorspace bt709 -an  -f vob  clipped.mpg
  18. Originally Posted by chris319 View Post
    The levels I'm getting directly out of the camcorder are:

    Maximum lum (YUV) = 255
    Minimum lum (YUV) = 3

    Maximum R, G, B = 255
    Minimum R, G, B = 0
    Min/max RGB are not applicable out of the camera in your case, because it's recording YUV to the media. RGB values would depend on how you convert to RGB, and there are dozens of different ways and variations on chroma upsampling methods.



    However, MediaInfo reports "Color Range: Limited"
    MediaInfo does not report the actual levels, just what the video was encoded as or tagged as.




    Originally Posted by chris319 View Post
    I dusted off some old code which seems to do what I want, so far:

    Code:
    D:\Programs\ffmpeg\BroadcastVideo\ffmpeg -y  -i "short.mp4"  -c:v mpeg2video  -pix_fmt yuv422p  -vb 50M  -minrate 50M  -maxrate 50M  -vf colorlevels=romin=0.09:gomin=0.09:bomin=0.09:romax=0.83:gomax=0.83:bomax=0.83,scale=out_color_matrix=bt709:out_range=limited  -color_primaries bt709  -color_trc bt709  -colorspace bt709 -an  -f vob  clipped.mpg


    If you apply -vf colorlevels like that, you will get a limited range 601 conversion to RGB, then a limited 709 conversion back out to YUV with "scale=out_color_matrix=bt709:out_range=limited". So you would expect colors to shift even if you didn't clip anything. And you don't want to clip that much. Contrast, black and white level will become way off - because you're using a limited range conversion to RGB, but clipping RGB values as if you did a full range conversion.

    The RGB 16-235 requirement for R103 is when the full range equation is applied, not the limited range equation. The full range equations are being used with the broadcast checker for the R,G,B channel check - so when you submit Y 16-235, you get RGB 16-235 or "studio level RGB", not RGB 0-255 as you normally would in a computer setting ("computer RGB"). If you're using some custom scope, it needs settings or toggles to reflect this.

    Do you recall the colorbars discussion? The concepts are very related here; you might want to revisit it. Run some (valid) colorbars through that. Recall the actual YUV values are given in the ITU document. If your output YUV values are way off, that indicates a problem with the process. Check the pluge and the white and black levels as you would in a studio setting as well.
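
    If you don't have a known-good bars file handy, ffmpeg can generate one - a sketch using the smptehdbars lavfi source, swapped in for the input file in your pipe code:

    Code:
    /* Sketch: feed generated SMPTE HD bars through the same pipe, so
       known YUV values go in and can be checked on the way out */
    FILE *pipein = popen(
        "ffmpeg -f lavfi -i smptehdbars=size=1280x720:rate=25 -t 5 "
        "-f image2pipe -vcodec rawvideo -pix_fmt rgb24 -", "r");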

    If you are using an RGB intermediate converted using a full range equation for filtering/processing, ignore what you "see" on a typical display, unless you have 2 displays: one set up as computer RGB 0-255 for the program display, one as RGB 16-235 for the preview display.



    If you convert to RGB using full range, then clip to RGB 5-246, where the black and white points are still at RGB 16 and 235 (not stray pixels, but the actual black and white points - because you adjusted them to 16 and 235 in YUV with your eyes beforehand, using a waveform) - by definition, that makes everything legal. Not 99.9%, but 100.0%. All out-of-gamut errors and negative RGB values were already "culled" by that prior YUV => RGB conversion.

    Converting from RGB back to YUV using full range, you still get Y 16-235 black to white, and under/overshoots with clipping at 5-246 when you use the full range equations - that is, if you work in YUV 4:4:4.

    *However - it's the subsampling step to YUV 4:2:2 that can generate invalid values. The subsampling can generate values such as "zero" in one channel when the result is converted back to RGB - it's easy to demonstrate and prove this. It depends on which combinations of pixels you have adjacent to each other and which chroma sampling algorithms are used, but no algorithm is immune. You will almost always generate some illegal pixel values, even if you clipped the over/undershoots in the RGB stage to a strict 16-235. That's what the % "leeway" is for in submission specs and broadcast checkers. Some people apply filtering to each channel, such as low-pass filtering - basically blurring everything to reduce the % of illegal values.
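
    A small numeric demonstration of that subsampling effect (full range BT.709 equations with rounded constants; two adjacent saturated pixels sharing one averaged chroma sample, which is effectively what 4:2:2 subsampling does):

    Code:
    /* Demo: average the chroma of adjacent red and green pixels, then
       reconstruct the red pixel - its B channel goes strongly negative */
    #include <stdio.h>

    int main(void)
    {
        /* RGB -> YCbCr, full range BT.709: pure red and pure green */
        double yR = 0.2126*255, cbR = 128 + (0 - yR)/1.8556;
        double yG = 0.7152*255, cbG = 128 + (0 - yG)/1.8556;

        /* shared (averaged) chroma for the 4:2:2 pair */
        double cb = (cbR + cbG)/2;

        /* reconstruct the red pixel's B channel */
        double b = yR + 1.8556*(cb - 128);
        printf("reconstructed B of the red pixel = %.1f\n", b); /* about -64 */
        return 0;
    }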
  19. PDR: can you please post some sample code? There is so much to take in in your posts that I'm frankly lost.
    Thank you.

    BTW I am not working in 4:4:4. It comes out of the camcorder as 4:2:0. The output can be 4:2:0 or 4:2:2; either one is acceptable.

    Yes, the code I "dusted off" with colorlevels=romin=0.09 etc. is giving me color shifts by actual test, so that's no good.

    However, if I omit the romin, etc., the colors are accurate by actual test.
  20. Is this what chris319 wants to do?
    Code:
    import vapoursynth as vs
    from vapoursynth import core
    import numpy as np
    
    source_path=r'G:/test_file.mp4'
    clip = core.lsmas.LibavSMASHSource(source_path)
    
    #trimming for testing
    #clip = clip[3000:3200]
    
    MIN = 5
    MAX = 245
    
    rgb_clip = core.resize.Point(clip, matrix_in_s = '709', format = vs.RGB24)
    
    def restrict_rgb_frame(n,f):
        #this converts vapoursynth rgb frame to numpy array 
        np_image = np.dstack([np.array(f.get_read_array(i), copy=False) for i in range(3)])
    
        #this restricts values in numpy array image
        clipped_np_image = np.clip(np_image, a_min = MIN, a_max = MAX)
    
        #this changes numpy array type back to vapoursynth VideoFrame type
        vs_frame = f.copy()
        [np.copyto(np.asarray(vs_frame.get_write_array(i)), clipped_np_image[:, :, i]) for i in range(3)]
    
        return vs_frame
    
    
    clipped_rgb = core.std.ModifyFrame(rgb_clip, rgb_clip, restrict_rgb_frame)
    clipped_yuv = core.resize.Point(clipped_rgb, matrix_s='709', format = vs.YUV420P8)
    
    clip.set_output(0)
    clipped_rgb.set_output(1)
    clipped_yuv.set_output(2)  #to actually encode for studio
  21. Or, to use VapourSynth's LUT:
    Code:
    import vapoursynth as vs
    from vapoursynth import core
    
    source_path=r'G:/test_file.mp4'
    clip = core.lsmas.LibavSMASHSource(source_path)
    
    #trimming for testing
    #clip = clip[3000:3200]
    
    MIN = 5
    MAX = 245
    
    rgb_clip = core.resize.Point(clip, matrix_in_s = '709', format = vs.RGB24)
    
    def restrict(x):
       return max(min(x, MAX), MIN)
    
    clipped_rgb = core.std.Lut(clip=rgb_clip, planes=[0, 1, 2], function=restrict)
    clipped_yuv = core.resize.Point(clipped_rgb, matrix_s='709', format = vs.YUV420P8)
    
    clip.set_output(0)
    clipped_rgb.set_output(1)
    clipped_yuv.set_output(2)
  22. I can't remember if VapourSynth is one of several programs I could never get to work on Windows 10. We've tried several such programs, and none of the advice I have been given to make them work has worked.

    I'll give VapourSynth another shot if you're convinced it will work.

    The MAX value should be 246, BTW.
  23. The latest version of VapourSynth needs Python 3.7; it has to be watched closely which version it needs (on Windows).
    I recommend installing Python 3.7 and then installing the latest VapourSynth. Again, tomorrow it could be different which version of VapourSynth needs which version of Python.

    I made it work portable - portable VapourSynth and portable Python both dumped in one directory - but I do not recommend that if you are just starting with all of this, because sys.path needs to be set in scripts for Python to load modules, and DLL paths need to be loaded first if your scripts are anywhere else on the PC.
  24. Thanks for the input. I'll be back to ask questions which I'm sure I'll have.

    PDR brought up a good point. All of this has to be tested on color bars to make sure there are no color shifts.

    The colors I use for testing are in the RGB range of 16 - 180 so they shouldn't clip.
  25. To get color bars into VapourSynth I use colorbars.dll, pasting it into VapourSynth's plugins64 (or plugins) directory, depending how it is named. Info and example:
    Code:
    import vapoursynth as vs
    from vapoursynth import core
    c = core.colorbars.ColorBars(format=vs.YUV444P10)
    c = core.std.SetFrameProp(clip=c, prop="_FieldBased", intval=0) 
    c = core.std.Convolution(c,mode="h",matrix=[1,2,4,2,1])
    #c = core.resize.Point(clip=c,format=vs.YUV420P8)
    c = c * (30 * 30000 // 1001)
    clip = core.std.AssumeFPS(clip=c, fpsnum=30000, fpsden=1001)
    clip.set_output()
    That gets you YUV444 10-bit color bars, or it can be changed to any format.
  26. I have an .mp4 file of bars which was encoded in BT.709 which I will use as the input file.

    What is the output file name?
  27. None yet; clipped_yuv is just a Python/VapourSynth object. You would have to request frames and pipe them to an encoder within the script itself, but that might not be intuitive for now. Better to use vspipe.exe (comes with VapourSynth) from the Windows command prompt to get video. vspipe.exe handles piping frames from your script to the encoder of your choice (one that accepts stdin). Example:

    Code:
    "vspipe.exe" --outputindex 2 --y4m "your_script.vpy" - | "x264.exe" --frames  35765  --demuxer y4m --crf 18  --vbv-bufsize 30000 --vbv-maxrate 28000 --colorprim bt709 --transfer bt709 --colormatrix bt709 --output my_output.264 -
    where --outputindex 2 means it will encode the video that was output by this line in the script: clipped_yuv.set_output(2).
    Also, --frames 35765 is not strictly needed, because that info is in the y4m header, but x264 would not show progress in % without it.
    And better to use full paths for vspipe.exe, x264.exe and your script.

    Or you can use ffmpeg for encoding.

    To check live how your output is doing before encoding, and to evaluate it and make sure there are no errors, get VapourSynth Editor and write your script in there; it has a button for instant preview or just evaluating the script, and also for encoding (but that might not be straightforward for now either). But beware of what you see, because the preview does its own YUV-to-RGB conversion when looking at YUV output, so you have to be sure what it does after getting to know the settings and the software. To get a preview on screen, you need to put the output at index zero in your script; e.g. to view the clipped_yuv clip, you need to write the line clipped_yuv.set_output() or clipped_yuv.set_output(0). I think it still does not support more than one output for preview.
  28. A simple AviSynth script to eliminate Y<5 and Y>245:

    Code:
    ColorYUV(off_y=-5).ColorYUV(off_y=5)
    ColorYUV(off_y=10).ColorYUV(off_y=-10)
    The first line clamps at 0 and shifts back up, clipping everything below 5; the second clamps at 255 and shifts back down, clipping everything above 245. All other Y values, and all U/V values, will be unchanged.
  29. The OP wants to limit RGB even if returning YUV.
  30. You can do the same with RGB:

    Code:
    RGBAdjust(rb=-5, gb=-5, bb=-5).RGBAdjust(rb=5, gb=5, bb=5)
    RGBAdjust(rb=10, gb=10, bb=10).RGBAdjust(rb=-10, gb=-10, bb=-10)


