VideoHelp Forum: Adjusting H.264 Levels from .mp4

  1. What is the best way to adjust the levels of H.264 video from an .mp4 file?

    The video will be coming from a Canon Vixia HF R800.

    I want to confine the video levels to 16 - 235 and maybe correct the gamma.

    What (preferably free) tool is best for this? Avisynth? VirtualDub? ffmpeg? Adobe something?

    Thank you.
  2. I use AviSynth for that kind of work. Can you post a short sample as well? It may not be a good idea to just clip the levels.
  3. Would avisynth simply clip it or would it apply gain and setup? I need the latter.
  4. AviSynth will do whatever you tell it to. For example, ColorYUV(levels="PC->TV") will compress Y from full range 0-255 to limited range 16-235. ColorYUV will not clip the ranges unless specified with opt="coring".
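
    For example, a minimal script along these lines (a sketch; it assumes the clip is opened with a source filter such as L-SMASH Works, and the file name is just a placeholder):

    Code:
    # open the .mp4 with whatever source filter works for you (L-SMASH Works shown here)
    LSmashVideoSource("input.mp4")
    # gain + setup: compress Y 0-255 into 16-235 (and chroma into 16-240) without clipping
    ColorYUV(levels="PC->TV")
    # by contrast, ColorYUV(opt="coring") would simply clip anything outside 16-235 / 16-240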

    http://avisynth.org.ru/docs/english/corefilters/coloryuv.htm
    Last edited by jagabo; 15th Sep 2017 at 22:05.
  5. Thanks, jagabo.

    I've been working for decades with broadcast video equipment, but this non-broadcast stuff is new to me.
  6. This is straight out of the camera:

    testvideo.mp4
  7. Originally Posted by chris319 View Post
    This is straight out of the camera:

    [Attachment 43189]

    It's not straight out of the camera; it's been re-encoded by x264:

    Code:
    Video
    ID                                       : 1
    Format                                   : AVC
    Format/Info                              : Advanced Video Codec
    Format profile                           : High 4:4:4 Predictive@L3.1
    Format settings, CABAC                   : Yes
    Format settings, RefFrames               : 4 frames
    Codec ID                                 : avc1
    Codec ID/Info                            : Advanced Video Coding
    Duration                                 : 10 s 344 ms
    Bit rate                                 : 106 kb/s
    Width                                    : 1 280 pixels
    Height                                   : 720 pixels
    Display aspect ratio                     : 16:9
    Frame rate mode                          : Constant
    Frame rate                               : 29.970 (30000/1001) FPS
    Color space                              : YUV
    Chroma subsampling                       : 4:4:4
    Bit depth                                : 8 bits
    Scan type                                : Progressive
    Bits/(Pixel*Frame)                       : 0.004
    Stream size                              : 133 KiB (38%)
    Writing library                          : x264 core 148 r2762 90a61ec
    Encoding settings                        : cabac=1 / ref=1 / deblock=1:0:0 / analyse=0x3:0x113 / me=hex / subme=2 / psy=1 / psy_rd=1.00:0.00 / mixed_ref=0 / me_range=16 / chroma_me=1 / trellis=0 / 8x8dct=1 / cqm=0 / deadzone=21,11 / fast_pskip=1 / chroma_qp_offset=6 / threads=12 / lookahead_threads=4 / sliced_threads=0 / nr=0 / decimate=1 / interlaced=0 / bluray_compat=0 / constrained_intra=0 / bframes=3 / b_pyramid=2 / b_adapt=1 / b_bias=0 / direct=1 / weightb=1 / open_gop=0 / weightp=1 / keyint=250 / keyint_min=25 / scenecut=40 / intra_refresh=0 / rc_lookahead=10 / rc=crf / mbtree=1 / crf=23.0 / qcomp=0.60 / qpmin=0 / qpmax=69 / qpstep=4 / vbv_maxrate=2500 / vbv_bufsize=2500 / crf_max=0.0 / nal_hrd=none / filler=0 / ip_ratio=1.40 / aq=1:1.00
    Color range                              : Full
    Color primaries                          : BT.709
    Transfer characteristics                 : BT.709
    Matrix coefficients                      : BT.709
  8. e.g. if you wanted to stream copy a 5-second clip:

    -t is duration in hours:minutes:seconds.ms notation
    -ss is start time (if you don't enter it, it starts at the beginning)
    -an means no audio
    -c:v copy means copy video stream

    Code:
    ffmpeg -i INPUT.mp4 -c:v copy -an -t 00:00:05 OUTPUT.mp4
  9. The other file had been downloaded and saved to HD with no processing or encoding.

    This is literally straight out of the camera's USB port:

    MVI_0003[1].MP4
  10. Originally Posted by chris319 View Post
    The other file had been downloaded and saved to HD with no processing or encoding.

    This is literally straight out of the camera's USB port:

    [Attachment 43190]

    Is this the same camera you were referring to at doom9? Because it's normal range data, properly flagged as normal range, i.e. there is nothing special you have to do to it.
  11. Originally Posted by chris319 View Post

    This is a different video; this is a video of an OBS recording. It's limited range, but flagged full range, and has been upsampled to 4:4:4.
  12. Originally Posted by poisondeathray View Post
    Originally Posted by chris319 View Post

    This is a different video; this is a video of an OBS recording. It's limited range, but flagged full range, and has been upsampled to 4:4:4.
    Yes, it is. Have a look here:
    cameraout.mp4

    Here is the command used to generate it (it scales luma by 219/255 ≈ 0.8588 and adds an offset of 16):

    Code:
    bin\ffmpeg -i MVI_0003.mp4 -y -vf lutyuv=y='clip(val*0.8588235294117647+16,1,254)' cameraout.mp4
  13. Originally Posted by chris319 View Post
    Originally Posted by poisondeathray View Post
    Originally Posted by chris319 View Post

    This is a different video; this is a video of an OBS recording. It's limited range, but flagged full range, and has been upsampled to 4:4:4.
    Yes, it is. Have a look here:
    [Attachment 43193]

    This one has been clamped. I would say "incorrectly", but it might have been intended.

    But "black" isn't "black" anymore. For example, if you look at the dark part of the keyboard, it's RGB 16,16,16 but YUV 30,128,128.

    I think something is "off" with your viewing method or monitor calibration, or graphic card settings/drivers
  14. I think something is "off" with your viewing method or monitor calibration, or graphic card settings/drivers
    There is no eyeballing going on here. It is supposed to be 16,16,16. Studio swing is 16 - 235. 16 is the lower boundary and 235 is the upper boundary. 16 is considered black in the studio swing range. If you don't see any values < 16 or > 235 then it's doing its job.

    The values 0 and 255 are reserved for sync.
  15. Your source video in post #9 already has the right levels (though there are no brights). You've screwed them up with your conversion in post #13.

    waveform monitor, source on left, filtered on right:
    [Attachment 43194 - source.filtered.jpg]
  16. Originally Posted by chris319 View Post
    I think something is "off" with your viewing method or monitor calibration, or graphic card settings/drivers
    There is no eyeballing going on here. It is supposed to be 16,16,16. Studio swing is 16 - 235. 16 is the lower boundary and 235 is the upper boundary. 16 is considered black in the studio swing range. If you don't see any values < 16 or > 235 then it's doing its job.

    The values 0 and 255 are reserved for sync.

    Do you see the waveform in jagabo's post? The clamp is unnecessary; the original was already within legal limits.

    In that waveform (it's produced by Histogram() in AviSynth), the black region is 16-235. The brownish/yellow bars are the "illegal" 0-15 and 236-255 ranges.

    You're just reducing the contrast. Black is no longer black; it's been elevated to about 30.

    Studio swing is an often misused term. People get confused about RGB and YUV.

    All studio swing means is that the data lies within 16-235, as opposed to full swing or full range, where the data lies in 0-255.

    For example, a studio swing conversion would "map" Y 16-235 to RGB (16,16,16 - 235,235,235).

    So for you, that Y=30 "black" would map to 30,30,30 in RGB if you used a studio swing conversion, which is "elevated".

    16-235 Y is what you should be aiming for. That corresponds to 0-100 IRE for broadcast.
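
    To make that mapping concrete, here is a minimal AviSynth sketch (it assumes the L-SMASH Works source filter, and the file name is just a placeholder) that decodes the same frame both ways, side by side:

    Code:
    LSmashVideoSource("cameraout.mp4")
    # "computer" / full-swing conversion: Y 16-235 is stretched to RGB 0-255, so Y=16 becomes RGB 0,0,0
    full = ConvertToRGB32(matrix="Rec709")
    # "studio" swing conversion: Y is passed through unscaled, so Y=16 stays RGB 16,16,16 and Y=30 stays RGB 30,30,30
    studio = ConvertToRGB32(matrix="PC.709")
    StackHorizontal(full, studio)

    The left half shows the computer-RGB decode and the right half the studio-RGB decode, so the difference can be read directly with a color picker.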
  17. And correct RGB levels for display are full range, 0 to 255.
  18. When I cap the lens, the blacks go down to 0 and it is thus out of the 16 - 235 range. You can't get blacker than a capped lens. That's how we professional video engineers in Hollywood do it. There is some light on the surface behind the printout on my desk so it is not a valid black reference. A capped lens is.

    I have written a waveform-monitor program, but it only works on a live video feed, not on recorded video. It is actually a time (pixel-by-pixel) vs. luma graph, not a true histogram by the definition of the word.

    I would like to see avisynth's, if only I could get avisynth to work with an .mp4 file, a challenge in itself.
    Last edited by chris319; 24th Sep 2017 at 23:59.
  19. Originally Posted by jagabo View Post
    And correct RGB levels for display are full range, 0 to 255.

    On a "computer", yes, definitely RGB should be 0,0,0 - 255,255,255 as black to white


    (But some editing stations use "studio RGB", with everything calibrated to RGB 16,16,16-235,235,235 as black to white. That includes the display.

    But it doesn't really matter. The only thing that really matters is Y'16-235 , CbCr 16-240 . Black is always at Y=16. If you are using a studio RGB setup, then the scopes, waveform monitor etc, have a different calibration setting. ie. 0 IRE is 0,0,0 using "computer" RGB, but 16,16,16 using "studio" RGB. I would say 0,0,0-255,255,255 is way more common in post production these days.)





    Originally Posted by chris319 View Post
    Don't show me a poorly-photographed scene on my desk and tell me it's within spec.

    When I cap the lens, the blacks go down to 0 and it is thus out of the 16 - 235 range. You can't get blacker than a capped lens. That's how we professional video engineers in Hollywood do it. There is some light on the surface behind the printout on my desk so it is not a valid black reference. A capped lens is.
    That's nice and all, but that's not what is being said

    All that waveform is showing is that the particular original video you posted has data in the valid range, above Y=16 or 0 IRE. But nothing lies below (or at least nothing significant). So there would be no logical reason to clamp, and especially not "blindly". You're just making the black level "elevated"; black is no longer "black". OK, it's not a true reference test, but why are you making it worse? That's all. You could argue that maybe you want to boost shadows or do a little grading, but a "blind" clamp is the wrong way to do it, also because you're compressing the midtones.

    Go ahead and put the lens cap on.



    If it's difficult to read the avisynth waveform (there are no "ticks" or values), here is a more conventional waveform with IRE 0-100 values on the left and Y 16-235 on the right:
    [Attachment 43196]





    I would like to see avisynth's, if only I could get avisynth to work with an .mp4 file, a challenge in itself.

    If you can't get ffms2 to work from the doom9 thread, try L-SMASH.
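
    A minimal script would be something like this (a sketch; adjust the file name/path to your copy):

    Code:
    # FFMS2 source filter; use LSmashVideoSource("...") instead if you go the L-SMASH route
    FFVideoSource("MVI_0003[1].MP4")
    # built-in luma waveform; it appears turned on its side along the right edge of the frame
    Histogram()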
  20. Where did that second waveform monitor come from?

    We're not grading here; we're not making aesthetic judgements. We're simply keeping the equipment within spec.

    maybe you want to boost shadows, do a little grading
    If the user wants to apply aesthetic judgement and make the blacks lighter or darker, whatever, that's a separate process and should be done separately. So yes, keeping the hardware in spec will be done blindly. The user can then do grading and make adjustments accordingly.

    I have an old Macbook running Snow Leopard. I may yet spring for a copy of Scopebox for it. We'll see.
  21. Originally Posted by chris319 View Post
    Where did that second waveform monitor come from?
    That was in Premiere Pro. Some people have difficulty "reading" the avisynth version of the "waveform", and there are no "ticks". The original avisynth one is actually turned on its side and on the right side of the frame (not the bottom). Jagabo turned it so it's more conventional, where "black" is on the bottom and "white" is on the top.

    We're not grading here; we're not making aesthetic judgements. We're simply keeping the equipment within spec.
    If the user wants to apply aesthetic judgement and make the blacks lighter or darker, whatever, that's a separate process and should be done separately. So yes, keeping the hardware in spec will be done blindly. The user can then do grading and make adjustments accordingly.
    Exactly, so do you see why it makes no sense to apply the LUT or make that clamp adjustment? No matter what system you are using, Y is already near 16 to begin with. It doesn't make sense to elevate that to ~30 or to compress the midtones using a blind clamp. Are you just going to blindly clamp everything? I'd strongly advise against it - you're making things worse for people downstream who have to fix your mistakes. It's not a very professional way of doing it, and you're going to tick a lot of people off.



    Avisynth (and AvsPmod) is nice because you have these scopes (other ones too: RGB histograms, vectorscopes, etc.), and you can make adjustments and "see" both the result and the scopes/readings. There is also a YUV/RGB color picker. It might not be as "glamorous" as some of the professional editing or grading software, but it's free.
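
    For example (a sketch using only the built-in Histogram modes; the file name is just a placeholder):

    Code:
    LSmashVideoSource("cameraout.mp4")
    Histogram("levels")   # per-frame Y / U / V level histograms
    # Histogram("color2") would give a vectorscope-style chroma plot instead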
  22. I'm shooting a brightly-sunlit outdoor scene and there is a heavily-shaded dark area in the picture. The camera will put that dark area at or near digital zero, below 0 IRE (I've had it happen). Now it's out of spec. So there's the mistake.

    Ideally the camera would put it at 16 (0 IRE) but that's not an option with this camera. So I'm doing with ffmpeg what the camera should do in firmware. The user can always bring it down if he wants darker darks. That is very commonly done in professional video.
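
    For reference, a clamp-only variant of the earlier lutyuv command (a sketch; it only clips excursions below 16 or above 235 and leaves everything already in range untouched, so it doesn't shift black or compress the midtones; the output name is just a placeholder):

    Code:
    bin\ffmpeg -i MVI_0003.mp4 -y -vf lutyuv=y='clip(val,16,235)' clamped.mp4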

    A scope without a graticule is pretty useless.
  23. Originally Posted by chris319 View Post
    I'm shooting a brightly-sunlit outdoor scene and there is a heavily-shaded dark area in the picture. The camera will put that dark area at or near digital zero, below 0 IRE (I've had it happen). Now it's out of spec. So there's the mistake.

    Ideally the camera would put it at 16 (0 IRE) but that's not an option with this camera. So I'm doing with ffmpeg what the camera should do in firmware. The user can always bring it down if he wants darker darks. That is very commonly done in professional video.


    I understand why you think you're doing it, and that your camera has limited controls. But you're not doing this in hardware or firmware - the signal has already been recorded. The end user is just going to have to adjust it afterwards anyway. You're not "saving" anything that isn't already there.

    The more times you make adjustments to footage, the more it turns to mush. It's worse with compressed formats and 8-bit. So this means more generation losses (or huge file sizes if you use a lossless format). Nobody is going to want that.

    The one case where you could make a semi valid argument is if a camera model shoots full range (full range data, full range flag, decoded as full range). Your camera does not.
  24. Originally Posted by chris319 View Post
    I'm shooting a brightly-sunlit outdoor scene and there is a heavily-shaded dark area in the picture. The camera will put that dark area at or near digital zero, below 0 IRE (I've had it happen).
    So why did you provide a sample that doesn't show that?
  25. if a camera model shoots full range (full range data, full range flag, decoded as full range). Your camera does not.
    Where do you get that it doesn't? Didn't we get this from MediaInfo?
    Color range : Full
    You're right about generational loss.
  26. Straight out of the camera (post 9)
    https://forum.videohelp.com/threads/385078-Adjusting-H-264-Levels-from-mp4#post2496994

    MVI_0003[1].MP4

    Code:
    Color range                              : Limited
    Color primaries                          : BT.709
    Transfer characteristics                 : BT.709
    Matrix coefficients                      : BT.709
  27. Please look at this clip on your scopes and tell me the level, specifying digital value or IRE units. Also, what other info can you glean from it?

    Lens cap raw.MP4
  28. Black level is a little above Y=16. The lowest Y value I saw is 15, the highest 28. Most pixels are between 18 and 22.

    Code:
    LSmashVideoSource("Lens cap raw.MP4")   # L-SMASH Works source filter
    HistogramOnBottom()                     # user helper (not a built-in) that puts Histogram()'s luma waveform along the bottom of the frame
    [Attachment 43199 - wf.jpg]

    Code:
    LSmashVideoSource("Lens cap raw.MP4")   # L-SMASH Works source filter
    ColorYUV(analyze=true)                  # overlays min / max / average Y, U and V statistics on each frame
    [Attachment 43200 - stats.jpg]
    Last edited by jagabo; 25th Sep 2017 at 12:47.
  29. Really! I'm seeing zero on the HDMI output.

    How about this one?
    Partial Cap Raw.MP4