VideoHelp Forum




  1. Member
    Join Date
    Sep 2021
    Location
    United Kingdom
    Thanks rgr, Sharc & jagabo. I'm glad it looks ok, and thank you for analyzing the new clip. I will try the code colorYUV(levels="PC->TV") and post it. Does this keep the levels within the range for TVs? Where would I place it in jagabo's script? Thanks.
  2. Originally Posted by SkyBlues2021 View Post
    I will try the code colorYUV(levels="PC->TV") and post it. Does this keep the levels within the range for TVs? Where would I place it in jagabo's script? Thanks.
    You can put it at the beginning or at the end, BUT: I wasn't aware that you already applied a script to your sample. It is better to start with the unprocessed capture rather than tweaking already existing tweaks.
    Why did you all of a sudden capture full range? Maybe my suggested level correction is not even necessary if you capture limited range to begin with.
    Maybe I was missing something?
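    For illustration only (the filename here is just a placeholder), the placement would look something like this:
    Code:
    AviSource("capture.avi")        # the unprocessed capture
    ColorYUV(levels="PC->TV")       # at the beginning ...
    # ... the rest of the processing ...
    # ColorYUV(levels="PC->TV")     # ... or alternatively at the end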
    Last edited by Sharc; 1st Jun 2023 at 14:59.
  3. Member
    Join Date
    Sep 2021
    Location
    United Kingdom
    Thanks Sharc. I've finished capturing my video8 tapes and have now started transferring my digital8 tapes through a firewire cable into WinDV. Sadly I don't have a standalone proc-amp, and I didn't think you could adjust the levels when capturing with a firewire cable?
  4. Member Cornucopia
    Join Date
    Oct 2001
    Location
    Deep in the Heart of Texas
    When capturing via firewire there should be no procamp adjustments, as that would require a re-encode, which is contrary to the idea of a simple, straight transfer of (already encoded) DV data.

    Scott
  5. Originally Posted by SkyBlues2021 View Post
    Thanks Sharc. I've finished capturing my video8 tapes and have now started transferring my digital8 tapes through a firewire cable into WinDV. Sadly I don't have a standalone proc-amp, and I didn't think you could adjust the levels when capturing with a firewire cable?
    I missed that you captured DV via firewire, which is a digital copy, taking the proc-amp out of the equation. I don't know WinDV. If there is an option somewhere for 'limited range' or 'TV range' I would enable it. The levels in your post#22 were fine, limited range.
    If you are going to re-encode your full-range YUV capture anyway I would adjust the luma to stay within the limited (TV) range (16 .....235). You can do this with ColorYUV(levels="PC->TV"). You would then be on the safe side for playback, as your players (including TV) may not support super blacks and super whites properly, but rather clip.

    Edit:
    If you see 4 vertical bars, your player complies with full range.
    Image Attached Files
    Last edited by Sharc; 2nd Jun 2023 at 08:50.
  6. Member
    Join Date
    Sep 2021
    Location
    United Kingdom
    Thanks Sharc & Cornucopia. In the script I've replaced jagabo's ColorYUV command with ColorYUV(levels="PC->TV"); I'm guessing they both achieve the same final result? I can only see 3 vertical lines in Windows Media Player and VLC player. What do I need to do to get my player to comply with full range? Thanks.
    Image Attached Thumbnails: pc tv Avisynth.JPG, Sharc chart.JPG

    Image Attached Files
  7. The color range is now limited ok, but you got elevated darks again and the picture looks washed out.
    Add after ColorYUV(levels=....) something like
    Code:
    SmoothLevels(input_low=24,gamma=1.0,input_high=235,output_low=0,output_high=235)
    which should fit the luma better into the 16....235 range - and look similar to jagabo's in post#10.

    Edit:
    Sorry SkyBlues2021, forget the discussion about the full range and my comments on the encodes. Your encodes are all ok. By mistake I took rgr's full range version of post#26 as yours (and wondered why you would have done this). My bad. Just use jagabo's script and encode, as you did. Apologies for the confusion and trouble I created.
    Last edited by Sharc; 2nd Jun 2023 at 18:06.
  8. Member
    Join Date
    Sep 2021
    Location
    United Kingdom
    Thanks Sharc. You're allowed a moment of confusion, mine never leave me! You chaps on here are fantastic, giving your free time to help people. I'm very grateful for the help you all give me. I'm going to spend the day reading about the ColorYUV commands to try to understand them better. I'm hoping one day I can make a decision for myself!
  9. ColorYUV is pretty simple: off_y adds to Y, gain_y multiplies Y.

    off_y adds the specified value to Y, Y' = Y + N. So off_y=1 adds 1 to every Y value, making every pixel a little brighter. off_y=-1 subtracts 1 from every Y value, making every pixel a little darker. This is the same as Tweak(bright=N, coring=false). All pixels get darker or brighter by the same amount. A waveform monitor will show the entire graph shift up or down by N.
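    A quick worked case of that formula (the value 16 is picked purely for illustration):

    Code:
    Y' = Y + 16     # off_y=16, same as Tweak(bright=16, coring=false)
    0   -> 16
    219 -> 235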

    gain_y multiplies each Y value using the equation Y' = Y * ((256+N)/256) where N is the value you specify. If you use 0 (the default) then the equation becomes:

    Code:
    Y' = Y * ((256+0)/256)
    Y' = Y * 256 / 256
    Y' = Y * 1.0 # the same as Tweak(cont=1.0, coring=false)
    I.e., no change.

    If you enter 256 you get:

    Code:
    Y' = Y * ((256+256)/256)
    Y' = Y *  512 / 256
    Y' = Y * 2.0 # the same as Tweak(cont=2.0, coring=false)
    Every pixel becomes twice as bright.

    If you enter -128 you get:

    Code:
    Y' = Y * ((256-128)/256)
    Y' = Y * 128 / 256
    Y' = Y * 0.5 # the same as Tweak(cont=0.5, coring=false)
    Every pixel becomes half as bright.

    So basically, values over zero increase the contrast, making the picture brighter; values less than zero decrease the contrast, making the picture darker. But dark pixels are changed less than bright pixels. For example, doubling 1 gives 2, a difference of only 1, but doubling 100 gives 200, a difference of 100.

    cont_y is performed first, then off_y, i.e.:

    Code:
    Y' = Y * ((256+cont_y)/256) + off_y
  10. Note that ColorYUV(levels="PC->TV") is the equivalent of ColorYUV(gain_y=-36, off_y=16).
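    Plugging those numbers into the formula from the previous post confirms the mapping:
    Code:
    Y' = Y * ((256-36)/256) + 16
       = Y * 0.859 + 16
    0   -> 16
    255 -> ~235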
  11. Member
    Join Date
    Sep 2021
    Location
    United Kingdom
    Thanks jagabo. I will need to read it a few times and hope it sinks in! What script should I use to get the Waveform and ColorYUV displayed so that I can try and practice adjusting the levels?
  12. Here's a script you can play around with:

    Code:
    function GreyRamp()
    {
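       # builds a 256-pixel-wide horizontal grey ramp (RGB values 0..255), 256 pixels high, 512 frames long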
       black = BlankClip(color=$000000, width=1, height=256, pixel_type="RGB32", length=512)
       white = BlankClip(color=$010101, width=1, height=256, pixel_type="RGB32", length=512)
       StackHorizontal(black,white)
       StackHorizontal(last, last.RGBAdjust(rb=2, gb=2, bb=2))
       StackHorizontal(last, last.RGBAdjust(rb=4, gb=4, bb=4))
       StackHorizontal(last, last.RGBAdjust(rb=8, gb=8, bb=8))
       StackHorizontal(last, last.RGBAdjust(rb=16, gb=16, bb=16))
       StackHorizontal(last, last.RGBAdjust(rb=32, gb=32, bb=32))
       StackHorizontal(last, last.RGBAdjust(rb=64, gb=64, bb=64))
       StackHorizontal(last, last.RGBAdjust(rb=128, gb=128, bb=128))
    }
    
    
    function AnimateGain(clip vid, int gain)
    {
    	ColorYUV(vid, gain_y = gain)
    	Subtitle("gain="+String(gain))
    }
    
    
    function AnimateOff(clip vid, int off)
    {
    	ColorYUV(vid, off_y = off)
    	Subtitle("off="+String(off))
    }
    
    function AnimateCont(clip vid, int cont)
    {
    	ColorYUV(vid, cont_y = cont)
    	Subtitle("cont="+String(cont))
    }
    
    
    function AnimateGamma(clip vid, int gamma)
    {
            gamma = gamma < -255 ? -255 : gamma
    
    	ColorYUV(vid, gamma_y = gamma)
    	Subtitle("gamma="+String(gamma))
    }
    
    #ImageSource("grayramp.png", start=0, end=512, fps=23.976) 
    GreyRamp()
    ConvertToYUY2(matrix="PC.601")
    
    v0 = last
    v1=Animate(0,512, "AnimateGain", -256, 256)
    v2=Animate(0,512, "AnimateOff", -256, 256)
    v3=Animate(0,512, "AnimateCont", -256, 256)
    v4=Animate(0,512, "AnimateGamma", -256, 256)
    
    StackHorizontal(v0.Subtitle("original"), v1,v2,v3,v4)
    
    #VideoScope("bottom")
    TurnRight().Histogram().TurnLeft()
    
    ConvertToRGB(matrix="pc.601")
    Image Attached Files
  13. Member
    Join Date
    Sep 2021
    Location
    United Kingdom
    Thanks jagabo. That looks complicated! I will have a look at it tomorrow, it's something I really need to grasp the concept of. What does the grayramp chart mean? Thanks.
  14. The top row is the waveform monitor for each of the blocks below. The blocks at the bottom show the result of the particular operation with ColorYUV. It shows a greyscale ramp (the source video with 256 identical scanlines, each line running from Y=0 to Y=255) along with its waveform above it. Each block to the right of that shows the effect of each of the ColorYUV luma filters at different values. When it says gain=N that means ColorYUV(gain_y=N) was applied. When it says off=N it means ColorYUV(off_y=N) was applied. Etc.
  15. Member
    Join Date
    May 2005
    Location
    Australia-PAL Land
    Code:
    colorYUV(levels="PC->TV")
    If I wanted to make a file for both PC and TV (IOW, the file could be played on either), can I assume that it would be better to leave this out?
  16. Originally Posted by Alwyn View Post
    Code:
    colorYUV(levels="PC->TV")
    If I wanted to make a file for both PC and TV (IOW, the file could be played on either), can I assume that it would be better to leave this out?
    Your video files are good and safe for PC and TV when the luma (Y) in YUV color space is in the range 16...235 and encoded with the flag 'limited' (x264 default). At playback time YUV is converted to RGB, and the limited YUV range is converted to full-range RGB. This is standard. You don't have to do anything.
    ColorYUV(levels="PC->TV") is a special case of level adjustment which you may use when you know that your YUV video source is full range (=PC range), meaning Y spans the range from 0...255 (and should be flagged as 'full' accordingly). This may cause clipping at playback time when the video player (your PC or TV) expects standard limited range and expands it (full to fuller, so to speak; 0...16 and 236...255 are thrown overboard), meaning the darks and brights are squashed.
    ColorYUV(levels="PC->TV") will shift the Y from 0...255 to 16...235. Use it only when you know (histogram) that your YUV video source is 'full' (it shouldn't be, so yes, normally leave it out).

    When your captures do not fit the limited 16...235 luma range "reasonably well" but extend beyond it, use Avisynth to fine-tune the levels. Levels(), ColorYUV(), Tweak() and similar are tools to manipulate the levels.

    So again, limited range (also called TV range) with Y 16...235 is standard for video (broadcast, DVD, Blu-ray ...) and good and standard for your PC and TV.
    Histogram("levels") and Histogram("classic") are your friends for checking.
    PC players (renderers) usually have a setting to select (force) TV or PC range playback. For video encoding, however, always stick to limited (Y 16...235). Doing anything else just calls for trouble.
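    For illustration, a minimal checking sketch along those lines (the filename is only a placeholder):
    Code:
    AviSource("capture.avi")      # your unprocessed capture
    Histogram("levels")           # check where the luma actually sits
    # only if the histogram shows full range (Y reaching 0 / 255):
    # ColorYUV(levels="PC->TV")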
    Last edited by Sharc; 4th Jun 2023 at 04:28.
  18. Captures & Restoration lollo
    Join Date
    Jul 2018
    Location
    Italy
    In addition to what Sharc properly said, note that AviSynth processing will most of the time expand an input video with levels 16-235 to an output video with full range 0-255 (e.g. when deinterlacing, denoising, or sharpening).

    If needed in the context explained before, add a conversion to limited range (also) at the end of the script.
  18. Also, VirtualDub operates in RGB only as I understand. This is one more reason to keep the luma in limited YUV range.

    As the PC player/renderer settings are sometimes not very clear, and flags and ranges are sometimes conflicting and criss-crossed, one may play the attached file as a test. It is YUV limited format, like captures, DVDs etc. should be.
    Now use a color picker and check the squares labelled 16 and 235. If the picker returns 0,0,0 for the '16' and 255,255,255 for the '235', your player is set up correctly for viewing standard YUV video files.
    Image Attached Files
    Last edited by Sharc; 4th Jun 2023 at 05:38.
  19. Originally Posted by lollo View Post
    In addition to what Sharc properly said, note that AviSynth processing will most of the time expand an input video with levels 16-235 to an output video with full range 0-255 (e.g. when deinterlacing, denoising, or sharpening).
    I don't know of any AviSynth filter that clandestinely converts limited range YUV to full range YUV. Minor excursions outside the 16-235 range are sometimes created as a byproduct of some filters (halos at sharp edges, for example) but that's not a levels conversion (it's the reason limited range is used in the first place -- to leave a little headroom and footroom for those excursions).

    Basically, SkyBlues2021 should not be using ColorYUV(levels="PC->TV") because it's a fixed-function filter. It brings the white level down to the correct range (235) but it also raises the already too high black level of his videos. That makes them look even more washed out. So he'll then have to run a second filter to fix the black level. Just do the entire fix at once using one of the filters discussed here already.
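    For example, something along these lines (the numbers are only placeholders; the exact values depend on the capture):
    Code:
    Levels(24, 1.0, 255, 16, 235, coring=false, dither=true)   # bring the black level and white level into range in one pass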

    The entire video distribution industry is based on limited range YUV. You can't see YUV video -- it must be converted to RGB for your eyes. All properly set up computers, TVs, and software convert that limited range YUV to full range RGB for display. For example, VirtualDub converts limited range YUV to full range RGB in its preview window.

    By the way, originally VirtualDub did all filtering in RGB. But some VirtualDub filters can now work in either RGB or YUV. The default when using RGB filters is to automatically convert with the same limited range YUV to full range RGB rec.601 matrix that is used for the preview windows.
  20. Member
    Join Date
    Aug 2018
    Location
    Wrocław
    Originally Posted by Sharc View Post
    Also, VirtualDub operates in RGB only as I understand. This is one more reason to keep the luma in limited YUV range.
    Most (if not all) video editors work in RGB. While converting an image from YUV full range to RGB (and RGB -> YUV full range) you lose quality unnoticeably; converting from RGB to YUV limited you lose more.
    More and more equipment works in full range (YUV) -- my Sony camcorder (16-255), sport camera (0-255) or Playstation.
    When processing private videos, there is no need to limit yourself and reduce the quality of the recorded video if the original is in full-range.
  21. Captures & Restoration lollo
    Join Date
    Jul 2018
    Location
    Italy
    I don't know of any AviSynth filter that clandestinely converts limited range YUV to full range YUV.
    Not clandestinely, nor with a specific command, but the effect is there. Here is an example with a simple QTGMC:

    [Attachment 71471 - qtgmc.png]
    [Attachment 71472 - qtgmc2.png]

    My old experiment reference here: https://www.digitalfaq.com/forum/video-restore/10776-levels-before-after.html

    In addition, some filters convert internally from TV levels to PC levels for better processing. An example from SMDegrain (although for the motion vectors only). Same for TemporalDegrain2:
    Code:
    # Converts luma (and chroma) to PC levels, and optionally allows tweaking for pumping up the darks. (for the clip to be fed to motion search only)
    # By courtesy of cretindesalpes. (http://forum.doom9.org/showthread.php?p=1548318#post1548318)
    I remember sharpeners also having the same approach, for example LimitedSharpenFaster:
    Code:
    ...
    ex=blankclip(last,width=smx,height=smy,color=$FFFFFF).addborders(2,2,2,2).coloryuv(levels="TV->PC")
     \.blur(1.3).mt_inpand().blur(1.3).bicubicresize(dest_x,dest_y,1.0,.0)
    ...

    edit: QTGMC filtering used was QTGMC(preset="slow", matchpreset="slow", matchpreset2="slow", sourcematch=3, tr1=2, tr2=1, NoiseTR=2, sharpness=0.1) to avoid intensive processing
    Last edited by lollo; 4th Jun 2023 at 09:06.
  22. I have never seen QTGMC change levels. For example, taking the left portion of your sample image and applying QTGMC():

    Code:
    ImageSource("qtgmc.png", start=0, end=23, fps=23.976) 
    Crop(0,0,680,-0)
    ConvertToYV12(interlaced=true)
    StackHorizontal(last.Histogram(mode="levels"), last.QTGMC().Histogram(mode="levels"))
    HistogramOnBottom()
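    # Note: HistogramOnBottom() is presumably a user function rather than a built-in;
    # the earlier script's TurnRight().Histogram().TurnLeft() gives a similar rotated Histogram() display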
    Image
    [Attachment 71476 - Click to enlarge]


    Replacing QTGMC in that script with TemporalDegrain2, SMDegrain, or LimitedSharpenFaster resulted in the same -- no change in levels.
  23. Interesting and surprising. Using a color picker:
    - The white letters of the original are around RGB 240,242,235 (+/- noise). The QTGMCed variant is RGB (255,255,255).
    - The bright spot in the center of the picture has 2 probably clipped RGB components (255,255,205+/-noise) in the QTGMCed variant, while the original .avi is unclipped around RGB(242,248,190), +/-noise.
  24. Captures & Restoration lollo
    Join Date
    Jul 2018
    Location
    Italy
    I have never seen QTGMC change levels.
    Let me explain the flow (see the sketch after the list); my previous images may mix different steps:
    0- source is YUV 16-254
    1- apply levels(16,1.0,254,16,235,coring=false,dither=true) to source -> levels are now 16-235
    2- QTGMC() or other processing
    3- levels are now 0-255
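    A minimal sketch of that flow (assuming QTGMC and its dependencies are loaded; the source file and Levels() values are the ones from this thread):
    Code:
    AviSource("ufo_sII2a_spot_amtv_2_cut.avi")
    Levels(16,1.0,254,16,235,coring=false,dither=true)   # step 1: luma now 16-235
    QTGMC()                                              # step 2: processing
    Histogram(mode="levels")                             # step 3: check - min/max may now reach 0/255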

    org + levels correction
    [Attachment 71477 - org.png]
    processed
    [Attachment 71478 - rest.png]

    org + levels correction
    [Attachment 71479 - org2.png]
    processed
    [Attachment 71480 - rest2.png]

    Here is the video, if you want to try it yourself. My point is/was that even if I fix the levels to 16-235 prior to filtering, after the processing they are 0-255 anyhow. They may relate to small "details" as you mentioned in your first reply, but they are there:

    ufo_sII2a_spot_amtv_2_cut.avi


    Interesting and surprising. Using a color picker:
    The previous image may mix different steps, sorry; not worth an analysis

    edit: you see a Y=11 level instead of Y=16 in the picture because there are other "corrections" in the script, but the concept is the same. Here is a better comparison and the (modified) script:

    org + levels correction
    Image
    [Attachment 71482 - Click to enlarge]


    processed
    Image
    [Attachment 71483 - Click to enlarge]


    Code:
    # "progressive" fields
    # interlaced  fields TFF
    # TemporalDegrain2: Denoiser, Temporal filter with motion compensation
    # modeA for progressive fields
    # modeC_Nnedi3 for interlaced fields
    
    video_org=AviSource("C:\Users\giuse\Videos\Acquisizioni\ufo_sII2a_spot_amtv_2_cut.avi")
    
    video_org_trim=video_org
    
    # cropping 
    	crop_left=8	# | removing black borders on left, top and right; removal of "head switching noise" on bottom
    	crop_top=2	# | 720-(8+24)x576-(2+10)=688x564
    	crop_right=24
    	crop_bottom=10
    video_org_trim_crop=video_org_trim.crop(crop_left,crop_top,-crop_right,-crop_bottom)
    
    # plugins directory
    plugins_dir="C:\Users\giuse\Documents\VideoSoft\MPEG\AviSynth\extFilters\"
    
    	# LevelsLumaOnly_modGMa2
    Import(plugins_dir + "LevelsLumaOnly_modGMa2.avsi")
    	# Levels_GMa
    Import(plugins_dir + "Levels_Gma.avsi")
    
    	# TemporalDegrain2
    Import(plugins_dir + "TemporalDegrain-v2.1.2_modGMa.avsi")
    	# RgTools
    loadPlugin(plugins_dir + "RgTools-v1.0\x86\RgTools.dll")
    	# MaskTools2
    loadPlugin(plugins_dir + "masktools2-v2.2.23\x86\masktools2.dll")
    	# MVTools
    loadPlugin(plugins_dir + "mvtools-2.7.41-with-depans20200430\x86\mvtools2.dll")
    	# FFT3DFilter
    loadPlugin(plugins_dir + "FFT3dFilter-v2.6\x86\fft3dfilter.dll")
    	# FFTW
    loadPlugin(plugins_dir + "LoadDll\LoadDll.dll")
    loadDll(plugins_dir + "fftw-3.3.5-dll32\libfftw3f-3.dll")
    	# Dfttest
    loadPlugin(plugins_dir + "dfttest-v1.9.6\msvc\x86\dfttest.dll")
    
    	# LimitedSharpen
    Import(plugins_dir + "LimitedSharpenFaster.avsi")
    
    	# Nnedi3
    loadPlugin(plugins_dir + "NNEDI3_v0_9_4_55\x86\Release_W7\nnedi3.dll")
    
    # parameters
    # ColorYUV
    	cont_y=-20
    	off_y=-9
    	off_u=5
    	off_v=-5
    # Tweak	
    	sat=1.3
    	cont=1.1
    # Levels
    	# first step
    	input_low_1=17
    	gamma_1=1.05
    	input_high_1=254
    	output_low_1=16
    	output_high_1=235
    	# second step
    	input_low_2=5
    	gamma_2=1
    	input_high_2=255
    	output_low_2=0
    	output_high_2=255
    # denoiser
    	degrainTR=3
    	postFFT=0
    	postSigma=1
    # sharpener
    	strength=250
    	overshoot=50
    
    ### levels and colors correction before filtering
    # ColorYUV used to adjust luma contrast to limit and center the luma histogram, and to correct colors
    # Tweak used to adjust luma contrast and color saturation (hue and brightness not used)
    # LevelsLumaOnly used to adjust luma brightness, luma contrast, and luma gamma
    noise_baseclip=video_org_trim_crop.\
    			ColorYUV(cont_y=cont_y, off_y=off_y, off_u=off_u, off_v=off_v).\
    			Tweak(sat=sat, cont=cont, coring=false, dither=true).\
    			LevelsLumaOnly(input_low=input_low_1, gamma=gamma_1, input_high=input_high_1, output_low=output_low_1, output_high=output_high_1, coring=false, dither=true)
    			# or ColorYUV(gain_y=-20, off_y=1, off_v=-5, off_u=5)
    
    ### Conditional filtering, audio is taken from source 1 (filterChainProgressive)
    ConditionalFilter(noise_baseclip,\
    filterChainProgressive(noise_baseclip, degrainTR=degrainTR, postFFT=postFFT, postSigma=postSigma, strength=strength, overshoot=overshoot),\
    filterChainInterlaced(noise_baseclip, degrainTR=degrainTR, postFFT=postFFT, postSigma=postSigma, strength=strength, overshoot=overshoot),\
    "progressive", "equals", "true", false)
    ConditionalReader("C:\Users\giuse\Videos\Acquisizioni\avs_scripts\gma\14- restore\ufo_sII2a_spot_amtv_2_cut.txt", "progressive", false)
    
    return(last.coloryuv(analyze=true))
    
    
    # filterChainProgressive modeA for progressive fields
    function filterChainProgressive(clip c, int "degrainTR", int "postFFT", int "postSigma", int "strength", int "overshoot")
    {
    ### denoising
    denoised=c #.TemporalDegrain2(degrainTR=degrainTR, postFFT=postFFT, postSigma=postSigma)
    ### sharpening
    sharpened=denoised #.LimitedSharpenFaster(strength=strength, overshoot=overshoot)
    return(sharpened)
    }
    
    # filterChainInterlaced modeC_Nnedi3 for interlaced fields
    function filterChainInterlaced(clip c, int "degrainTR", int "postFFT", int "postSigma", int "strength", int "overshoot")
    {
    ### de-interlacing
    deinterlaced=c.AssumeTFF().nnedi3(field=-2)
    ### denoising
    denoised=deinterlaced #.TemporalDegrain2(degrainTR=degrainTR, postFFT=postFFT, postSigma=postSigma)
    ### sharpening
    sharpened=denoised #.LimitedSharpenFaster(strength=strength, overshoot=overshoot)
    ### interlacing
    interlaced=sharpened.AssumeTFF().SeparateFields().SelectEvery(4,0,3).Weave()
    return(interlaced)
    }
    Last edited by lollo; 4th Jun 2023 at 11:42.
  25. Looking at your new series of 6 pictures, I can't see a level shift of the brights. Your histograms and reported minima/maxima may just include sharpening halos introduced by the processing filters, no?
    Taking the average of 5x5 pixels in order to smooth differences introduced by noise, for the white letters for example:
    1 RGB(235,250,251)
    2 RGB(235,250,247)
    3 RGB(234,252,246)
    4 RGB(235,250,248)
    5 RGB(226,240,244)
    6 RGB(226,240,244)

    The minor differences are caused by the residual noise. I don't see an indication of a systematic level shift by QTGMC etc.; 5 and 6 are even identical.
    So I don't get it, or am I missing something....?

    Edit:
    I did some tests with QTGMC() using test clips. No change in levels, but I see the local deviations along sharp edges, caused by the sharpening of QTGMC(). Adding sharpening filters will make it worse of course. These halos will appear in the histogram() and push the min/max values.
    Last edited by Sharc; 4th Jun 2023 at 17:32.
  26. You're showing over-sharpening halos, not a levels change. Flat white areas (like the middles of the white letters) are the same brightness in both images. Flat dark areas are the same brightness. Only the high contrast edges are different. If you subtract one image from the other you'll see all the major differences are at sharp edges.
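    For reference, a quick way to do that kind of comparison in AviSynth (just a sketch; original and processed are placeholder clip variables, and the Levels() stretch only exaggerates the differences around mid-grey):

    Code:
    Subtract(original, processed)
    Levels(96, 1.0, 160, 0, 255, coring=false)   # stretch around mid-grey so small edge differences become visible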

    Image
    [Attachment 71488 - Click to enlarge]


    If you zoom in on the letters you can clearly see the halos in rest3.png (top of the first T in DOTTOR).

    Image
    [Attachment 71490 - Click to enlarge]


    Those were created by LimitedSharpenFaster. Just because there are a few brighter pixels doesn't mean you've changed the levels.
  27. Captures & Restoration lollo
    Join Date
    Jul 2018
    Location
    Italy
    Looking at your new series of 6 pictures, I can't see a level shift of the brights. Your histograms and reported minima/maxima may just include sharpening halos introduced by the processing filters, no?
    You're showing over-sharpening halos, not a levels change.
    Yes, that's what the processing aims to do.

    Back to my original point in post #47: after processing the levels are expanded, so if I do not want to lose the effect of the filter I may need to shrink the levels back before the RGB conversion, otherwise the sharpening effect I want to introduce will be reduced, because of clipping of the 235-255 range.

    Those were created by LimitedSharpenFaster. Just because there are a few brighter pixels doesn't mean you've changed the levels.
    They are not so few, because the "Loose Maximum" in the analysis of the processed video is 255. And once more, I do not want to lose what I have introduced, even if they are not many.

    In addition, applying only TemporalDegrain2 without LSFmod, the effect is still there, although much reduced (Maximum=247, Loose Maximum=235)
    [Attachment 71500 - td2.png]

    Edit:
    I did some tests with QTGMC() using test clips. No change in levels, but I see the local deviations along sharp edges, caused by the sharpening of QTGMC(). Adding sharpening filters will make it worse of course. These halos will appear in the histogram() and push the min/max values.
    I agree. But the quality of all the best filters, such as QTGMC and TemporalDegrain2, relies on some amount of sharpening at the end of their processing. For what I said before, there is a risk of losing the "improvement" if the expanded levels are not fixed before the RGB conversion.

    Thanks to both of you for your constructive opinions; I revised some of the checks in my flow while providing the samples, thanks to your comments.
  28. Originally Posted by SkyBlues2021 View Post
    I can only see 3 vertical lines in Windows Media Player and VLC player. What do I need to do to get my player to comply with full range? Thanks.
    Your second picture in post#36 indicates that your video player (or browser) ignores the full flag but plays standard YUV video in the range Y=16...235 correctly, as the luma of the 4 bars is mapped to RGB like this (order 1...4 is left to right):

    1 Y=0 -> RGB=(0,0,0)
    2 Y=16 -> RGB=(0,0,0)
    3 Y=235 -> RGB=(253,253,253) small rounding error, should be (255,255,255). Therefore you see a tiny brightness difference between bar 3 and bar 4 which you can ignore
    4 Y=255 -> RGB=(255,255,255)

    So you don't have to change anything with your player I would say.

    Edit:
    Here is the same as in post#35 but flagged 'limited' (as standard YUV video should be):
    Image Attached Files
    Last edited by Sharc; 5th Jun 2023 at 02:43.
  29. Originally Posted by Sharc View Post
    Originally Posted by SkyBlues2021 View Post
    I can only see 3 vertical lines in Windows Media Player and VLC player. What do I need to do to get my player to comply with full range? Thanks.
    Your second picture in post#36 indicates that your video player (or browser) ignores the full flag
    And if you adjust the player or the graphics card's settings to compensate for that -- all your limited range videos (pretty much everything else) will display incorrectly (washed out blacks, dull whites). And what if you use the media player built into your TV? Or some other media player device (Roku, AppleTV, GoogleTV, etc.)? Are all of them going to follow the range flag? This is why I recommend people stay away from full range encoding.
  30. Originally Posted by lollo View Post
    Back to my original point in post #47: after processing the levels are expanded, then if I do not want to loose the effect of the filter I may need to shrink back the levels before the RGB conversion, otherwise the sharpening effect I want to introduce will be reduced, because clipping of 235-255 range.
    Your processing did not change the levels, it produced super-white artifacts. You might want to make those artifacts visible by lowering the white level but then other parts of the picture that were perfect white (Y=235) before processing will now be a light grey instead (Y=~215), making the picture a bit dull. And the rest of the picture proportionally darker. And if you do the same at the dark end your perfect blacks (Y=16) will become dark greys (Y=~32), making the picture look washed out.
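    (Rough arithmetic behind that figure, assuming a straight linear remap of 16-255 down to 16-235:)
    Code:
    Y' = 16 + (Y - 16) * (235 - 16) / (255 - 16)
    Y = 235  ->  Y' ~ 217, i.e. roughly the light grey mentioned above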

    The issue here is your definition of "levels". You are using the term just to describe the range of Y values. I'm using the term to mean the theoretical full black and full white values.


