VideoHelp Forum
  1. I'm sharing some directX pixel shader (.hlsl) control code I've written recently: https://github.com/butterw/bShaders
    They are tested in mpc-hc on intel gpu (integrated graphics) and should be easy to use and modify.

    In Mpc-hc, you need to install them in the Shaders subfolder (you need write permission on that folder to be able to modify them).
    Modifying the source file is necessary to change parameter values (recompilation triggers automatically after saving). For this, it's best to use a text editor with C syntax highlighting, like Notepad++. You can create multiple versions of the same shader based on your preferred parameter values/use cases and switch between them as required.
    With Mpc-hc, you can save Shader Presets (a pre/post-resize shader chain), which you can access directly via Right Click > Shaders. Once set, shaders stay active until you disable them, so creating an empty "Disable" shader preset is useful.

    So Far:
    - barMask (Custom Mask, Aspect Ratio Mask, + Shift) v1.2:
    no programming required: select mode and adjust parameters in #define

    - MaskCrop: mask the sides of the screen (Top, Bottom, Left, Right) and recenter the image. You could use this to mask a logo at the bottom of the screen, for instance (mpc-hc doesn't have a built-in crop feature).
    - MaskBox: a rectangular masking box.
    - RatioLetterbox: View a custom Aspect Ratio. Simulate a 21/9 display on a standard monitor.
    - OffsetPillarbox: view video as a vertical aspect ratio on a standard landscape monitor with borders on the side (ex: 9/16, 1/1, etc.). You can offset the window to compensate for a non-centered video.
    - Image Shift.
    The borders are image zones, which means you can apply any effect on them (not just colored borders).
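    The masking logic boils down to a per-pixel coordinate test against a computed border thickness. A minimal Python sketch of a RatioLetterbox-style test (function and parameter names are mine, not the shader's):

```python
def ratio_letterbox_border(x, y, w, h, target_ratio=2.35):
    """True if pixel (x, y) of a w*h frame lies in the top/bottom border
    when masking to a wider target aspect ratio (illustrative names)."""
    if target_ratio <= w / h:
        return False                  # target not wider than the frame
    visible_h = w / target_ratio      # height of the visible image band
    bar = (h - visible_h) / 2.0       # border thickness, top and bottom
    return y < bar or y >= h - bar
```

    In the shader, the same test selects between the border effect and the sampled image color for each output pixel.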

    - bStrobe (repeating Effect with configurable timings) v1.0
    Ready to use timed color frame Effect.
    Parameters: ScreenColor, Time of first run (Tfirst), Duration (Td), Repetition period (Tcycle), End (Number of runs, End time or No End).
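    For reference, the way these timing parameters combine can be sketched in a few lines of Python (my own reconstruction of the logic, not the shader source; the default values are made up):

```python
def strobe_active(t, Tfirst=5.0, Td=0.5, Tcycle=10.0, runs=3):
    """True if the timed effect should display at Clock time t (seconds):
    the effect shows for Td seconds every Tcycle, starting at Tfirst,
    for a limited number of runs (runs=None means no end)."""
    if t < Tfirst:
        return False
    elapsed = t - Tfirst
    if runs is not None and elapsed // Tcycle >= runs:
        return False                      # past the last run
    return elapsed % Tcycle < Td          # inside the Td window of a cycle
```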


    More shaders at https://gist.github.com/butterw/
    - bSide (ZoneShader_mpc) v1.0
    Side by Side and Box display of Effects

    - bTimeEffect_DePixelate
    a Pixelation-DePixelation Effect/glitch.


    An example with SBS (side-by-side) 3D input: https://forum.videohelp.com/showthread.php?t=398955
    Last edited by butterw; 5th Oct 2020 at 10:16.
  2. intro to pixel shaders (.hlsl, .fx)

    HLSL is a specialized (simple and powerful, but limited) programming language with vector operations and C-like syntax which runs in realtime on the gpu.
    The input video is processed frame by frame. Each frame is processed pixel by pixel by the main function of the .hlsl file.
    The basic scalar datatype is float (float32); float2 and float4 are vector datatypes.
    A pixel has float2 (x, y) coordinates and a float4 (r, g, b, a) color (the alpha channel isn't used by mpc-hc).
    Input/output values are in the range [0, 1.0].

    /* Code for the Invert.hlsl Shader in dx9 */
    Code:
    sampler s0; //the input frame
    
    float4 main(float2 tex: TEXCOORD0): COLOR { //tex: current output pixel coordinates (x, y)   
    	float4 c0 = tex2D(s0, tex);     //sample rgba pixel color (of pixel tex)
    	
    	return float4(1, 1, 1, 1) - c0; //return output pixel color (r, g, b, a)   
    }
    If required, you can also access (all floats):
    - the frame dimensions W, H, and the pixel width px, py
    - the frame Counter or seconds Clock, which start at 0 and are updated each frame.
    There is not much more to it, except that you can use #define expressions and preprocessor instructions:
    #define Red float4(1., 0, 0, 0) //defines the constant color Red.

    Note: dx11 pixel shaders have a more verbose boilerplate syntax. They have to be adapted to run on a dx9 renderer (ex: mpc-hc default EVR-CP renderer).


    Limitations:
    - runs on uncompressed video in realtime: the gpu needs to be able to handle the texture fetches and arithmetic operations used. Not a problem at 1080p even on integrated graphics, unless you need to access many other pixels to calculate each output pixel.
    - no built-in random number generator (PRNG).
    - if hw linear sampling is not supported by the gpu, nearest neighbor is used as a silent fallback.


    In mpc-hc (EVR-CP):
    - you can only access the current frame!
    - you cannot access any external (texture) file.
    - no external parameters: all parameter values are defined in the source file (and there is no GUI slider option).
    - no imports in the shader source file: #include "filename.hlsl" is not supported.
    - no user-defined persistent variables or textures!
    - in a post-resize shader, coordinate calculation is off when not in fullscreen mode, and you can't access the video file resolution, only the screen resolution.
    . . . a post-resize shader applies to the black bars in fullscreen.
    - Clock starts when the player is opened; there is no way of getting the current playtime (this would be useful to trigger transition effects in a playlist, for instance).
    - the shader can't modify the resolution/aspect ratio of the frame provided by the video player (no user-defined resize shaders).
    - shader files are single pass only. You can however create a preset with chained shaders and even pass data between them using the alpha channel.
    - the dx11 renderer doesn't provide backwards compatibility with existing dx9 shaders!


    mpc-hc/be, Pre or Post-Resize Shader chain (can be saved as a shader preset):
    renderer (RGB) > Shader1 > Shader2 > Shader3 > Output (RGB)
    Last edited by butterw; 26th Oct 2020 at 09:03. Reason: Updated Limitations
  3. Thanks for that.

    I know nothing about hlsl, but I did manage to fiddle with one of the existing pixel shaders. Feel free to add it to your collection.... or not.....

    BT.601 to BT.709 [SD][HD]
    Converts the colors regardless of resolution. The original pixel shader only worked for HD, as it was intended to correct a video card driver problem around 700 years ago, but it's handy for correcting SD video encoded from HD sources as rec.709, if the renderer is ignoring any colorimetry information, or if it was encoded with Xvid so it doesn't have any.
    Image Attached Files
  4. I'm guessing at least some avisynth methods could be ported to pixel shaders.

    I saw your FrostyBorders script in your sig. May I ask what method you use to fill these borders https://i.postimg.cc/yNPx5BKV/Frosty-Borders-2.jpg ?

    A heavy 2D gaussian blur would be too gpu intensive for integrated graphics, I am looking for cheaper alternatives.
  5. Originally Posted by butterw View Post
    A heavy 2D gaussian blur would be too gpu intensive for integrated graphics, I am looking for cheaper alternatives.
    The final resizing and blurring steps for one edge are below (the script defaults).
    The main filtering prior to that is TemporalSoften(7, 255, 255, 40, 2).

    Frosty_Left = Blend_Left.GaussResize(BL, OutHeight)\
    .FastBlur(Blur=50, iterations=3, dither=true)\
    .AddGrain(var=5.0, constant=true)

    AddGrainC.
    http://avisynth.nl/index.php/AddGrainC
    FastBlur.
    https://forum.doom9.org/showthread.php?t=176564
    http://avisynth.nl/index.php/FastBlur
    It's closed source though. I can't remember if there's a reason for that. You might have to contact wonkey_monkey to see if he'll share source code if you need it.

    The script originally blurred by running the borders through the QGaussBlur function a couple of times (buried here), but then I discovered FastBlur and the result was better.
    Last edited by hello_hello; 29th Jun 2020 at 13:35.
  6. Unfocused effect looks nice, but big radius blurs don't come cheap (requires huge kernel size >5*Sigma).

    fastblur uses 3 passes of boxblur to approximate a gaussian blur.
    The implementation of boxblur can be heavily optimized (the optimizations will differ between cpu and gpu).
    Also, for a gaussian blur you would need to hardcode the filter kernel coefficients in the pixel shader, which is troublesome when you want to try different parameters.
    Game engines have heavily optimized multipass shader implementations of blur, but even then they rely on the power of a discrete gpu.
    I don't know if a perf-optimized implementation can run even at 720p30 on old integrated graphics.

    A naive implementation of a single-pass 2D boxblur (3x3 kernel) with a gpu pixel shader:
    - Mean function: the effect is barely visible in a single pass with such a small kernel.
    - Per pixel: 9 texture lookups, 15 arithmetic operations.

    Better results with an optimized 2-pass gaussian blur (9-tap): bShaders\blurGauss.hlsl >> blurGauss_Y.hlsl
    - Per pixel: 2 passes * (5 texture, 8 arithmetic)
    To increase the blur, you can apply the shader multiple times (ex: 5 times would be equivalent to a 45-tap shader).

    For heavier blur with pixel shaders, Kawase (or Dual Kawase) method seems to be the way to go:
    https://software.intel.com/content/www/us/en/develop/blogs/an-investigation-of-fast-re...lgorithms.html

    Optimized boxblur (3x) separable filter, radius parameter k, kernel size: 4*k-1
    (2*k texture, 17 arithmetic) *6 iterations with downscaling/upscaling.
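    The box-blur approximation is easy to verify numerically: by the central limit theorem, chaining box kernels converges toward a gaussian bell shape. A quick Python check of the effective 1D kernel of 3 chained radius-1 box blurs:

```python
def convolve(a, b):
    """Full 1D convolution of two kernels."""
    out = [0.0] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            out[i + j] += x * y
    return out

box = [1/3, 1/3, 1/3]                      # one radius-1 boxblur pass
k = convolve(convolve(box, box), box)
# k = [1, 3, 6, 7, 6, 3, 1] / 27: a 7-tap bell curve, already close to a gaussian
```

    The same reasoning explains why repeating the 9-tap gaussian pass widens the effective kernel.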
    Last edited by butterw; 26th Nov 2020 at 08:39. Reason: added intel link, boxblur info
  7. Intro to glsl fragment shaders (mpv .hook, .glsl, .frag)

    fragment shaders (.glsl) are the OpenGL equivalent to DirectX pixel shaders (.hlsl).
    A significant amount of open-source .glsl shader code is available because they are used in linux, android, and WebGL.
    There are some differences between glsl and hlsl, but the syntax is similar and porting code between hlsl and glsl is typically possible.
    - Ex, vector type names: vec4 instead of float4

    The cross-platform mpv video player allows different types of user pixel shaders (in glsl based mpv .hook format). There is currently only a limited library of effects shaders available for mpv however, probably because some adaptation work is required to create a .hook shader from glsl code.
    More information on .hook shader syntax in the mpv manual: https://mpv.io/manual/master/#options-glsl-shader

    mpv processing pipeline:
    Video file, ex: yuv420 720p > gpu hw-dec >> LUMA, CHROMA source > NATIVE (resizable) >> yuv-to-rgb conversion >> MAIN (resizable) >> (LINEAR) > PREKERNEL >> scale >> OUTPUT > Screen, ex: rgb32, 1080p


    The following example is an Inversion pass applied to source LUMA. Tested with mpv v0.32 (Win10, gpu-api=d3d11):

    Code:
    //!HOOK LUMA
    //!BIND HOOKED
    //!DESC Invert Luma
    
    /* mpv: "./shaders/InvertLuma.hook" */
    vec4 hook() { // main function
        float luma = HOOKED_texOff(0).x;  
        return vec4(1.0 - luma); // return output pixel value
    }
    // uniform program parameters, updated every frame: "random" is a PRNG value in [0, 1.0], "frame" is an integer frame counter.

    Mpv shaders have fewer limitations than user shaders in mpc-hc/be:
    - selective pass execution (HOOK point, WHEN condition)
    - multipass shaders (can use output textures of previous passes)
    - source shaders (ex: LUMA shader working on yuv420 from video decoder)
    - chroma upsampling shader
    - pre-scalers (custom high-quality resizers)
    - compute shaders (code runs by thread in workgroup blocs with shared memory).
    . . . . CS5.0 gpu limitations: number of threads (ex: 32*32=1024), 32KB of shared memory (ex: 88x88*float or 48x48*float3) !!!

    mpv features:
    - different configuration [profile] sections can be triggered based on input

    Limitations
    - Pre-scaler textures can be resized at the input or output of a pass (with WIDTH and HEIGHT parameters, I assume using hw linear sampling). The size of the output frame cannot be changed, however (textures will be resized by the selected built-in scaler to fit the frame).
    - Shaders can read an embedded hex-encoded texture (ravu uses this for a 2D-LUT), but different platforms may use different data formats and no tool is provided to convert to the required format (ex: rgba16hf). An external png texture would be much easier for the user to deal with.
    - The input alpha channel is 1 (not 0!), and the output alpha channel isn't used. You can't use the alpha channel to pass data to the next shader pass.

    mpv disadvantages:
    - steeper learning curve, not as easy to make changes (ex: no GUI to change shader presets); modifying a shader (ex: changing a parameter value) requires a player restart.
    - on windows, compiler output is only displayed when there is an error in the shader (blue screen) and mpv is launched from the command line. The compiler is maybe a bit more picky (ex: type conversions need to be explicitly specified). It also doesn't report on performance (texture fetches, number of math ops).
    Last edited by butterw; 26th Nov 2020 at 08:47. Reason: +Limitations
  8. Compute Shader pass (mpv glsl .hook)

    To demonstrate compute shader workgroups, we color in blue the bloc with ID (2, 1) in the output image.
    Each thread of a Workgroup has local and global (integer vector) IDs and gets executed in parallel.
    The required number of Workgroups will be created to process the input frame.

    Code:
    //!HOOK MAIN
    //!BIND HOOKED
    //!COMPUTE 16 16 //Workgroup bloc size (x, y) is defined as 16*16 threads, with 1 thread per input pixel.
    //!DESC mpv Compute Example
    
    #define WkgID gl_WorkGroupID
    #define LoID gl_LocalInvocationID
    #define GlobID gl_GlobalInvocationID
    #define Blue vec4(0, .5, 1, 1)
    
    void hook() { //per-thread execution, in WorkGroup blocks
        vec4 color = HOOKED_tex(HOOKED_pos); //read input texel
        if (WkgID.x==2 && WkgID.y==1) color = Blue;
        ivec2 coords = ivec2(GlobID); //Global (x, y) ID for pixel threads ex: (100, 240)
        imageStore(out_image, coords, color);
    }

    Compute Shaders have a slightly different syntax vs fragment shaders (they don't return a pixel color value, but instead output an image) and threads have integer IDs. Compute shaders (CS) have more control over threads than fragment shaders, leading to better performance in some applications. However this also means they are more complex to write and optimize. The main added feature is fast per workgroup shared memory, which is useful for convolution kernels for instance.

    CS in mpv:
    - input frame resolution / workgroup size: number of workgroups.
    . . . you choose the number of threads in the workgroup (can be less than 1 thread per pixel)
    - input resolution: output resolution
    . . . it's possible to do multiple writes to the output image
    - group shared memory must be initialized before it can be used.
    - group threads are executed in parallel: thread concurrency must be handled.
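    The dispatch arithmetic is simple enough to sketch in Python (an illustration; mpv and the driver do this for you):

```python
def workgroups(w, h, bx=16, by=16):
    """Workgroup grid needed to cover a w*h frame with bx*by blocs,
    1 thread per pixel (the last groups may hang past the frame edge)."""
    return (w + bx - 1) // bx, (h + by - 1) // by

def global_id(wkg, local, bx=16, by=16):
    """gl_GlobalInvocationID = WorkGroupID * bloc size + LocalInvocationID."""
    return wkg[0] * bx + local[0], wkg[1] * by + local[1]

# a 1080p frame needs a 120 x 68 grid of 16*16-thread workgroups
```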
    Last edited by butterw; 12th Nov 2020 at 16:38. Reason: +more details
  9. # Embedded textures in Mpv Shaders

    Mpv allows textures (1D, 2D or 3D) embedded in .hook shaders (as described in the texture block section of the mpv manual). Such textures could be used by pixel/compute shaders for palettes, look-up-tables (LUT), pre-generated noise patterns, fonts, logos, patterns, etc.
    Here's an example showing how to do it.
    The full source code is available here

    The following metadata defines a texture named DISCO of Width 3 and Height 2.
    Textures can be much larger, but I don't know what the practical limit is yet.
    Data format is rgba8 (normalized unsigned integer with 8bit per component, Unorm8).
    The Sampling filter is NEAREST neighbor (could be LINEAR) and the Border mode is set to REPEAT.

    Code:
    //!TEXTURE DISCO
    //!SIZE 3 2
    //!FORMAT rgba8
    //!FILTER NEAREST
    //!BORDER REPEAT
    00ff11ffff00001100000000ffffffff00ff11ffff00ffaa
    !!! The data format must be supported by the gpu/gpu-api (dx11 here).
    The list of data formats supported by gpu-api is listed (under Texture formats) in the debug-log: mpv --log-file=mpv_log.txt video.mp4
    The actual data is a string of bytes in hexadecimal notation that define the raw texture data, corresponding to the format specified by FORMAT, on a single line with no extra whitespace. The length of the data must match the texture exactly, otherwise the texture will not be loaded, and the error will be reported in the launch terminal.
    The xy texture values run line by line:
    0 1 2
    3 4 5

    Converting decimal values to hex-encoded binary really requires a conversion tool (and none is provided!), but it is easy enough in the case of rgba8:
    ff000011 is the color Red (255, 0, 0), or #FF0000 as a hex-color, with an alpha channel value of 17 (each 0-255 Unorm8 component maps to two hex digits).

    In mpv 0.32/dx11, rgba16f didn't work for me; only rgba16hf was available. Encoding to the hex fp16 format is not so simple: each value is a 1-bit sign (S), a 5-bit exponent (E), and a 10-bit mantissa (M), packed into 4 hex digits.
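    Since no conversion tool is provided, a few lines of Python can generate both formats (the stdlib struct module packs fp16 with the 'e' format code; the function names, and the little-endian byte order within a component, are my assumptions):

```python
import struct

def rgba8_hex(pixels):
    """Hex string for a //!TEXTURE data line in FORMAT rgba8,
    from a list of (r, g, b, a) tuples in 0-255."""
    return "".join("%02x%02x%02x%02x" % p for p in pixels)

def fp16_hex(values):
    """Hex string for half-float components (ex: FORMAT rgba16hf),
    each float packed little-endian as an S/E/M fp16 (4 hex digits)."""
    return "".join(struct.pack("<e", v).hex() for v in values)

# red with alpha 17, as in the example above -> "ff000011"
# 1.0 as fp16 is 0x3c00, stored little-endian -> "003c"
```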
    Last edited by butterw; 25th Nov 2020 at 09:21.
  10. # 3D lut gpu shaders for Mpv

    I've now figured out how to convert .cube and .haldclut color luts to .hook shaders.
    https://github.com/butterw/bShaders/tree/master/mpv/lut

    So far I've just uploaded a sepia shader, but will be looking to expand to a Top-10 Color Effects shader Collection (there are lots of luts available).

    For tone and black&white shaders using a small lut seems adequate (lut-4, 64 values).
    The shader approach achieves much better performance than on cpu, meaning it will play smoothly at 1080p on an old PC.
    Larger clut-64 with 512x512 values are possible (shader filesize: 2MB, rgba8: 1MB).
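    For the curious, the lookup such a shader performs amounts to quantizing the input color onto the lut grid. A hedged Python sketch (the channel ordering and the interpolation used in the actual .hook shaders may differ):

```python
def lut_index(r, g, b, n):
    """Flat index of the nearest entry in an n*n*n color lut for an
    rgb input in [0, 1], assuming red varies fastest in the table."""
    q = lambda v: min(n - 1, int(v * (n - 1) + 0.5))   # nearest grid point
    return q(r) + n * q(g) + n * n * q(b)

# lut-4 holds 4**3 = 64 entries; clut-64 holds 64**3 = 512*512 entries
```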
    Hi there, Butterw -- Thanks for posting all this. Has been a useful crash course in a way the MPV docs haven't.

    I can't tell from browsing your code: do any of your effects accept an external (non-MPV) variable as a uniform for the shaders? I'm trying to pass in a vec4 rgba color correction, maybe via JSON-IPC, but can't seem to figure out the best approach there. In theory, I could constantly create new shaders with the values built in on the fly and load/unload them on MPV with JSON-IPC, but that seems clunky when all I really need is to change the value of a uniform.

    Thanks again!
  12. Is there a collection/list of mpv compatible .glsl shaders somewhere ? They seem to be spread all over the place, or not in mpv compatible format

    (I'm more interested in video/image processing, less interested in realtime playback)
    Can the glsl shaders from https://github.com/libretro/glsl-shaders be easily converted to mpv/libplacebo compatible shaders?
  14. Originally Posted by _DigiT View Post
    Hi there, Butterw -- Thanks for posting all this. Has been a useful crash course in a way the MPV docs haven't.

    I can't tell from browsing your code: do any of your effects accept an external (non-MPV) variable as a uniform for the shaders? I'm trying to pass in a vec4 rgba color correction, maybe via JSON-IPC, but can't seem to figure out the best approach there. In theory, I could constantly create new shaders with the values built in on the fly and load/unload them on MPV with JSON-IPC, but that seems clunky when all I really need is to change the value of a uniform.

    Thanks again!
    No, in this case you must change the correction value in the source code of the shader itself (use a variable with #define) and reload the shader.
    The alternative is to have multiple shaders each one with a different set of parameter values.

    Shaders can access variables in pre-defined registers, but mpv doesn't allow the user to set these values.
  15. Originally Posted by poisondeathray View Post
    Is there a collection/list of mpv compatible .glsl shaders somewhere ? They seem to be spread all over the place, or not in mpv compatible format

    (I'm more interested in video/image processing, less interested in realtime playback)
    Mpv shaders are written in glsl using a specific .hook syntax and can only be used in the mpv video player.

    There is no complete list of available shaders.
    Some shaders are listed here: https://github.com/mpv-player/mpv/wiki/User-Scripts#user-shaders

    Edit: I don't know what tools (if any) are currently available for video processing using gpu shaders and what glsl type syntax they require.
    Last edited by butterw; 27th Jun 2022 at 11:30.
  16. Originally Posted by Selur View Post
    Can the glsl shaders from https://github.com/libretro/glsl-shaders be easily converted to mpv/libplacebo compatible shaders?
    Some can likely be ported, but you need a basic understanding of glsl/hook syntax.

    It is not possible to convert shaders using features of libretro not available in mpv.
  17. https://github.com/butterw/bShaders/tree/master/A-pack

    Increasingly, PC video players are used to watch internet video (directly or downloaded with yt-dlp), and these videos could often benefit from some quick adjustment. Modern mobile apps (ex: photo editors) have easy-to-use shaders for image adjustment; there is currently nothing equivalent for PC video players. So I will soon be releasing a small updated pack of essential pixel shaders (available as dx9 hlsl, dx11 hlsl and mpv hook, license: GPL v3), together with instructions on how to install/use them with mpc-hc/be and mpv.

    The shaders will typically have no more than one customizable parameter, usually in the range [-1, 1] with 0 meaning no effect.
    The default values need to make sense, avoiding the need to edit them after installation.
    The shaders are typically off, and you switch one on when needed (via menu or keyboard shortcut).

    Use case:
    - The video is too dark
    - The video is over-exposed
    - The video lacks contrast
    - The darks are not dark enough
    - The colors are weak
    - The video is over-saturated
    - I want to watch the video dimmed down, in black&white, with a color filter, etc.

    For the initial release (bShaders pack v1.0 "Color") I am thinking of including the following color adjustments:
    - brighten1: lift(0.05)
    - brighten2: Exposure(0.20)
    - Shadows: darken(-0.10)
    - Contrast: expand rgb(10, 240) to full
    - Vibrance(0.30): saturate color
    - desaturate: Saturation(-0.40)
    - dim: Exposure(-0.35)

    and some color filters (ex: black&white)
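    For reference, here are the usual formulas behind adjustments like these, per channel on [0, 1] values (a sketch of the general technique; the pack's exact math may differ):

```python
def lift(c, amount=0.05):
    """Raise the blacks while keeping white at 1.0 (brighten1-style)."""
    return c + amount * (1.0 - c)

def exposure(c, stops=0.20):
    """Exposure in stops: scale by 2**stops, clipped to 1.0."""
    return min(1.0, c * 2.0 ** stops)

def expand(c, lo=10 / 255, hi=240 / 255):
    """Stretch [lo, hi] to full range (the Contrast: expand rgb(10, 240) idea)."""
    return min(1.0, max(0.0, (c - lo) / (hi - lo)))
```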

    Suggestions welcome.
    Last edited by butterw; 29th Jun 2023 at 16:27.
  18. Custom-shader with tunable parameters (libplacebo, mpv --vo=gpu-next)

    example:
    https://github.com/butterw/bShaders/blob/master/mpv/bShadows_next.hook

    use:
    with default values: > mpv --vo=gpu-next --glsl-shaders="s/bShadows_next.hook;" video1.mp4
    with parameter shadows=0.10 and mode=8:
    > mpv --vo=gpu-next --glsl-shaders="s/bShadows_next.hook;" --glsl-shader-opts=shadows=0.10,mode=8 video1.mp4

    With vo=gpu-next, parameter values can be passed to the shader via command-line. ! Parameter names are case-sensitive.
    To achieve this, the new libplacebo PARAM block must be used (! .hook is not backwards compatible with vo=gpu).
    see: https://libplacebo.org/custom-shaders/#param-name

    Edit: With vo=gpu-next it's possible to write a shader whose parameter values are tunable without the need to edit the shader. The alternative is to have one shader per set of parameter values (ex: bShadows.10.hook, bShadows.15.hook, bShadows-10.hook, etc.), however this approach becomes impractical if there are many possible values for the parameter and/or multiple tunable parameters.
    Last edited by butterw; 30th Jun 2023 at 09:08.
    I wonder... I wrote some code which checks the image for the darkest and the brightest color, then stretches every pixel so that the darkest pixel becomes black and the brightest becomes (255, 255, 255) (not exactly: yellow doesn't become white, lol). Well, here is my code, can anybody make this into a shader? My function uses 255 color indices instead of 1.0 in floating point, but that's easy to convert.
    Code:
    function imglight(dst:timage, l#=255, src:timage=null)
        if not src then src=dst
        local pix:tpixmap=lockimage(dst)
        local minc%[]=[$7fffffff,$7fffffff,$7fffffff]
        local maxc%[3]
        for local y%=0 until src.height
            for local x%=0 until src.width
                for local a%=0 until 3
                    if (maxc[a]<src.pixmaps[0].pixels[(y*dst.width+x) shl 2+a]) then maxc[a]=src.pixmaps[0].pixels[(y*dst.width+x) shl 2+a]
                    if (minc[a]>src.pixmaps[0].pixels[(y*dst.width+x) shl 2+a]) then minc[a]=src.pixmaps[0].pixels[(y*dst.width+x) shl 2+a]
                next
            next
        next
    
        for local a%=1 until 3
            if (minc[0]>minc[a]) then minc[0]=minc[a]
            if (maxc[0]<maxc[a]) then maxc[0]=maxc[a]
        next
    
        local delta%=maxc[0]-minc[0]
        if (delta-l=0) then return
        for local y%=0 until dst.height
            for local x%=0 until dst.width
                for local a%=0 until 3
                    local c%=min(255,(src.pixmaps[0].pixels[(y*src.width+x) shl 2+a]-minc[0])*l/delta)
                    pix.pixels[(y*dst.width+x) shl 2+a]=c
                next
            next
        next
        unlockimage dst
    endfunction
    Last edited by sinjin; 15th Dec 2023 at 14:28.
    Pixel shaders work on individual pixels, not on the whole image: you can't get the darkest and brightest pixel of the image.
    Can 2 shaders communicate with each other? So the 1st shader could collect the data and then send it to the 2nd shader. Also, in some shaders I saw you can get the width and height; what is that for?
    Last edited by sinjin; 16th Dec 2023 at 11:01.
  22. You can chain shader passes. A compute shader could get the min, max rgb of the image, then you could perform the rgb stretch.
    This would only be available in mpv or reshade and getting it to work might not be trivial.

    With a libplacebo shader in mpv, you could pass the min/max frame values to a pixel shader using a script. You would still need to find a way to calculate min/max.

    Overall, using Avisynth+ might be a simpler alternative to shaders.
    Yep, I was looking yesterday for avisynth scripts too, but I guess I didn't search long enough. Another idea came to mind: if a shader can save data for the next frame, it could collect the min and max of one frame while applying the main code, and use them on the next frame. It might flicker a little on cuts, but better than nothing maybe. Thanks for your help.
    I'm just tired of adjusting brightness and stuff for older washed-out videos.
    Maxing contrast per frame isn't actually a good approach for video. Some scenes are supposed to have less contrast/brightness than others; you are better off applying a fixed increase per video.
    Have you tried my existing shaders for this, ex: sCurve? Maybe you would need to adjust the Strength parameter once, but my guess is that it would work pretty well afterwards.
    S-curve with higher values is not bad, but can you write my shader so I can see the difference? If people watch as intended, then why do shaders exist in the first place? And btw, instead of a slider which I have to adjust for every movie (and sometimes in between), now I have code lol. I just want the best image every time, automatically ^^
  26. Though it may work OK in some cases, here are 3 reasons why per frame automatic histogram stretching isn't going to work with video:
    1) max and min are imperfect measures because a single pixel (ex: noise) is enough to jinx the value. A statistical measure (ex: 1% Low, 1% High) would be more meaningful.
    2) Logos, Black Bars added in editing frequently use Whites/Blacks, they will mask image characteristics.
    3) No temporal coherency. When it fails, it will fail bigly.

    You can also try expand10_240, which is a fixed version of histogram expansion. This may clip, so only use it if the source requires it.
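    Point 1 is easy to illustrate in Python, using robust percentiles instead of raw min/max (an illustration of the idea, not shader code):

```python
def stretch(values, low_pct=0.01, high_pct=0.01):
    """Histogram stretch of channel values in [0, 1], anchored on the
    1% Low / 1% High points so a single noise pixel can't jinx the range."""
    s = sorted(values)
    lo = s[int(low_pct * (len(s) - 1))]
    hi = s[int((1.0 - high_pct) * (len(s) - 1))]
    if hi <= lo:
        return list(values)              # flat frame: leave untouched
    return [min(1.0, max(0.0, (v - lo) / (hi - lo))) for v in values]
```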
    After getting the values, you could smooth them with older values from 2 frames ago, or add some speed parameter, so the effect would not be so sudden per frame. As for a logo, simply leave out an area when sampling the values.

    I gave you source code and so many ideas now, maybe somebody else is willing to make a shader like this then? Thanks
    Here is even what it would look like, and I know of a 3rd method which can even bring back the colors.
    [Attachment 75613]
    Is there any video tutorial on how to use this barMask shader? How can I switch between the 4 modes (Maskbox, MaskCrop, RatioLetter...)? I haven't done this before. I loaded some of your shaders from github, like vibrance.35, and it looks fantastic.

    Also, in this image there is "crop 2.35". Is this a shader which crops the image from 16:9 to 2.35:1?
    https://github.com/butterw/bShaders/blob/master/MPC-HC_Shaders-Menu.jpg

    + is there any difference between vibrance.35 in A-Pack/Shaders and A-Pack/Shaders11 ?

    Thank you!
    Last edited by TheCage; 5th Jan 2024 at 12:08.
  30. Originally Posted by TheCage View Post
    Is there any video tutorial on how to use this barMask shader? How can I switch between the 4 modes (Maskbox, MaskCrop, RatioLetter...)? I haven't done this before.
    You change the Mode parameter at the top of the shader to the desired value, then save.
    #define Mode 3
    In mpc-hc, the modification auto-applies. In mpc-be you would need to restart the player.
    Sometimes there are additional parameters, ex: RatioLetterbox (Ratio>1)
    #define Ratio 2.35 //(21/9=2.333, 32/9=3.556)

    Also, in this image there is "crop 2.35". Is this a shader which crops the image from 16:9 to 2.35:1?
    I think it was masking rather than actual cropping. Image masking&shifting is quite simple with shaders.

    + is there any difference between vibrance.35 in A-Pack/Shaders and A-Pack/Shaders11 ?
    There shouldn't be (other than a slightly different syntax). Dx11 shaders are needed for mpc Video Renderer.


