VideoHelp Forum
  1. Member (Join Date: Nov 2005, Location: United States)
    Hey guys.

    Back when I only had VGA graphics, 16 colors at 640×480 resolution, I always wondered how good video playback could be. At the time, all I had was a 386SX-20 with a 40MB hard drive, so there was no way on earth I'd ever be able to encode a video and play it back. But now hardware is... better : )

    The problem is, the software is too good. Of course I can encode to 640x480, but getting it down to 16 colors is a problem. I don't know of anything that will do that.

    Alternatively, if there's a playback emulator for VGA, that would be cool.

    What do you think? Is it possible to encode a video limited to 16 colors? Or to play back something in 16-color mode?
  2. An effect/filter that reduces the color palette on playback to the standard 16-color VGA palette, plus dithering, seems possible (e.g. SweetFX/Shaders/Nostalgia.fx used in ReShade).

    The results will likely look much better for film content if you use a content-optimized palette, though.
    Last edited by butterw; 22nd Jun 2020 at 02:45.
  3. You can create 16-color GIF animations with ffmpeg in a batch file.

    Code:
    @echo off
    ::** create animated GIF w/ optimized palette
    ::
    :: https://ffmpeg.org/ffmpeg-all.html#gif-2
    :: http://blog.pkh.me/p/21-high-quality-gif-with-ffmpeg.html
    :: http://superuser.com/questions/556029/how-do-i-convert-a-video-to-gif-using-ffmpeg-with-reasonable-quality
    
    if not exist "%~dpnx1" goto :EOF
    cd "%~dp1" 
    
    ::** generate palette
    @echo on
    @echo.
    "g:\program files\ffmpeg\bin\ffmpeg.exe" ^
     -v warning -i "%~nx1" ^
     -vf palettegen=max_colors=16 ^
     -y tmp-palette.png
    
    ::** generate GIF
    @echo.
    "g:\program files\ffmpeg\bin\ffmpeg.exe" ^
     -v warning -i "%~nx1" ^
     -i tmp-palette.png ^
     -lavfi "[0][1:v] paletteuse" ^
     "%~n1.16.gif"
    @echo off
    
    del /q tmp-palette.png
    
    if errorlevel 1 pause
    goto :eof
    You can drag/drop a video onto that batch file and it will output a gif animation with only 16 colors. You will have to change the path to ffmpeg.exe to match where it is on your computer.
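    As an aside, the two passes can also be chained into a single ffmpeg run with one filtergraph (split, palettegen, paletteuse), so no temporary palette file is needed. A rough sketch, here driven from Python with subprocess (the file names are just placeholders; note that ffmpeg buffers frames while the palette is computed, so this can use more memory on long clips):

    Code:
    import subprocess

    # single-run variant: generate a 16-color palette and apply it in one filtergraph
    cmd = [
        "ffmpeg", "-v", "warning", "-i", "input.mp4",          # placeholder input
        "-lavfi", "split [a][b]; [a] palettegen=max_colors=16 [p]; [b][p] paletteuse",
        "-y", "input.16.gif",                                   # placeholder output
    ]
    subprocess.run(cmd, check=True)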

    [Attachment 53894]
    Last edited by jagabo; 21st Jun 2020 at 20:48.
  4. This is where vapoursynth can have an advantage, since it can use other Python libraries, in this case PIL, or nowadays Pillow. An image in PIL format has a function to reduce it to 16 colors.
    So the change_colors function converts a frame from vapoursynth RGB 8-bit to numpy, then to PIL, reduces it to 16 colors, and then goes back to PIL 8-bit, numpy, and vapoursynth RGB 8-bit. Complicated? Yeah! But it's quite fast; those format transfers are mostly memory operations.
    Maybe there is an easier way, most likely; I just grabbed functions that already exist instead of doing manual calculations.

    Code:
    import vapoursynth as vs
    from vapoursynth import core
    from PIL import Image
    import numpy as np
    
    def change_colors(n,f):
        npArray = np.dstack([np.asarray(f.get_read_array(p)) for p in range(3)])
        PIL_img = Image.fromarray(npArray)
        fix_PIL_img = PIL_img.convert(mode='P', palette=Image.ADAPTIVE, colors=16)
        fix_PIL_img = fix_PIL_img.convert(mode='RGB')
        npArray = np.array(fix_PIL_img)
        f_out = f.copy()
        [np.copyto(np.asarray(f_out.get_write_array(p)), npArray[:, :, p]) for p in range(3)]
        return f_out
    
    clip = core.lsmas.LibavSMASHSource('source.mp4')
    clip = clip.resize.Point(format=vs.RGB24, matrix_in_s = '470bg')
    colors16 = core.std.ModifyFrame(clip, clip, change_colors)
    colors16.set_output(0)
    So the output is RGB 8-bit, but it only uses a 16-color (4-bit) scheme.
    There is another issue: this most likely needs some sort of palette handling so the colors stay consistent across frames. Not sure how that VGA 16-color format is organized.
    [Attachment 53896: clip_01__1;1_[720, 480, 0, 0]_frame_0000266.png]
    [Attachment 53897: clip_02__1;1_[720, 480, 0, 0]_frame_0000266.png]

    Last edited by _Al_; 21st Jun 2020 at 21:51.
  5. Instead of this line from the vapoursynth script:
    Code:
    fix_PIL_img = PIL_img.convert(mode='P', palette=Image.ADAPTIVE, colors=16)
    use:
    Code:
    fix_PIL_img = PIL_img.convert(mode='P', dither = Image.FLOYDSTEINBERG, palette=Image.WEB)
    That uses dithering and creates a palette with 16 colors by default, if I'm not mistaken. With dithering it looks much better, like that example from jagabo.
  6. dithered image with 16 colors:
    [Attachment 53905: dith_clip_02__1;1_[720, 480, 0, 0]_frame_0000266.png]

  7. OK, this counts the unique RGB colors in a frame (with the frame as a numpy array):
    Code:
    flat_array = npArray.reshape(-1, 3)
    colors, counts = np.unique(flat_array, return_counts = True, axis = 0)
    That dither sample using palette=Image.WEB is nowhere close to 16 colors; it gives many more unique RGB colors (about 128), so not good.

    But that first solution using palette=Image.ADAPTIVE really does give only 16 colors. I tried quickly to make it work with dithering, but with no success.

    (jagabo's GIFs give at most 15 colors, so there is room for one more color.)
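    For reference, the same count can be run on a saved frame with just Pillow and numpy; a minimal standalone sketch (the filename is only a placeholder):

    Code:
    import numpy as np
    from PIL import Image

    # count the distinct RGB colors in a saved frame
    img = np.array(Image.open("frame.png").convert("RGB"))   # placeholder filename
    flat_array = img.reshape(-1, 3)                           # one row per pixel
    colors, counts = np.unique(flat_array, return_counts=True, axis=0)
    print(len(colors), "unique RGB colors")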
  8. For comparison, that nice ffmpeg solution gave this image for that car:
    [Attachment 53916: ffmpeg_gif_clip_02__1;1_[720, 480, 0, 0]_frame_0000266.png]

  9. Don't mind me talking.
    Getting close to a vapoursynth/Python solution.
    The script selects a palette (16 unique colors) first, and that palette is used for the rest of the images.
    So far there is no averaging of those colors over the whole clip, like perhaps that ffmpeg solution does (not sure), but it probably should be done that way.
    For now a suitable frame is selected, the palette is made from that frame, and that palette is then used for all images. And dithering is used.

    Code:
    import vapoursynth as vs
    from vapoursynth import core
    from PIL import Image
    import numpy as np
    
    colors = 16
    frame_for_palette = 100
    clip = core.lsmas.LibavSMASHSource('source.mp4')
    
    def _get_palette(rgb_frame, colors):
        npArray = np.dstack([np.asarray(rgb_frame.get_read_array(p)) for p in range(3)])
        #print(npArray)
        PIL_img = Image.fromarray(npArray)
        return PIL_img.convert(mode='P', palette=Image.ADAPTIVE, colors=colors).getpalette()
    
    def quantize_colors(n,f):
        npArray = np.dstack([np.asarray(f.get_read_array(p)) for p in range(3)])
        PIL_img = Image.fromarray(npArray).quantize(palette=PIL_ATTR).convert(mode='RGB')
        npArray = np.array(PIL_img)
        #print(PIL_img.getcolors()) #making sure what are unique RGB colors (PALETTE) and what is the count
        f_out = f.copy()
        [np.copyto(np.asarray(f_out.get_write_array(p)), npArray[:, :, p]) for p in [2,1,0]]
        return f_out
    
    rgb = clip.resize.Point(format=vs.RGB24, matrix_in_s = '470bg')
    PALETTE = _get_palette(rgb.get_frame(frame_for_palette), colors=colors-1)
    PIL_ATTR = Image.new("P", (1, 1), 0)
    PIL_ATTR.putpalette(PALETTE)
    #print(PALETTE)
    rgb_palette = core.std.ModifyFrame(rgb, rgb, quantize_colors)
    rgb_palette.set_output(0)
    [Attachment 53918: palette_200_clip_02__1;1_[720, 480, 0, 0]_frame_0000266.png]

    Last edited by _Al_; 22nd Jun 2020 at 22:10.
  10. Using the 16 colors that appear most in an image works well when there are only a few colors in the image (as seen above), but it can lose a lot when there are many colors. Here's a shot from The Wizard of Oz:

    [Attachment 53919]


    and after the "optimized" palette from my earlier ffmpeg batch file:

    [Attachment 53920]


    I was wondering how well a generalized 16-color palette would work for any random video. First I thought that a palette with 5 shades each of red, green, and blue, plus black, could theoretically generate all colors (though not necessarily all intensities of each color), since that's how every digital image builds its colors. But ffmpeg's error diffusion algorithm didn't work well with that: it tended to leave the in-between colors (for example, cyan = blue+green) grey. So I thought I needed shades of those too.

    I started experimenting with that when I also remembered that early graphics cards could display 16 colors -- but not just any 16 colors, only a specific 16. For example, the original IBM CGA card (I had one of those!) had this fixed 16-color palette:

    https://en.wikipedia.org/wiki/Color_Graphics_Adapter#Color_palette

    So I created that exact palette with an AviSynth script:

    Code:
    k = BlankClip(width=16, height=16, color=$000000) # 16x16 black box
    
    w = StackHorizontal(k, k.RGBAdjust(rb=$55, gb=$55, bb=$55), k.RGBAdjust(rb=$AA, gb=$AA, bb=$AA), k.RGBAdjust(rb=$FF, gb=$FF, bb=$FF))
    b = StackHorizontal(k.RGBAdjust(bb=$AA),         k.RGBAdjust(rb=$55, gb=$55, bb=$FF))
    g = StackHorizontal(k.RGBAdjust(gb=$AA),         k.RGBAdjust(rb=$55, gb=$FF, bb=$55))
    c = StackHorizontal(k.RGBAdjust(gb=$AA, bb=$AA), k.RGBAdjust(rb=$55, gb=$FF, bb=$FF))
    r = StackHorizontal(k.RGBAdjust(rb=$AA),         k.RGBAdjust(rb=$FF, gb=$55, bb=$55))
    m = StackHorizontal(k.RGBAdjust(rb=$AA, bb=$AA), k.RGBAdjust(rb=$FF, gb=$55, bb=$FF))
    y = StackHorizontal(k.RGBAdjust(rb=$AA, gb=$55), k.RGBAdjust(rb=$FF, gb=$FF, bb=$55))
    
    StackHorizontal(w,b,g,c,r,m,y)
    AddBorders(0,0,0,240)
    PointResize(16,16)
    cga-palette.png:

    [Attachment 53923: cga-palette.png]
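    If you'd rather skip AviSynth for this step, roughly the same palette image can be written with Pillow instead; a minimal sketch (the 16 RGB triples below are the CGA values from the Wikipedia page above):

    Code:
    from PIL import Image

    # the 16 CGA colors as (R, G, B) triples
    CGA16 = [
        (0, 0, 0),     (0, 0, 170),    (0, 170, 0),    (0, 170, 170),
        (170, 0, 0),   (170, 0, 170),  (170, 85, 0),   (170, 170, 170),
        (85, 85, 85),  (85, 85, 255),  (85, 255, 85),  (85, 255, 255),
        (255, 85, 85), (255, 85, 255), (255, 255, 85), (255, 255, 255),
    ]

    # 16x16 image, one pixel per color in the top row, the rest left black
    img = Image.new("RGB", (16, 16), (0, 0, 0))
    for i, rgb in enumerate(CGA16):
        img.putpixel((i, 0), rgb)
    img.save("cga-palette.png")
    Since paletteuse just reads the colors out of the palette frame, the repeated black pixels shouldn't matter.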


    Using that palette with the Oz image delivered:

    [Attachment 53921]


    And a short GIF animation:

    [Attachment 53922]


    Up close those look very grainy but from a distance almost all of the color of the original image is retained!

    If you want to experiment on your own, the uncompressed RGB oz video is attached. And here's the ffmpeg batch file I used:

    Code:
    "g:\program files\ffmpeg\bin\ffmpeg.exe" ^
     -y -v warning -i "%~nx1" ^
     -i cga-palette.png ^
     -lavfi "[0][1:v] paletteuse" ^
     "%~n1.16.gif"
    [Attached file: ozvid.avi]
    Last edited by jagabo; 22nd Jun 2020 at 22:34.
  11. Wow, so the magic is in those selected colors in the palette, choosing 16 that represent most of the colors.
    Thanks for that cga-palette.png. I used it in the script as the source of the palette for both videos, yours "ozvid.avi" and my source 'beauty in red.avi',
    and it kept the colors surprisingly well, even though the dominant colors are completely different in that car clip.

    One can hardly believe it is only 16 unique colors, so I verified it, in the ozvid.avi case for the first frame, for example. These are pairs of occurrence count and RGB value, which are the RGB values of that palette:
    [(9029, (255, 85, 85)), (13220, (170, 0, 0)), (29166, (170, 85, 0)), (658, (85, 85, 255)), (124391, (85, 85, 85)), (745, (85, 255, 85)), (1723, (0, 170, 0)), (1216, (255, 85, 255)), (39228, (170, 170, 170)), (3, (85, 255, 255)), (1674, (170, 0, 170)), (4909, (255, 255, 255)), (198, (0, 170, 170)), (1937, (255, 255, 85)), (3997, (0, 0, 170)), (75106, (0, 0, 0))]
    Code:
    import vapoursynth as vs
    from vapoursynth import core
    from PIL import Image
    import numpy as np
    
    rgb_palette_clip = core.ffms2.Source('cga-palette.png')
    colors = 16
    clip = core.ffms2.Source('ozvid.avi')
    #clip = core.ffms2.Source('beauty in red.avi') #clip for second image
       
    def _get_palette(rgb_frame, colors):
        npArray = np.dstack([np.asarray(rgb_frame.get_read_array(p)) for p in range(3)])
        #print(npArray)
        PIL_img = Image.fromarray(npArray)
        return PIL_img.convert(mode='P', palette=Image.ADAPTIVE, colors=colors).getpalette()
    
    def quantize_colors(n,f):
        npArray = np.dstack([np.asarray(f.get_read_array(p)) for p in range(3)])
        PIL_img = Image.fromarray(npArray).quantize(palette=PIL_ATTR).convert(mode='RGB')
        npArray = np.array(PIL_img)
        #print(PIL_img.getcolors()) #making sure what are unique RGB colors (PALETTE) and what is the count
        f_out = f.copy()
        [np.copyto(np.asarray(f_out.get_write_array(p)), npArray[:, :, p]) for p in [2,1,0]]
        return f_out
    
    if clip.format.color_family == vs.YUV:
        rgb = clip.resize.Point(format=vs.RGB24, matrix_in_s = '470bg')
    else:
        rgb=clip
    PALETTE = _get_palette(rgb_palette_clip.get_frame(0), colors=colors)
    PIL_ATTR = Image.new("P", (1, 1), 0)
    PIL_ATTR.putpalette(PALETTE)
    #print(PALETTE)
    rgb_palette = core.std.ModifyFrame(rgb, rgb, quantize_colors)
    rgb_palette.set_output()
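    For reference, those count/color pairs are what PIL's getcolors() returns (the commented print in the script above); the same check on a saved frame looks roughly like this, with a placeholder filename:

    Code:
    from PIL import Image

    # list (count, (R, G, B)) pairs for every distinct color in a saved frame
    img = Image.open("frame.png").convert("RGB")   # placeholder filename
    pairs = img.getcolors(maxcolors=256)           # returns None if there are more than 256 colors
    print(len(pairs), pairs)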
    [Attachment 53925: oz_clip_02__1;1_[640, 480, 0, 0]_frame_0000000.png]
    [Attachment 53926: red_beauty_clip_02__1;1_[720, 480, 0, 0]_frame_0000266.png]

    Last edited by _Al_; 22nd Jun 2020 at 23:56.
  12. Originally Posted by _Al_
    Wow, so the magic is in those selected colors in the palette, choosing 16 that represent most of the colors.
    I don't think it has to be exactly those colors. I originally used slightly different RGBCMYK colors and they looked pretty good too. But certainly you need to have the CMY values as well as RGB for the error diffusion to work well.

    Good work on the conversion to vapoursynth.
  13. Just wait until you apply your 16-color VGA palette to some blue sky. Love them contours ...
  14. Like this?

    [Attachment 53931]


    Or this?

    [Attachment 53933]


    I know what you're getting at -- shallow gradients are hard to reproduce with a limited palette. But really, the issue isn't about how good one particular (or one particular type of) video will look. It's how well the 16-color CGA palette can display a broad range of content.
    Last edited by jagabo; 23rd Jun 2020 at 21:10.
  15. I just wanted to add that the palette could be hard-coded in the script, defined by a provided RGB color list rather than by an image:
    Code:
    import vapoursynth as vs
    from vapoursynth import core
    from PIL import Image
    import numpy as np
    
    #palette list has this pattern: R1,G1,B1, R2,G2,B2, .....  max 768 values, which is 768/3=256 max colors
    #if less than 256 colors, zeros need to be padded to 768 count 
    PALETTE = [255,  85,  85,  170,   0,   0,  170,  85,   0,  85,  85, 255,   85, 85,  85,  85, 255,  85,
                 0, 170,   0,  255,  85, 255,  170, 170, 170,  85, 255, 255,  170,  0, 170, 255, 255, 255,
                 0, 170, 170,  255, 255,  85,    0,   0, 170,   0, 0, 0] + [0]*(256-16)*3
    
    clip = core.ffms2.Source('ozvid.avi')
    #clip = core.ffms2.Source('beauty in red.avi') #clip for second image
       
    def quantize_colors(n,f):
        npArray = np.dstack([np.asarray(f.get_read_array(p)) for p in range(3)])
        PIL_img = Image.fromarray(npArray).quantize(palette=PIL_ATTR).convert(mode='RGB')
        npArray = np.array(PIL_img)
        f_out = f.copy()
        [np.copyto(np.asarray(f_out.get_write_array(p)), npArray[:, :, p]) for p in [0,1,2]]
        return f_out
    
    if clip.format.color_family == vs.YUV:
        rgb = clip.resize.Point(format=vs.RGB24, matrix_in_s = '470bg')
    else:
        rgb=clip
    
    PIL_ATTR = Image.new("P", (1, 1), 0)
    PIL_ATTR.putpalette(PALETTE)
    rgb_quantized = core.std.ModifyFrame(rgb, rgb, quantize_colors)
    rgb_quantized.set_output()
    Last edited by _Al_; 23rd Jun 2020 at 22:22.


