VideoHelp Forum

  1. Hello everyone,

    I'm looking for a way to compare two (possibly) different video streams of the same source via Python or another kind of software.

    Long story short, I got a WEB rip of a TV show, and recently the Blu-ray got released (possibly with corrected animation frames).

    Is there any way I can run a script for ep 01 from WEB vs BD and see which frames are different?

    Thanks
  2. download this: https://files.videohelp.com/u/198160/Comparator.7z , unpack it into a directory, select both videos in Windows Explorer and drop them onto Comparator.exe.

    It is a very early version from a while ago, which I'm slowly finishing into a proper package, but I bet it will work. Use keys '1' and '2' to switch between clips, spacebar for playback, mouse wheel to zoom in and out, etc. It is a self-contained package and does not install anything.
  3. I usually use AviSynth to compare videos. You can stack them side by side or vertically. Or you can interleave frame by frame. The former are good for seeing the videos in motion or for quick scans for missing segments. The latter for pixel peeping between the two videos (open the script in an editor like VirtualDub2 and use the arrow keys to move back and forth between the same frame of two videos, it's easy to see even minute differences).
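    For anyone who'd rather prototype the same ideas in Python: the stacking and interleaving AviSynth does can be mimicked on decoded frames with numpy. A hedged sketch; the function names here are illustrative, not AviSynth's API:

    ```python
    import numpy as np

    def stack_horizontal(frame_a, frame_b):
        # Rough equivalent of AviSynth's StackHorizontal for two same-size frames
        assert frame_a.shape == frame_b.shape, "frames must match in size"
        return np.hstack([frame_a, frame_b])

    def interleave(frames_a, frames_b):
        # Rough equivalent of Interleave(): a0, b0, a1, b1, ...
        out = []
        for a, b in zip(frames_a, frames_b):
            out.extend([a, b])
        return out

    # Toy 4x4 grayscale "frames" standing in for decoded video frames
    a = np.zeros((4, 4), dtype=np.uint8)
    b = np.full((4, 4), 255, dtype=np.uint8)
    print(stack_horizontal(a, b).shape)  # (4, 8)
    print(len(interleave([a, a], [b, b])))  # 4
    ```

    Stepping through the interleaved frames in a viewer then flips between the same frame of both videos, which is the pixel-peeping workflow described above.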
  4. Thanks guys!

    I was actually wondering if it's possible to do this automatically, without me needing to go frame by frame.
  5. AviSynth or VapourSynth, but you mentioned Python, so perhaps working with it is no problem. You can use Python with the skimage, opencv and matplotlib modules to get a visual graph output:
    Code:
    #python3
    #variation of the example from the link below
    #https://stackoverflow.com/questions/56183201/detect-and-visualize-differences-between-two-images-with-opencv-python
    
    from skimage.metrics import structural_similarity
    import cv2
    import matplotlib.pyplot as plt
    
    def process_img(image1, image2):
        # Convert frames to grayscale
        image1_gray = cv2.cvtColor(image1, cv2.COLOR_BGR2GRAY)
        image2_gray = cv2.cvtColor(image2, cv2.COLOR_BGR2GRAY)
    
        # Compute SSIM between the two frames; score is between 0 and 1
        # (1.0 = identical), diff is the actual per-pixel difference as floats.
        # (1 - score) * 100 turns the similarity into a percent difference.
        (score, diff) = structural_similarity(image1_gray, image2_gray, full=True)
        return (1.0 - score) * 100
    
    vidcap1 = cv2.VideoCapture(r'D:\videos\vid1.mp4')
    vidcap2 = cv2.VideoCapture(r'D:\videos\vid2.mp4')
    y_axis = []
    frame_number = 0
    while True:
        success, image1 = vidcap1.read()
        if not success:
            break
        success, image2 = vidcap2.read()
        if not success:
            break
        y_axis.append(process_img(image1, image2))
        frame_number += 1
    
    vidcap1.release()
    vidcap2.release()
    
    x_axis = list(range(frame_number))
    plt.title("Differences between videos")
    plt.xlabel("Frame")
    plt.ylabel("% of difference")
    plt.scatter(x_axis, y_axis, s=10, c='red')
    plt.show()
    I didn't test this; I just quickly assembled it based on that site's code, added the OpenCV capture, passed each frame pair to a comparison function, and plotted at the end, so there might be mistakes! It is just an idea; I didn't run it, as I don't have skimage.
    Last edited by _Al_; 29th Jan 2023 at 13:35.
  6. Originally Posted by _Al_ View Post
    AviSynth or VapourSynth, but you mentioned Python, so perhaps working with it is no problem. You can use Python with the skimage, opencv and matplotlib modules to get a visual graph output:
    Thank you, I just wanted to know if something could be automated this way. I'll get back to studying then! Thanks!
  7. Structural Similarity (SSIM) is somewhat useful for comparing different encodings of the exact same material. It's typically used to compare a source and (re)encoded video, as a gauge of how well the encoded video matches the source, or which of two (re)encoded videos better matches the source. It's not useful for detecting whether a video has extra or deleted scenes -- except that the SSIM will be very bad past the first addition/deletion.
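    For the curious, the SSIM formula itself is simple. skimage computes it over a sliding window, but a single-window ("global") version over the whole frame can be sketched in plain numpy. This is a simplified illustration of the formula, not a drop-in replacement for skimage's windowed result:

    ```python
    import numpy as np

    def global_ssim(x, y, data_range=255.0):
        # SSIM over one window covering the whole image; skimage slides an
        # 11x11 (default 7x7 for structural_similarity) window and averages.
        x = x.astype(np.float64)
        y = y.astype(np.float64)
        c1 = (0.01 * data_range) ** 2  # standard stabilizing constants
        c2 = (0.03 * data_range) ** 2
        mx, my = x.mean(), y.mean()
        vx, vy = x.var(), y.var()
        cov = ((x - mx) * (y - my)).mean()
        return ((2 * mx * my + c1) * (2 * cov + c2)) / \
               ((mx ** 2 + my ** 2 + c1) * (vx + vy + c2))

    img = np.arange(64, dtype=np.uint8).reshape(8, 8)
    print(round(global_ssim(img, img), 4))  # identical images -> 1.0
    ```

    Identical inputs score exactly 1.0; anything lower indicates luminance, contrast, or structure differences, which is why it works as an encoding-quality gauge but not as a scene-change detector.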
  8. The question is quite vague; I'm not sure what "corrected animation frames" means.
    That plot would just reveal the first jump, where things go wrong. Or he could set a threshold to look for that change and then stop, because if frames are added or deleted, the comparison is not useful past that spot.
    But there would certainly be differences at the beginning of the videos, so some manual sync would be needed. Or maybe that was part of the question: come up with a solution that finds the first pair of sync frames, marks it, then looks for differences; if one is found, marks it and searches for sync again. Repeat. That sounds pretty complicated.
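    The threshold idea above -- stop at the first spot where the streams diverge -- could be sketched like this. The function name and threshold value are made up for illustration; frames are assumed already decoded and the same size:

    ```python
    import numpy as np

    def first_divergence(frames_a, frames_b, threshold=5.0):
        # Return the index of the first frame pair whose mean absolute
        # pixel difference exceeds `threshold`, or None if the streams
        # stay in sync for their common length.
        for i, (a, b) in enumerate(zip(frames_a, frames_b)):
            mad = np.abs(a.astype(np.int16) - b.astype(np.int16)).mean()
            if mad > threshold:
                return i
        return None

    # Toy streams: identical until frame 2, where one is replaced
    same = [np.zeros((4, 4), dtype=np.uint8)] * 3
    changed = same[:2] + [np.full((4, 4), 200, dtype=np.uint8)]
    print(first_divergence(same, changed))  # 2
    ```

    After the first hit you would have to re-sync manually (or search for the next matching pair) before comparing further, which is exactly the complication described above.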
  9. There will also be problems if the two videos have different framing, different frame sizes, bit depths, color formats, etc.

    If you want an off-the-shelf SSIM (or PSNR, VMAF, etc.) file compare utility you can use ffmpeg (command line).

    drag/drop batch file for psnr:
    Code:
    ffmpeg -i %1 -i %2 -lavfi "[0:v][1:v]psnr" -f null -
    drag/drop batch file for ssim:
    Code:
    ffmpeg -i %1 -i %2 -lavfi "[0:v][1:v]ssim" -f null -
    drag/drop batch file for vmaf:
    Code:
    ffmpeg -i %1 -i %2 -filter_complex libvmaf=n_threads=8 -f null -
    Last edited by jagabo; 29th Jan 2023 at 23:38.
  10. Originally Posted by _Al_ View Post
    The question is quite vague; I'm not sure what "corrected animation frames" means.
    That plot would just reveal the first jump, where things go wrong. Or he could set a threshold to look for that change and then stop, because if frames are added or deleted, the comparison is not useful past that spot.
    But there would certainly be differences at the beginning of the videos, so some manual sync would be needed. Or maybe that was part of the question: come up with a solution that finds the first pair of sync frames, marks it, then looks for differences; if one is found, marks it and searches for sync again. Repeat. That sounds pretty complicated.

    When people put out animated series on Blu-ray, some may have frames corrected (derp faces or extra details), because the studio had some extra time.
  11. https://github.com/DeltaFlyerW/AudioMatchVideoCut
    This script compares two videos by their audio hashes and outputs the differences in their timelines. I wrote it to detect timeline displacement caused by censorship cuts in anime in China. In the early stages of development, I attempted to implement it as a frame-by-frame video comparison, but abandoned that due to performance and frame-difference issues.
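    As a rough illustration of the hash-matching idea (not the actual method AudioMatchVideoCut uses): hash fixed-size blocks of decoded audio samples, then find where the two hash sequences stop matching. Exact-byte hashing like this only works if both sources decode to bit-identical audio; real tools use fuzzier acoustic fingerprints:

    ```python
    import hashlib
    import numpy as np

    def chunk_hashes(samples, chunk=1024):
        # Hash fixed-size blocks of raw samples as a coarse fingerprint
        n = len(samples) // chunk
        return [hashlib.md5(samples[i * chunk:(i + 1) * chunk].tobytes()).hexdigest()
                for i in range(n)]

    def find_offset(ref_hashes, cut_hashes):
        # First block index where the cut version stops matching the reference
        for i, (a, b) in enumerate(zip(ref_hashes, cut_hashes)):
            if a != b:
                return i
        return None

    rng = np.random.default_rng(0)
    audio = rng.integers(-32768, 32767, 8192, dtype=np.int16)  # fake PCM track
    cut = np.concatenate([audio[:4096], audio[5120:]])  # one block censored out
    print(find_offset(chunk_hashes(audio), chunk_hashes(cut)))  # 4
    ```

    Once the first mismatch is found, the remaining reference hashes can be searched for where the cut version picks up again, giving the length of the removed segment.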
  12. I doubt you'll be able to reliably automatically detect such differences between streaming and Blu-ray versions of a show. The differences due to the encoding and other processing are likely to be larger than any extra details.
  13. Oh, this is such a cool thing about forums in general: seeing a left hook come out of nowhere. In this case, puzzling over how to find video cuts and then hearing, hey, the audio is the same either way, so perhaps it's easier to check the audio instead...
  14. Originally Posted by jagabo View Post
    I doubt you'll be able to reliably automatically detect such differences between streaming and Blu-ray versions of a show. The differences due to the encoding and other processing are likely to be larger than any extra details.
    I was afraid I'd find this answer here.


