VideoHelp Forum
  1. Hello and Happy Saturday. Some years ago kind people here helped me develop an AviSynth script for encoding my Futurama NTSC DVDs that does about as good a job as possible given the poor quality of the DVDs. The only remaining flaws are inherent in the video:



    And, recently, kind people here helped me transform this script such that it now does a fine job of upscaling the Futurama NTSC DVDs to 720p. Thanks again. While I was preparing to upscale the 72 episodes from seasons 1-4 I found myself bothered by the above-style video flaws, as they appear, albeit briefly, multiple times in each episode. So I did some checking, and it turns out the PAL version of the DVDs doesn't have this issue. Unfortunately, the PAL DVDs are limited to 2.0 192 Kbps audio, whereas the NTSC DVDs have 5.1 audio. Thus my question: is there any reasonable way to combine the PAL video with the NTSC audio? I know that the PAL video is 25 frames per second and the processed NTSC DVDs result in 23.976 frames per second. I understand that I could use an application, such as MeGUI, to speed up the NTSC audio, but the few times I've been forced to do this have resulted in very poorly synced audio. I guess, then, my question is actually: is there any elegant way to input 25 fps PAL video and output 23.976 fps? Thanks for any suggestions.
    [Attached thumbnail: S1.E1-SpacePilot3000[FlawedVideo].jpg]
  2. Member
    Join Date
    Mar 2008
    Location
    United States
    Search Comp PM
    Since you're going to be re-encoding the PAL source, just slow the video down to 23.976 and use the NTSC audio as-is
  3. Originally Posted by davexnet View Post
    Since you're going to be re-encoding the PAL source, just slow the video down to 23.976 and use the NTSC audio as-is
    davexnet: Thanks for your reply. How would you suggest accomplishing this during the encoding process?
  4. Member
    Join Date
    Mar 2008
    Location
    United States
    Search Comp PM
    An Avisynth script, one that simply opens the source and uses AssumeFPS to slow the source video.
    Then you can encode it in any program that accepts the script
    Or you could encode the video directly in Virtualdub2, using video/frame rate/change frame rate to...

    Then, when you have your new video, combine it with the audio using MKVToolNix, Avidemux, etc.
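
    A minimal sketch of that approach (the source name and index file below are hypothetical placeholders, not from this thread):

    Code:
    # Hedged sketch of the AssumeFPS route: open the PAL source and
    # re-flag it as 23.976fps; no frames are added or dropped.
    Mpeg2Source("Futurama[PAL].d2v")  # hypothetical DGIndex project file
    AssumeFPS(24000, 1001)            # 25fps slowed to 23.976fps
    Because every frame is kept, the runtime simply gets longer, and the 23.976-timed NTSC audio can be muxed in unchanged afterwards (assuming both cuts contain the same frames).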
  5. Here's the original upscale script for my NTSC DVDs:

    Code:
    DGINDEX SOURCE INFORMATION HERE
    ### Color Conversion ###
    ColorMatrix(Mode="Rec.601->Rec.709")
    ### Deinterlace-Match Fields-Decimate ###
    AssumeTFF()
    TFM(Chroma=False,PP=0) 
    AssumeBFF()
    Interleave(TFM(Mode=1,PP=0,Field=1),TFM(Mode=1,PP=0,Field=0))
    TFM(Field=0,Clip2=Yadif())
    vInverse()
    SRestore(23.976)
    ### Adjust Color ###
    MergeChroma(aWarpSharp2(Depth=19))
    SmoothTweak(Saturation=1.01)
    ### Crop ###
    Crop(8,0,-8,0)
    ### Gibbs Noise Block ###
    Edge=MT_Edge("prewitt",ThY1=20,ThY2=40).RemoveGrain(17)
    Mask=MT_Logic(Edge.MT_Expand().MT_Expand().MT_Expand().MT_Expand(),Edge.MT_Inflate().MT_Inpand(),"xor").Blur(1.0)
    MT_Merge(Minblur(),Mask,Luma=True)
    ### Overall Temporal Denoise ###
    SMDegrain(TR=1,ThSAD=200,ContraSharp=True,RefineMotion=True,Plane=0,PreFilter=2,Chroma=False,Lsb=True,Lsb_Out=False)
    ### Resize ###
    NNEDI3_RPow2(4,CShift="Spline64Resize",FWidth=960,FHeight=720)
    aWarpSharp2(Depth=5)
    Sharpen(0.2)
    ### Darken-Thin Lines ###
    Dither_Convert_8_To_16()
    F=DitherPost(Mode=-1)
    S=F.FastLineDarkenMod(Strength=24,Prot=6).aWarpSharp2(Blur=4,Type=1,Depth=3,Chroma=2)
    D=MT_MakeDiff(S,F).Dither_Convert_8_To_16()
    Dither_Add16(Last,D,Dif=True,U=2,V=2)
    ### Deband ###
    GradFun3(Radius=16,ThR=0.55,SMode=2,StaticNoise=True,Lsb_In=True,Lsb=True)
    DitherPost()
    And here's a sample from my PAL DVDs: S1.E1-FuturamaSample[PAL]

    Suggestions for a change to the "### Deinterlace-Match Fields-Decimate ###" section that would cleanly decimate the PAL video to 23.976 fps are much appreciated.

    EDIT: I'm much more interested in quality over speed.
    Last edited by LouieChuckyMerry; 14th Apr 2019 at 13:16. Reason: Information. Information...
  6. Is the sample typical or an anomaly?

    Because pretty much all it needs is
    TFM()
    AssumeFPS(24000,1001)

    Except for the glitch around frame 118, but there's blending in both fields, so you either live with some blending for a frame, or you take it out and there'll be a glitch in the motion. Both top and bottom fields are blended for a couple of frames, so I don't think there's any way to extract a clean one.

    I had a look at your script, but it made my head hurt and I had to lie down for a bit.

    So the attached thingy is:

    Code:
    mpeg2source("D:\S1.E1-FuturamaSample[PAL].d2v", cpu=6)
    TFM()
    SomeFilteringCopiedFromAScriptStillOnMyHardDriveSoImNotClaimingItsClever()
    ExtraBandingBeGone()
    AssumeFPS(24000,1001)
    [Attached files]
  7. Originally Posted by hello_hello View Post
    Is the sample typical or an anomaly?

    Because pretty much all it needs is
    TFM()
    AssumeFPS(24000,1001)
    Typical. I took a clip with panning because that tends to look the worst.


    Originally Posted by hello_hello View Post
    Except for the glitch around frame 118, but there's blending in both fields, so you either live with some blending for a frame, or you take it out and there'll be a glitch in the motion. Both top and bottom fields are blended for a couple of frames, so I don't think there's any way to extract a clean one.
    How would one "take it out"? Any way to replace it somehow, like duplicating a "clean" frame next to it? Any idea how the [CC] encode was able to minimize that glitch?: S1.E1-Glitches[PAL]


    Originally Posted by hello_hello View Post
    I had a look at your script, but it made my head hurt and I had to lie down for a bit.
    I hope you're feeling better; you get used to it eventually .


    Originally Posted by hello_hello View Post
    So the attached thingy is:

    Code:
    mpeg2source("D:\S1.E1-FuturamaSample[PAL].d2v", cpu=6)
    TFM()
    SomeFilteringCopiedFromAScriptStillOnMyHardDriveSoImNotClaimingItsClever()
    ExtraBandingBeGone()
    AssumeFPS(24000,1001)
    Would you please share the remainder of your script; mine results in some kind of horizontal artifacts that are very annoying:



    EDIT: I ran the above script but with:

    Code:
    ### Deinterlace-Match Fields-Decimate ###
    QTGMC()
    SRestore(23.976)
    and there's no more glitch. But there are still those horizontal artifacts...
    [Attached thumbnail: S1.E1-Artifacts[TCM()].jpg]
    Last edited by LouieChuckyMerry; 14th Apr 2019 at 17:02. Reason: Clarity
  8. Originally Posted by LouieChuckyMerry View Post
    How would one "take it out"? Any way to replace it somehow, like duplicating a "clean" frame next to it?
    I've never tried to interpolate missing frames, but I think there's a function for it.
    Much of the time, I'd repeat a frame and drop the problem one if possible
    Trim(0,117) + \
    Trim(117,117) + \
    Trim(119,0)
    but there's too much motion and you can see the glitch. Fortunately, there's plenty of motion where the blended frame happens so it's pretty hard to spot.

    You can minimise the blending by bossing TFM around a bit. I think it was something like this:

    A = TFM()
    B = TFM(pp=3, field=0)
    A.Trim(0,117) + \
    B.Trim(118,118) + \
    A.Trim(119,0)

    If I remembered the frame numbers correctly, it'll match and de-interlace in the other direction, and there's less blending, but it adds a slight glitch to the motion.

    Originally Posted by LouieChuckyMerry View Post
    Any idea how the [CC] encode was able to minimize that glitch?: S1.E1-Glitches[PAL]
    I missed the question. See my next post.

    Originally Posted by LouieChuckyMerry View Post
    Would you please share the remainder of your script; mine results in some kind of horizontal artifacts that are very annoying
    Maybe because ColorMatrix is too early in your script? The screenshot is from a section where the fields go out of alignment (from memory) so maybe if you TFM'd before ColorMatrix? Or you could try Interlaced=true, but I always color convert after de-interlacing or field matching.

    My script's a bit embarrassing. It's not very long and it's 8 bit all the way.
    It's probably largely ideas borrowed from jagabo at some stage. Although not so long ago, I discovered GradFun3() and f3kdb() together can work quite well for animation, at least when you're staying in 8 bit mode. I was converting some old Family Guy episodes and GradFun3() got rid of most of the existing banding, but x264 still added some back when encoding. f3kdb() prevented most of that, but I didn't like the look of it. I'm not sure I could explain why exactly. Something about it didn't look natural to me, I guess. Together though....

    DGDecode_mpeg2source("D:\S1.E1-FuturamaSample[PAL].d2v", cpu=6)
    TFM()
    TTempSmooth()
    FastLineDarken(Thinning=0)
    MergeChroma(AwarpSharp(Depth=5), AwarpSharp(Depth=20))
    CropResize(960,720,8,2,-8,-2,InDAR=15.0/11.0,Resizer="Spline36") # CropResize also converts to rec.709
    FastLineDarken(Thinning=0)
    MAA()
    DeHalo_alpha()
    CSMod(strength=150)
    GradFun3(thr=1.0, thrc=1.0)
    f3kdb()
    AssumeFPS(24000,1001)
    Actually.... if you happen to give MAA() a spin, could you try MAA2() in a script with MeGUI? For some reason MAA2() always causes MeGUI to crash when I close the preview. MAA() is fine. I'd be interested to know if it's just my PC. The dither package does the same. I'm using Dither 1.27.1 because the newer versions don't play nice with MeGUI.
    MAA
    MAA2
    Last edited by hello_hello; 18th Apr 2019 at 07:25.
  9. Originally Posted by LouieChuckyMerry View Post
    Chances are if the NTSC version was telecined, it glitched differently. AnimeIVTC has an option for fixing that sort of thing. It's described like this in the help file.

    chrfix : Use to correct chroma swap between fields (to find out, apply bob() on your clip and examine the frames. If at some point the chroma of a frame is in the other and vice-versa, the issue is present).

    I don't know how it works as such, and I've not used the option much myself, but it looks more like some sort of "chroma-swap" in the NTSC version. I don't really know how it gets that way either.
  10. Originally Posted by LouieChuckyMerry View Post
    How would one "take it out"? Any way to replace it somehow, like duplicating a "clean" frame next to it?
    I didn't have much luck with frame interpolation so I used a picture editor (Photofiltre) to first fix frame #118 and AviSynth to stick it back into the video:

    TFM()
    Q=ImageSource("118.bmp",End=118).ConvertToYV12()
    ReplaceFramesSimple(Last,Q,Mappings="118")
    [Attached thumbnails: 118Before.jpg, 118After.jpg]
  11. That looks pretty good. I'll have to check out the Photofiltre program to which you refer.
  12. Originally Posted by LouieChuckyMerry View Post
    EDIT: I ran the above script but with:

    Code:
    ### Deinterlace-Match Fields-Decimate ###
    QTGMC()
    SRestore(23.976)
    and there's no more glitch. But there are still those horizontal artifacts...
    Step through the frames where there's constant movement, and compare it to TFM().AssumeFPS(24000,1001).
    I think you'll find there's frames missing from the Srestore version (a frame per second for 24fps), and I suspect you're lucky it just happened to pick the bad frame to remove, this time.
  13. I agree. With TFM alone and making it 25fps, during that long pan in the middle there are no duplicate frames as there would be if it was supposed to be 23.976fps. So, to get it to 23.976fps you use an AssumeFPS command after TFM and slow the audio to match.
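
    For what it's worth, that suggestion as a script (the source line is a hypothetical placeholder):

    Code:
    # Hedged sketch: field-match the PAL source, then re-flag it as 23.976fps.
    # The NTSC 5.1 track can then be muxed in unchanged, as it's already 23.976-timed.
    Mpeg2Source("Futurama[PAL].d2v")  # hypothetical index file
    TFM()
    AssumeFPS(24000, 1001)
    If you wanted to keep the PAL audio instead, AssumeFPS(24000, 1001, sync_audio=true) would slow the clip's own audio along with the video (by resampling, so the pitch drops slightly).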

    I'll have to check out the Photofiltre program to which you refer.
    Any picture editor can do the job. I use Photofiltre because it's good and free and I've learned how to use it pretty well. But I don't mind recommending it. You load the bad frame as well as the good frames on either side. Then you add pieces from both sides to 'restore' the bad one. In principle it's easy to do, and only takes time.
  14. Originally Posted by hello_hello View Post
    Originally Posted by LouieChuckyMerry View Post
    Would you please share the remainder of your script; mine results in some kind of horizontal artifacts that are very annoying
    Maybe because ColorMatrix is too early in your script? The screenshot is from a section where the fields go out of alignment (from memory) so maybe if you TFM'd before ColorMatrix? Or you could try Interlaced=true, but I always color convert after de-interlacing or field matching.
    After my last post (see the EDIT) I ran a test with:

    Code:
    ### Deinterlace-Match Fields-Decimate ###
    QTGMC()
    SRestore(23.976)
    and the rest of the script the same, and the results were very interesting; the horizontal artifacts--please, what are they called--were still there but the single bad frame was gone. Then I ran the same script but with the deinterlacing before the color conversion, and the horizontal artifacts were gone but the single bad frame was back. This I don't understand .


    Originally Posted by hello_hello View Post
    My script's a bit embarrassing. It's not very long and it's 8 bit all the way.
    It's probably largely ideas borrowed from jagabo at some stage. Although not so long ago, I discovered GradFun3() and f3kdb() together can work quite well for animation, at least when you're staying in 8 bit mode. I was converting some old Family Guy episodes and GradFun3() got rid of most of the existing banding, but x264 still added some back when encoding. f3kdb() prevented most of that, but I didn't like the look of it. I'm not sure I could explain why exactly. Something about it didn't look natural to me, I guess. Together though....
    DGDecode_mpeg2source("D:\S1.E1-FuturamaSample[PAL].d2v", cpu=6)
    TFM()
    TTempSmooth()
    FastLineDarken(Thinning=0)
    MergeChroma(AwarpSharp(Depth=5),AwarpSharp(Depth=20))
    CropResize(960,720,8,2,-8,-2,InDAR=15.0/11.0,Resizer="Spline36") # CropResize also converts to rec.709
    FastLineDarken(Thinning=0)
    MAA()
    DeHalo_alpha()
    CSMod(strength=150)
    GradFun3(thr=1.0, thrc=1.0)
    f3kdb()
    AssumeFPS(24000,1001)
    Why do you "repeat" the aWarpSharp call?


    Originally Posted by hello_hello View Post
    Actually.... if you happen to give MAA() a spin, could you try MAA2() in a script with MeGUI? For some reason MAA2() always causes MeGUI to crash when I close the preview. MAA() is fine. I'd be interested to know if it's just my PC. The dither package does the same. I'm using Dither 1.27.1 because the newer versions don't play nice with MeGUI.
    MAA
    MAA2
    I'll run a test after I post this reply if I'm still capable.


    Originally Posted by manono View Post
    Originally Posted by LouieChuckyMerry View Post
    How would one "take it out"? Any way to replace it somehow, like duplicating a "clean" frame next to it?
    I didn't have much luck with frame interpolation so I used a picture editor (Photofiltre) to first fix frame #118 and AviSynth to stick it back into the video:

    TFM()
    Q=ImageSource("118.bmp",End=118).ConvertToYV12()
    ReplaceFramesSimple(Last,Q,Mappings="118")
    That looks really good, but I doubt if even Matt Groening himself would sift through 72 episodes of Futurama and find all the frames that need fixing .
  15. Originally Posted by hello_hello View Post
    Originally Posted by LouieChuckyMerry View Post
    EDIT: I ran the above script but with:

    Code:
    ### Deinterlace-Match Fields-Decimate ###
    QTGMC()
    SRestore(23.976)
    and there's no more glitch. But there are still those horizontal artifacts...
    Step through the frames where there's constant movement, and compare it to TFM().AssumeFPS(24000,1001).
    I think you'll find there's frames missing from the Srestore version (a frame per second for 24fps), and I suspect you're lucky it just happened to pick the bad frame to remove, this time.
    Yes, I'm slow. I ran a test a shortish while ago that output the first 2m32s of season one, episode one of Futurama with the QTGMC section for deinterlacing and the rest of the script per the original. Checking frame-by-frame (death by drowning) the results were the best yet. Honestly, I didn't see even one glitchy frame. The results with QTGMC and AssumeFPS were similar but not as good. A small sample size but promising.
  16. A fading thought before drifting off?:

    Code:
    QTGMC()
    AssumeFPS(24000,1001)
  17. Originally Posted by LouieChuckyMerry View Post
    A fading thought before drifting off?:

    Code:
    QTGMC()
    AssumeFPS(24000,1001)
    That's very good. Especially if you want to play all your Futurama encodes in super slo-mo, so you can enjoy each episode for much longer.
  18. Originally Posted by manono View Post
    Originally Posted by LouieChuckyMerry View Post
    A fading thought before drifting off?:

    Code:
    QTGMC()
    AssumeFPS(24000,1001)
    That's very good. Especially if you want to play all your Futurama encodes in super slo-mo, so you can enjoy each episode for much longer.
    What's up with that? I let a single episode run overnight and MediaInfo shows 23.976 fps but the time of the episode seems to have about doubled and it does play in slow motion.
  19. Originally Posted by LouieChuckyMerry View Post
    After my last post (see the EDIT) I ran a test with:

    Code:
    ### Deinterlace-Match Fields-Decimate ###
    QTGMC()
    SRestore(23.976)
    and the rest of the script the same, and the results were very interesting; the horizontal artifacts--please, what are they called--were still there but the single bad frame was gone. Then I ran the same script but with the deinterlacing before the color conversion, and the horizontal artifacts were gone but the single bad frame was back. This I don't understand .
    SRestore normally looks for blended frames and tries to keep the non-blended ones, but for a field blended source the blending usually happens in a fairly consistent pattern.
    Your sample isn't field blended (apart from a couple of frames), so to reduce the frame rate SRestore needs to delete "some" frames. For a non-blended source, duplicate frames would probably be removed in preference to non-duplicates, but there's no blending pattern as such, which is why I said you probably got lucky the first time. There'd have to be some sort of glitch in the motion where the blended fields are, because there are no clean fields to use.

    Originally Posted by LouieChuckyMerry View Post
    Why do you "repeat" the aWarpSharp call?
    Probably because jagabo told me to

    Because they're used inside MergeChroma, the first instance is only sharpening the luma (it's sharpening luma and chroma, but MergeChroma takes the chroma from the second instance), so by using only mild sharpening it's not doing horrible things to lines. The second instance can go to town on the chroma.

    Screenshot 1 is
    FastLineDarken(Thinning=0)
    Spline36Resize(1440,1080) # the resizing is only to make the differences easier to see.

    Screenshot 2 is
    FastLineDarken(Thinning=0)
    AwarpSharp(Depth=20)
    Spline36Resize(1440,1080)

    Screenshot 3 is
    FastLineDarken(Thinning=0)
    MergeChroma(AwarpSharp(Depth=5), AwarpSharp(Depth=20))
    Spline36Resize(1440,1080)

    The "good" in screenshot 3 is the chroma meets the lines (or gets much closer) without messing with them too much. You can see it in the orange in her hair, or the green, which doesn't bleed onto her arm as much.
    The "bad" is maybe the chroma sharpening is a little over-done. The line on the left side of Peter's face looks a bit orange in screenshot 3. Depth=10 might have been enough. Still, I'd rather live with that. Everything in video filtering seems to be some sort of a compromise.

    I assume MergeChroma(aWarpSharp2(Depth=19)) in your script is doing much the same thing, except the luma is being taken from the source clip after de-interlacing or IVTC, rather than a clip with AwarpSharp(Depth=5) applied. Depth=5 is pretty mild, so the difference wouldn't be huge.
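
    For anyone following along, MergeChroma's behaviour in that construct can be sketched like this (the clip name is just illustrative):

    Code:
    # MergeChroma(c1, c2) keeps the luma of c1 and takes the chroma from c2,
    # so c1 can be sharpened mildly (protecting lines) while c2 is sharpened hard.
    src = last
    MergeChroma(src.AwarpSharp(Depth=5), src.AwarpSharp(Depth=20))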
    [Attached thumbnails: 1.png, 2.png, 3.png]

    Last edited by hello_hello; 15th Apr 2019 at 09:01.
  20. Originally Posted by LouieChuckyMerry View Post
    What's up with that? I let a single episode run overnight and MediaInfo shows 23.976 fps but the time of the episode seems to have about doubled and it does play in slow motion.
    You should have used
    QTGMC(FPSDivisor=2)
    AssumeFPS(24000,1001)

    AssumeFPS just changes the frame rate without changing the number of frames, so without FPSDivisor=2 you slowed a 50fps clip down to 23.976fps.
  21. Originally Posted by hello_hello View Post
    Originally Posted by LouieChuckyMerry View Post
    After my last post (see the EDIT) I ran a test with:

    Code:
    ### Deinterlace-Match Fields-Decimate ###
    QTGMC()
    SRestore(23.976)
    and the rest of the script the same, and the results were very interesting; the horizontal artifacts--please, what are they called--were still there but the single bad frame was gone. Then I ran the same script but with the deinterlacing before the color conversion, and the horizontal artifacts were gone but the single bad frame was back. This I don't understand .
    SRestore normally looks for blended frames and tries to keep the non-blended ones, but for a field blended source the blending usually happens in a fairly consistent pattern.
    Your sample isn't field blended (apart from a couple of frames), so to reduce the frame rate SRestore needs to delete "some" frames. For a non-blended source, duplicate frames would probably be removed in preference to non-duplicates, but there's no blending pattern as such, which is why I said you probably got lucky the first time. There'd have to be some sort of glitch in the motion where the blended fields are, because there are no clean fields to use.
    Yep, more tests and AssumeFPS(24000,1001) certainly trumps SRestore(23.976). Thanks (and manono, too).


    Originally Posted by hello_hello View Post
    Originally Posted by LouieChuckyMerry View Post
    Why do you "repeat" the aWarpSharp call?
    Probably because jagabo told me to
    So so true .


    Originally Posted by hello_hello View Post
    Because they're used inside MergeChroma, the first instance is only sharpening the luma (it's sharpening luma and chroma, but MergeChroma takes the chroma from the second instance), so by using only mild sharpening it's not doing horrible things to lines. The second instance can go to town on the chroma.
    Thank you for the clear explanation.


    Originally Posted by hello_hello View Post
    Originally Posted by LouieChuckyMerry View Post
    What's up with that? I let a single episode run overnight and MediaInfo shows 23.976 fps but the time of the episode seems to have about doubled and it does play in slow motion.
    You should have used
    QTGMC(FPSDivisor=2)
    AssumeFPS(24000,1001)

    AssumeFPS just changes the frame rate without changing the number of frames, so without FPSDivisor=2 you slowed a 50fps clip down to 23.976fps.
    Thanks for the reminder. I tinkered with QTGMC way back when, but that initial testing left me preferring SMDegrain and I've not used QTGMC since.
    Last edited by LouieChuckyMerry; 16th Apr 2019 at 22:41.
  22. Given the earlier kind advice from hello_hello and manono to focus on TFM().AssumeFPS(24000,1001), I've run many more tests, the results of which, at least to me, are really interesting. Tinkering with TFM's "Mode" and "PP" settings, I've found that "Mode=5" combined with "PP=3", "PP=4", or "PP=6" (the default) by far renders the best results for this source. The differences are subtle, but mostly can be seen in (I think it's called) combing around the mouths of characters as they speak. The biggest discovery, which I stumbled upon, was to set "UBSCO=False"; this made everything markedly better, especially scene changes. Ahhh, all of this is based on a longer sample clip, 45s as opposed to 10s, which encompasses the original sample but adds more action and pans. It's here: S1.E1-FuturamaSample[PAL][Extended]
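
    For reference, the settings described above amount to something like this (the source line is a hypothetical placeholder):

    Code:
    # Hedged sketch of the settings found by testing; pp=6 is TFM's default post-processor.
    Mpeg2Source("Futurama[PAL].d2v")  # hypothetical index file
    TFM(mode=5, pp=6, ubsco=false)
    AssumeFPS(24000, 1001)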
  23. Of course if you want to use the NTSC audio, you've still got to drop it in and hope it syncs up. The chances of that aren't huge. Often they're edited slightly differently, sometimes it's an extra frame at the end of a scene here.... one less frame there.... and before you know it you'll be editing and re-encoding the audio.

    Have you tried mode=7?
    UBSCO=False does seem to improve the matches occasionally, but I think sometimes mode=7 does better than mode=5.
    It needs linear access, so you have to play the frames in order for it to work properly, but here's a couple of examples.

    The first three screenshots are: TFM(mode=5, pp=3, UBSCO=False)
    The next three screenshots are: TFM(mode=7, pp=3, UBSCO=False)

    Once again it's something of a compromise. Where mode=7 does find a clean frame, sometimes there's a little "glitch" in the motion instead of blending. You'd have to decide for yourself which you dislike the most. I could live with the blending where there's a lot of motion as it goes by quickly and doesn't stand out.
    [Attached thumbnails: 1.jpg to 6.jpg]
    Last edited by hello_hello; 17th Apr 2019 at 02:02.
  24. Originally Posted by hello_hello View Post
    Have you tried mode=7?
    UBSCO=False does seem to improve the matches occasionally, but I think sometimes mode=7 does better than mode=5.
    It needs linear access, so you have to play the frames in order for it to work properly, but here's a couple of examples.

    The first three screenshots are: TFM(mode=5, pp=3, UBSCO=False)
    The next three screenshots are: TFM(mode=7, pp=3, UBSCO=False)

    Once again it's something of a compromise. Where mode=7 does find a clean frame, sometimes there's a little "glitch" in the motion instead of blending. You'd have to decide for yourself which you dislike the most. I could live with the blending where there's a lot of motion as it goes by quickly and doesn't stand out.
    Given that the TIVTC/TFM AviSynth Wiki states (emphasis mine) "Mode 7 is not one of the normal modes and is specifically for material with blended fields that follows a specific pattern." I didn't try Mode=7. Silly me for not understanding what "specific pattern" means. Anyway, a couple of quick tests with Mode=7 are very promising--a single bad frame passes faster than several--so I'll run some longer tests with various PP's and compare them to my Mode=5 tests. Many thanks for the suggestion.


    Originally Posted by hello_hello View Post
    Of course if you want to use the NTSC audio, you've still got to drop it in and hope it syncs up. The chances of that aren't huge. Often they're edited slightly differently, sometimes it's an extra frame at the end of a scene here.... one less frame there.... and before you know it you'll be editing and re-encoding the audio.
    I'm planning to encode the entire first episode overnight then check the audio sync tomorrow. Any idea how to upmix 192 Kbps two-channel .ac3 audio to 768 Kbps six-channel DTS? Seriously, if it comes to it I'd be inclined to live with the PAL audio given how much better the video looks compared to the NTSC.
    Last edited by LouieChuckyMerry; 17th Apr 2019 at 08:55. Reason: Correction
  25. For your first sample, Mode=7 didn't help. Trying it on your second sample was just an experiment. I was surprised myself that it helped.

    Because surround sound sucks and blows at the same time, I've never tried to upmix audio.
    MeGUI has a couple of upmixing options in its audio encoder configuration. I think they use SoX to upmix. I don't know if there are alternative/better/free methods.
  26. Something you might consider, assuming the blending problem only occurs in a few places, is to create two versions of the video. I used my previous script below, except "B" is the Mode=7 clip and "C" is Mode=5.
    Having to give Mode=7 linear access makes it harder, but you could run a quick encode with Mode=7, with none of the other filtering, then check the blended spots for problem frames. If you see any bad frames you could make note of the frame numbers, then switch the script to "C" and check if Mode=5 did a better job. If so, you could use Trim() to replace those frames with the frames from the mode=5 clip, and then run the final encode with filtering.

    Anyone know if there's a function for checking for blended frames that can provide a list of frame numbers? That'd be better, not having to check the Mode=7 version of the clip manually. Anyway, it's just a thought...

    Code:
    A = last
    B = A.TFM(mode=7, pp=3, UBSCO=False)
    C = A.TFM(mode=5, pp=3, UBSCO=False)
    
    B.Trim(0,0)
    # C.Trim(0,0)
    
    TTempSmooth()
    FastLineDarken(Thinning=0)
    MergeChroma(AwarpSharp(Depth=5), AwarpSharp(Depth=20))
    CropResize(960,720,8,2,-8,-2,InDAR=15.0/11.0,Resizer="Spline36")
    FastLineDarken(Thinning=0)
    MAA()
    DeHalo_alpha()
    CSMod(strength=150)
    GradFun3(thr=1.0, thrc=1.0)
    f3kdb()
    AssumeFPS(24000,1001)
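
    I don't know of a ready-made blend lister, but AviSynth's runtime filters can at least narrow down where to look. A rough sketch (the threshold is a placeholder you'd tune per source, and strictly speaking it only flags high-motion spots to inspect, not blends as such):

    Code:
    # Hedged sketch: log frame numbers where the luma change from the previous
    # frame is large, since the blends in this source tend to sit in fast motion.
    # "20" is a guessed threshold, not a tested value.
    TFM(mode=7, pp=3, ubsco=false)
    WriteFileIf(last, "suspects.txt", "YDifferenceFromPrevious() > 20", "current_frame", append=false)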
    Last edited by hello_hello; 18th Apr 2019 at 07:24.
  27. Originally Posted by hello_hello View Post
    Because surround sound sucks and blows at the same time, I've never tried to upmix audio. MeGUI has a couple of upmixing options in its audio encoder configuration. I think they use SoX to upmix. I don't know if there are alternative/better/free methods.
    My PAL Video + NTSC Audio fiddling proved way too annoying to actually do for 72 episodes. "AssumeFPS(23.976)" seemed to work fine, but the final audio and video lengths were off by five seconds, which would be simple enough to solve with MKVToolNix's "Stretch by:" feature but for the ad breaks being different. So, the audio starts in sync and remains in sync until the first ad break then goes out of sync. Basically, I'd need to split things into three parts--first X minutes, middle X minutes (between the two ad breaks), final X minutes--then sync each of these segments and put them back together. I realllllly like Futurama, but I'll stick with the stereo audio for now and look into upmixing later if I get the proverbial corn. Thanks for the MeGUI-SoX info.
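
    If I ever revisit it, the per-segment re-sync could probably be done in AviSynth rather than by splitting files; all the frame numbers and delays below are made-up placeholders I'd have to find by eye and ear:

    Code:
    # Hedged sketch of syncing the NTSC audio per ad-break segment.
    v = last                                 # the slowed 23.976fps PAL video
    a = WavSource("ntsc_audio.wav")          # hypothetical decoded NTSC track
    full = AudioDub(v, a)
    seg1 = full.Trim(0, 12000)                       # up to the first ad break
    seg2 = full.Trim(12001, 24000).DelayAudio(0.20)  # nudge audio after break 1
    seg3 = full.Trim(24001, 0).DelayAudio(0.45)      # nudge audio after break 2
    seg1 ++ seg2 ++ seg3                             # splice; audio is cut/padded to fit video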


    Originally Posted by hello_hello View Post
    For your first sample, Mode=7 didn't help. Trying it on your second sample was just an experiment; I was surprised myself that it helped... Something you might consider, assuming the blending problem only occurs in a few places, is to create two versions of the video. I used my previous script below, except "B" is the Mode=7 clip and "C" is Mode=5.
    Having to give Mode=7 linear access makes it harder, but you could run a quick encode with Mode=7, with none of the other filtering, then check the blended spots for problem frames. If you see any bad frames you could make note of the frame numbers, then switch the script to "C" and check if Mode=5 did a better job. If so, you could use Trim() to replace those frames with the frames from the mode=5 clip, and then run the final encode with filtering.

    Anyone know if there's a function for checking for blended frames that can provide a list of frame numbers? That'd be better than having to check the Mode=7 version of the clip manually. Anyway, it's just a thought...

    Code:
    A = last
    B = A.TFM(mode=7, pp=3, UBSCO=False)
    C = A.TFM(mode=5, pp=3, UBSCO=False)
    
    B.Trim(0,0)
    # C.Trim(0,0)
    
    TTempSmooth()
    FastLineDarken(Thinning=0)
    MergeChroma(AwarpSharp(Depth=5), AwarpSharp(Depth=20))
    CropResize(960,720,8,2,-8,-2,InDAR=15.0/11.0,Resizer="Spline36")
    FastLineDarken(Thinning=0)
    MAA()
    DeHalo_alpha()
    CSMod(strength=150)
    GradFun3(thr=1.0, thrc=1.0)
    f3kdb()
    AssumeFPS(24000,1001)
    All my testing led to "TFM(Mode=7, PP=6 (the default), UBSCO=False)" being the way to go with regard to speed and quality. I found no differences between "PP=3", "PP=4", and "PP=6" other than encoding speed; that isn't to state unequivocally that there's no difference in quality, only that I didn't see any over the course of my testing. And "Mode=7" left noticeably fewer glitches than "Mode=5": in a place where "Mode=5" might output three consecutive bad frames, "Mode=7" would typically output only a single bad frame. Having encoded the first dozen episodes and poked through them enough to check the audio sync, I spotted a few places with double-digit runs of bad frames--a quick burst of action with a dozen blended frames, or a character's mouth with visible combing for a second--but overall the results are much, much better than with my NTSC DVDs. If these spots prove unbearable, I'll use a combination of AviSynth's Trim, Avidemux, and MKVMerge to cut out the offending bits, fix the glitches, and put things back together.

    I was also, finally, able to give your script a look; it took a while to gather all the dependencies. Coincidentally, CropResize was the only thing I couldn't get to work--something about no function named 'Resize8' despite the presence of AutoCrop.dll--so I went with The jagabo Upscaling Method© (jUM©). With regard to speed, your script and mine were within a hundredth of a frame per second, but I prefer the results of mine: S1.E1-FuturamaScreenshotComparison[h_h&LCM]. Of course, I'm always open to suggestions for improvement. As of now the script is:

    Code:
    SOURCE INFORMATION HERE
    ### Deinterlace-Match Fields-Decimate ###
    TFM(Mode=7,UBSCO=False)
    ### Color Conversion ###
    ColorMatrix(Mode="Rec.601->Rec.709")
    ### Adjust Color ###
    MergeChroma(aWarpSharp2(Depth=5),aWarpSharp2(Depth=19))
    SmoothTweak(Saturation=1.01)
    ### Crop ###
    Crop(8,0,-8,0)
    ### Gibbs Noise Block ###
    Edge=MT_Edge("prewitt",ThY1=20,ThY2=40).RemoveGrain(17)
    Mask=MT_Logic(Edge.MT_Expand().MT_Expand().MT_Expand().MT_Expand(),Edge.MT_Inflate().MT_Inpand(),"xor").Blur(1.0)
    MT_Merge(Minblur(),Mask,Luma=True)
    ### Overall Temporal Denoise ###
    SMDegrain(TR=2,ThSAD=400,ContraSharp=True,RefineMotion=True,Plane=0,PreFilter=2,Chroma=False,Lsb=True,Lsb_Out=False)
    ### Resize ###
    NNEDI3_RPow2(4,CShift="Spline64Resize",FWidth=960,FHeight=720)
    aWarpSharp2(Depth=5)
    Sharpen(0.2)
    ### Darken-Thin Lines ###
    Dither_Convert_8_To_16()
    F=DitherPost(Mode=-1)
    S=F.FastLineDarkenMod(Strength=24,Prot=6).aWarpSharp2(Blur=4,Type=1,Depth=3,Chroma=2)
    D=MT_MakeDiff(S,F).Dither_Convert_8_To_16()
    Dither_Add16(Last,D,Dif=True,U=2,V=2)
    ### Deband ###
    GradFun3(Radius=16,ThR=0.55,SMode=2,StaticNoise=True,Lsb_In=True,Lsb=True)
    ## Trim()
    # SelectRangeEvery(1000,66)
    DitherPost()
    Last edited by LouieChuckyMerry; 22nd Apr 2019 at 11:48. Reason: Continuity
    Happy Friday. Upon accessing an HDTV I learned that the above script could be quite a bit better:

    1) The Adjust Color block had a couple of problems. The MergeChroma line looked nicer (at least to me) with a single aWarpSharp2 call--less blurry, more detailed--and the SmoothTweak line was overkill, I think because of the addition of the Color Conversion line.

    2) I lowered the strength of SMDegrain to "TR=2,ThSAD=200".

    3) After much testing I learned that, in conjunction with the change to the MergeChroma line, the Darken-Thin Lines block looked noticeably better with "Depth=9". Everything--background and foreground--seemed clearer, and the edges were very much cleaner.

    There's always room for improvement, but the following looks pretty good to me and, as of now, is The Final Futurama Upscaling Script:

    Code:
    SOURCE INFORMATION HERE
    ### Deinterlace-Match Fields-Decimate ###
    TFM(Mode=7,UBSCO=False)
    ### Color Conversion ###
    ColorMatrix(Mode="Rec.601->Rec.709")
    ### Adjust Color ###
    MergeChroma(aWarpSharp2(Depth=16))
    ### Crop ###
    Crop(8,0,-8,0)
    ### Gibbs Noise Block ###
    Edge=MT_Edge("prewitt",ThY1=20,ThY2=40).RemoveGrain(17)
    Mask=MT_Logic(Edge.MT_Expand().MT_Expand().MT_Expand().MT_Expand(),Edge.MT_Inflate().MT_Inpand(),"xor").Blur(1.0)
    MT_Merge(Minblur(),Mask,Luma=True)
    ### Overall Temporal Denoise ###
    SMDegrain(TR=2,ThSAD=200,ContraSharp=True,RefineMotion=True,Plane=0,PreFilter=2,Chroma=False,Lsb=True,Lsb_Out=False)
    ### Resize ###
    NNEDI3_RPow2(4,CShift="Spline64Resize",FWidth=960,FHeight=720)
    aWarpSharp2(Depth=5)
    Sharpen(0.2)
    ### Darken-Thin Lines ###
    Dither_Convert_8_To_16()
    F=DitherPost(Mode=-1)
    S=F.FastLineDarkenMod(Strength=24,Prot=6).aWarpSharp2(Blur=4,Type=1,Depth=9,Chroma=2)
    D=MT_MakeDiff(S,F).Dither_Convert_8_To_16()
    Dither_Add16(Last,D,Dif=True,U=2,V=2)
    ### Deband ###
    GradFun3(Radius=16,ThR=0.55,SMode=2,StaticNoise=True,Lsb_In=True,Lsb=True)
    ## Trim()
    # SelectRangeEvery(1000,66)
    DitherPost()
    Thanks again to all who helped.
    Last edited by LouieChuckyMerry; 3rd May 2019 at 08:08. Reason: Clarity
  29. Ahhhh... two threads. I thought I was losing my mind....

    I had a look, gave it serious consideration, weighed the pros and cons, and after extensive deliberating and soul searching, I decided you're wrong about Depth=9. I'm not saying it looks bad, but Depth=5 is truer to the original.

    By the way, I'm pretty sure failure to crop the half lines top and bottom will result in the video police arresting you for reckless encoding.

    And it's probably a nit-pick, but I doubt it'd hurt to turn up the debanding, just a bit. I don't know if re-encoding will increase the banding again, but looking at your script output, there's a few places where it's right on the edge... so to speak.

    A few screenshots. If you're wanting to be true to the original, I think my simple script is closer, but each to their own. I took the screenshots with MPC-HC running each full screen to make it fair. Saving them as jpg helped emphasise what I meant about the colour banding though... if you look closely, it's starting to appear again in screenshot #1 (which is from your script), courtesy of the jpg compression.

    #2 is from my script and #3 is the DVD video

    PS. My CropResize function only requires AutoCrop.dll when auto-cropping is specifically enabled, unless you downloaded a very early version of the script.
    Image Attached Thumbnails: you.jpg (#1), me.jpg (#2), DVD.jpg (#3)

    Last edited by hello_hello; 3rd May 2019 at 16:41.
    hello_hello: Thanks for your feedback; I always appreciate it.


    Originally Posted by hello_hello View Post
    Ahhhh... two threads. I thought I was losing my mind....
    Please don't blame me.


    Originally Posted by hello_hello View Post
    I had a look, gave it serious consideration, weighed the pros and cons, and after extensive deliberating and soul searching, I decided you're wrong about Depth=9. I'm not saying it looks bad, but Depth=5 is truer to the original.
    I completely agree that "Depth=5" is truer to the original; however, my objective is to improve on the original. "Depth=9" looked way better to me on an HDTV--I found the fat black lines a bit distracting--but since I have to start the encoding process over again anyway because of The Debanding Issue you kindly pointed out, I'm willing to go with "Depth=8".


    Originally Posted by hello_hello View Post
    By the way, I'm pretty sure failure to crop the half lines top and bottom will result in the video police arresting you for reckless encoding.
    Are the video police the same as the dream police? Bun E Carlos IV is not happy about this...


    Originally Posted by hello_hello View Post
    And it's probably a nit-pick, but I doubt it'd hurt to turn up the debanding, just a bit. I don't know if re-encoding will increase the banding again, but looking at your script output, there's a few places where it's right on the edge... so to speak.
    That's not a nit-pick, and zero argument here; I was so focused on the lines that I forgot about the banding. How would I "turn up the debanding just a bit" in my script?
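    One hedged possibility, for reference (the exact value is a guess to be tuned by eye): GradFun3's ThR parameter is its debanding threshold, so nudging it up from the current 0.55 should strengthen the effect slightly, e.g.:

    Code:
    # Stronger debanding: raise ThR a little; 0.7 is only a starting guess
    GradFun3(Radius=16,ThR=0.7,SMode=2,StaticNoise=True,Lsb_In=True,Lsb=True)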


    Originally Posted by hello_hello View Post
    A few screenshots. If you're wanting to be true to the original, I think my simple script is closer, but each to their own. I took the screenshots with MPC-HC running each full screen to make it fair. Saving them as jpg helped emphasise what I meant about the colour banding though.... if you look closely, it's starting to appear again in screenshot #1, courtesy of the jpg compression, which is from your script.
    Honestly, I think a combination of the scripts would be great: your trueness to the original with my lesser blurriness. And what is "colour"? Do you mean "color"?


    Originally Posted by hello_hello View Post
    PS. My CropResize function only requires AutoCrop.dll when auto-cropping is specifically enabled, unless you downloaded a very early version of the script.
    As far as I know--which isn't very far, to be fair--I have the newest version of CropResize. The error message was something about Resize_8 being an unknown function (the specific error message is in an earlier post).


