VideoHelp Forum
  1. Edit: Despite the title, I am not sure this is an NTSC->PAL conversion, given the origin of the animation. Either way, it's PAL, the fields are blended, and there are ghosting artifacts.
    Another noob's help request
    First up: some clips: https://drive.google.com/drive/folders/1g1eKGFHWsokzTmcdkcNGbG8UUjyGWPM5?usp=sharing

    The show is "Skyland", it first aired in Canada but I watched it on TV in Australia. The DVD is Australian (distributed by Madman Entertainment).
    It's fully interlaced, 25fps, 720x576i.

    Here are some raw fields:

    [Attached: three raw-field screenshots (HLVIJp0.png, k0ZUx6X.png, 6vge7xy.png)]


    I'm not the most experienced and was wondering what I could do beyond QTGMC + srestore (and what the best settings for those two filters are in this case). If someone could take a look, I'd appreciate it. The blending is pretty bad, and I know NTSC->PAL conversions ditch 1 in 6 frames (it's not NTSC->PAL, though), so I can understand if it's not really rescuable. Thanks in advance; I know this stuff isn't simple, and kudos to the people who frequently help out on this forum.
    Last edited by nacho; 12th Jun 2020 at 01:44.
  2. When it comes to PAL sources, I typically use QTGMC and Srestore.

    These QTGMC settings will keep a lot more detail than the default settings.

    You can make it 23.976fps:
    Code:
    QTGMC(preset="slow", matchpreset="slow", matchpreset2="slow", sourcematch=3, tr1=2, tr2=1, NoiseTR=2, sharpness=0.1)
    srestore(frate=23.976, speed=1)
    lanczosresize(852,480)
    If that framerate introduces jittering when the entire screen is panning/moving in a direction, then use 25fps like the source material. The 23.976 version looks like it cleaned things up a little more, though.

    Code:
    QTGMC(preset="slow", matchpreset="slow", matchpreset2="slow", sourcematch=3, tr1=2, tr2=1, NoiseTR=2, sharpness=0.1)
    srestore(frate=25, speed=1)
    lanczosresize(852,480)

    I honestly have no idea how to get rid of the rest of all that leftover stuff. There are just too many frames in a row with that garbage in them. Even without deinterlacing, I can see it. Deinterlacing and srestore removing blended frames is not going to catch all of it on its own. QTGMC might have a setting that can help with those a bit more, but it will most likely be destructive to other things like detail, since it would be acting as an artifact cleaner.
    Last edited by killerteengohan; 11th Jun 2020 at 03:51.
  3. Yeah, you're right, sadly this source was kind of ruined with all those blends. Why did you say to resize to 852x480 (480p) and not 1024x576, given the source has 576 lines? Also, why 852 and not 854?
  4. Because that aspect ratio in NTSC is 853x480. I usually crop two from each side to get 849x480 and then resize to 848x480 for a MOD16 resolution. Since it's PAL and I didn't recall the resolution PAL uses for that aspect ratio, I just used the NTSC figure for the example, due to being in a hurry and not remembering the PAL one. The source was most likely animated at 480p anyway and put on DVD as a PAL 576p upscale. It depends on how old the series is and where it originated from. You don't have to use 480p resolution if you don't want to.

    I chose 852 instead of 854 because 852 is a MOD4 resolution, and anything less than that is not the best choice for playback on devices, especially if subtitles are going to be used. 854 is a MOD2 resolution and can give playback issues on some devices. You don't really wanna go below MOD4 if you can help it. I've had video turn green in media players from using MOD2 with subtitles: it worked fine until subtitles were turned on, then it went green. It might not be as big of an issue as it used to be with more modern players, but for compatibility I would stick with MOD4, MOD8, or MOD16 resolutions. You can use MOD2 if you want, it will still play, but there's the possibility of some issues; MOD4+ will give you fewer possible issues in the long run. If MOD2 plays back fine on the equipment you're watching it on, then by all means use it.

    1024x576 is the correct PAL resolution to use if you want it to be 576p. It's a MOD16 resolution, and should give you no trouble at all with playback, even on older devices.
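
    As a quick arithmetic check, snapping a width down to the nearest MOD16 value is just modulo rounding (Python, purely for illustration):
    Code:
    def mod16(width):
        # round a width down to the nearest multiple of 16
        return width - (width % 16)

    print(mod16(854))    # 848
    print(mod16(1024))   # 1024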
    Last edited by killerteengohan; 11th Jun 2020 at 06:02.
  5. I'm not sure about the source. Wikipedia says it was "developed in France in partnership with Canada and Luxembourg" and "The show is a co-production between Paris's Method Films and Toronto's 9 Story Entertainment", so a collaboration between Canadian and French studios.

    It was aired on:
    France 2 - France
    ABC - Australia
    CITV - Britain
    Which are channels in PAL regions

    But also:
    Teletoon - Canada
    Nicktoons - USA
    Which are channels in NTSC regions

    I've searched and I don't believe a Region 1 DVD even exists. There are French DVDs: here and here.
    But it's hard to know if they would be different from the Australian one. Do you have any thoughts on whether they'd be any better?

    Edit: Back to the processing: should it be restored to 23.976 or to 29.97, and how would I determine that? Also, srestore has parameters which I don't really understand. Would changing omode benefit this? I'm not sure what a double-blend is or what it looks like. I also read that with blends like this it might just be better to watch it at 50fps since, in theory, the blends are visible for less time.
    Last edited by nacho; 11th Jun 2020 at 10:31.
  6. You do not make PAL 25fps into 29.97fps. You either leave it at 25fps or decimate to 23.976fps. If you want to make it 29.97fps, you will have to add in more blended frames or duplicates, making it look even worse when watching. It may even become jerky as a result.

    You can determine which to use by the show's origin and by what you visibly see.
    If it was created and originally animated for a PAL region at 25fps, then 25fps is most likely what it is, and you will usually have no ghosting/blending in it.
    If it was created and originally animated for an NTSC region and you see blending/ghosting at 25fps, then you decimate out the blended frames that were added to make it 25fps for PAL.
    If decimating it to 23.976fps makes the screen jittery when the whole screen pans in a direction, then leave it at 25fps. (This one will usually give you your answer after you see how srestore works on it; see the quick comparison sketch below.)
    If it's 25fps and has no blending/ghosting whatsoever, then leave it alone at 25fps.
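
    In VapourSynth terms, that check is just running srestore both ways on the same deinterlaced clip and previewing the two results (a sketch; 'deint' stands for a double-rate QTGMC output, e.g. made with QTGMC settings like those in post #2):
    Code:
    import havsfunc as haf

    dec24 = haf.srestore(deint, frate=23.976, speed=1)   # decimated to film rate
    dec25 = haf.srestore(deint, frate=25, speed=1)       # kept at PAL rate
    dec24.set_output(0)   # step through panning shots here for judder...
    dec25.set_output(1)   # ...and compare against the 25fps version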
  7. Okay excellent, thanks for all the info. I'll watch it at both 23.976 and 25 and see how it looks. I suspect either way there'll be blending/ghosting artifacts still.
  8. It's supposed to be 23.976fps. It's an especially bad film to PAL transfer.
  9. Thanks manono. I just had it in my mind that the process must have been more complex than just film->PAL to create that much junk. Have you had experience with Madman DVDs before? I was reading this thread/post as you replied, so I might try those srestore settings suggested by ndjamena.

    Edit: There are still scene-change blends getting through.
    Now using (I use VapourSynth)
    Code:
    sres = haf.srestore(ditl, frate=23.976, speed=-1, thresh=6, mode=-2)
    as my srestore line. I wrote down a list of 5 scene-change blends (that remained after default srestore), and these settings fixed 2 of them.
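    (For reference, a full .vpy around that line might look roughly like the sketch below; the source loader and the TFF flag are placeholders, and the QTGMC options simply mirror killerteengohan's AviSynth settings from post #2.)
    Code:
    import vapoursynth as vs
    import havsfunc as haf
    core = vs.core

    # placeholder source load; use whichever source filter you already have
    src = core.d2v.Source('skyland.d2v')

    # double-rate deinterlace; TFF=True is an assumption, match it to the DVD's field order
    ditl = haf.QTGMC(src, Preset='Slow', MatchPreset='Slow', MatchPreset2='Slow',
                     SourceMatch=3, TR1=2, TR2=1, NoiseTR=2, Sharpness=0.1, TFF=True)

    # blend removal / decimation, i.e. the line above
    sres = haf.srestore(ditl, frate=23.976, speed=-1, thresh=6, mode=-2)

    sres.set_output()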
    Last edited by nacho; 12th Jun 2020 at 03:41.
  10. Yes, I have worked on multiple Madman DVDs. They are not all the same. None were this bad, but several did have too many blends to remove all of them without creating jittering.

    Originally Posted by nacho View Post
    Okay excellent, thanks for all the info. I'll watch it at both 23.976 and 25 and see how it looks. I suspect either way there'll be blending/ghosting artifacts still.
    You're not going to get all of them out of that source without taking out too many frames, replacing several frames, or destroying the detail and possibly your entire video, with an overly strong artifact cleaner/remover.

    I suppose if you wanted to put in the extra work, you could save every single frame as a PNG image with software, then edit them one by one in a photo editor like Photoshop and clean up every frame that has those artifacts in it. Then put them all back together as a video again. Prepare for MANY hours of work if you wish to go that route, though. I personally don't think the amount of work it would take is worth it, but that show's worth to you may be enough to do so. If something needs that many hours of work, I usually just wait for a better release or find a different source. If I don't, then I do the best I can with what I have and am happy enough knowing it's better than the source was, even if it's not perfect. While you may still have some blends left over, you also fixed and removed several of them, so there are now far fewer than before; it's better than what you had.
    Last edited by killerteengohan; 12th Jun 2020 at 20:28.
  11. Nope, don't think I'll be doing that....

    I am hoping to be able to remove these scene-change blends, if that's the right terminology... They seem to occur in pairs: the frame before the scene change has the next frame blended into it, and the first frame of the new scene has the previous one blended into it. Here are 3 examples; the screenshots are taken after QTGMC. Srestore with the settings I adjusted above removes both of the blends in the first case; in the other two it only removes the first blend, and the second remains.
    [Attached screenshots (after QTGMC): frame pairs 19198/19199, 19922/19923 and 24264/24265 from skyland_simple.vpy]
  12. There are AviSynth filters that detect scene changes and replace them with a copy of the frame before or after. Or sometimes you'll find that you can replace just the chroma.

    I think it's johnmeyer or manono that's mentioned them many times.
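
    In VapourSynth, the replace-just-the-chroma idea might look something like this (a rough sketch; 'clp' and the frame numbers in the mapping are placeholders):
    Code:
    import vapoursynth as vs
    core = vs.core

    # keep each listed frame's luma, but take its chroma from the previous frame
    prev = clp[0:1] + clp[:-1]   # the clip delayed by one frame
    chroma_fixed = core.std.ShufflePlanes([clp, prev, prev], planes=[0, 1, 2],
                                          colorfamily=vs.YUV)
    out = core.remap.Rfs(baseclip=clp, sourceclip=chroma_fixed,
                         mappings="19199 19923")   # placeholder frame numbers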
    Last edited by jagabo; 12th Jun 2020 at 22:55.
  13. You forget already? You wrote most of the filter after I explained what I was trying to do. And I use it almost daily (very bad scene changes because of field-blended sources).

    Here is FreezeFrameMC:

    #FreezeFrame(clip clip, int first-frame, int last-frame, int source-frame)

    ### This function FreezeFrames 1, 2, 3 or 4 frames either before (B) or after (A) a scene change
    ### Then one appropriate frame will be interpolated to get rid of the slight 'stutter'
    ### duplicate frames might create. The interpolator is Gavino's 'FixBadFrames' filter,
    ### included here. Required are MVTools2.dll, RemapFrames.dll, and the FreezeFrameMC.avs.
    ### Much thanks go out to jagabo at videohelp.com who pointed me in the right direction
    ### when I immediately got stuck (having never written a function before).

    ###Usage FFA2(242) - Freezes 2 frames after a scene change beginning with frame number 242 immediately after the change with the second one interpolated
    ###Usage FFB1(397) - Freezes and interpolates 1 frame before a scene change using frame number 397, the one just before the scene change



    function FixBadFrames(clip c, string frames) {
    # Replace each frame from a list of 'bad' frames by using MFlowInter to interpolate
    # between the nearest pair of 'good' frames
    c
    sup = MSuper()
    bv = MAnalyse(sup, isb=true, delta=2)
    fv = MAnalyse(sup, isb=false, delta=2)
    candidates = MFlowInter(sup, bv, fv, time=50.0, ml=100).DuplicateFrame(0)
    ReplaceFramesSimple(candidates, mappings=frames)
    }

    #Usage FixBadFrames("34 64 96")

    ###Freezes the two frames before a scene change and then interpolates the first of the two
    function FFB1(clip Source, int N)
    {
    FreezeFrame(Source, N-1,N,N-1)
    FixBadFrames(String(N-1))
    }

    ###Freezes the two frames after a scene change and then interpolates the second of the two
    function FFA1(clip Source, int N)
    {
    FreezeFrame(Source, N,N+1,N+1)
    FixBadFrames(String(N+1))
    }

    ###Freezes the three frames before a scene change and then interpolates the first of the three
    function FFB2(clip Source, int N)
    {
    FreezeFrame(Source, N-2,N,N-2)
    FixBadFrames(String(N-2))
    }

    ###Freezes the three frames after a scene change and then interpolates the third of the three
    function FFA2(clip Source, int N)
    {
    FreezeFrame(Source, N,N+2,N+2)
    FixBadFrames(String(N+2))
    }

    ###Freezes the four frames before a scene change and then interpolates the fourth of the four
    function FFB3(clip Source, int N)
    {
    FreezeFrame(Source, N-3,N,N-3)
    FixBadFrames(String(N-3))
    }

    ###Freezes the four frames after a scene change and then interpolates the fourth of the four
    function FFA3(clip Source, int N)
    {
    FreezeFrame(Source, N,N+3,N+3)
    FixBadFrames(String(N+3))
    }


    This thing is entirely manual, though. If you want an automatic one, it'll FreezeFrame both before and after every scene change, whether you want it to or not. You have to choose how strong you want it. Too strong and it might work where you don't want it to:

    # freeze before+after scenechange. Needs v0.9 of RemoveDirt.dll
    # 'restore' here is the deblended clip (e.g. the srestore output)
    prev = restore.selectevery(1,-1)
    next = restore.selectevery(1,1)
    SCclean = restore.SCSelect(next,prev,restore,dfactor=2.0) # 2.0 ~ 5.0

    return(SCclean) # return(restore) for NO scenechange cleanup


    Didée at Doom9 made this one once upon a time.
  14. So if my scene changes go like this:

    O = old scene
    N = new scene
    b = blended

    Code:
    +---+---+----+----+---+---+
    | O | O | Ob | Nb | N | N |
    +---+---+----+----+---+---+
    | 1 | 2 | 3  | 4  | 5 | 6 |
    +---+---+----+----+---+---+
    Where Ob has Nb blended into it, and Nb has Ob blended into it. It seems to me that your script doesn't deal with this case, as it's 1 frame before and 1 frame after the scene change? Or am I misunderstanding how FFA/FFB functions work?

    Edit: Ah I think I see, you need 2 calls for every scene change: FFB1(3) and FFA1(4) in my example above, is this right?
  15. Again, it's manual. FFB1(frame#) takes the frame before the scene change and tosses (removes, deletes) it. It uses the frame before that as the new frame before the scene change and the frame before that is a new frame interpolated from the 2 frames on either side.
    Originally Posted by nacho View Post
    ...you need 2 calls for every scene change: FFB1(3) and FFA1(4) in my example above, is this right?
    Yes, in my own experience I don't always need to fix the frames on both sides of the scene change. I'm sure it can be rewritten for your own needs - fix both sides of a scene change in one call. You say you only need to fix a single frame while I often need to use it on as many as 3 or 4 frames before or after a scene change. If you always want them both fixed, you can try the second little script I showed. It replaces the bad frames at the scene change with copies of the previous and next frames. It should work pretty well with animations.
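
    Worked through on the six-frame example above, FFB1(3) + FFA1(4) should leave roughly this (* marks the newly interpolated frames):
    Code:
    +---+----+---+---+----+---+
    | O | O* | O | N | N* | N |
    +---+----+---+---+----+---+
    | 1 | 2  | 3 | 4 | 5  | 6 |
    +---+----+---+---+----+---+
    Both blended frames are gone, and motion runs at roughly half speed for the two frames on either side of the cut.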
  16. Okay, perhaps I'll dial back the srestore threshold and then try to get the scene changes with that SCSelect script. I'm just worried it will dupe frames I don't want duped. Then, if I still don't like what that does, I will try to get scene changes with maybe scxvid and manually fix the ones I want with FreezeFrameMC.

    Edit: Hang on a minute, I don't want duped frames at every scene change, only the troublesome blended ones. Looks like I'm going the more manual route.

    I'm trying to write your script out in Python/VapourSynth. In this line in FixBadFrames:
    Code:
    candidates = MFlowInter(sup, bv, fv, time=50.0, ml=100).DuplicateFrame(0)
    There are only 3 clips passed to MFlowInter but it takes 4; is it just using 'c' as the source clip? Sorry, I started with VapourSynth and never learnt AviSynth.
    I worked it out: when 'c' is written on its own line, last = c, and whenever a clip argument is missing it will use last (i.e. c).

    One other question: the DuplicateFrame(0) causes the VapourSynth ReplaceFramesSimple to error with "Clip lengths don't match". Do clip lengths not need to match in the AviSynth RFS, or what's the reason for the DuplicateFrame(0)? I don't actually understand why duplicating frame 0 is required.

    Here's what I have for it so far (now with the DuplicateFrames offset included)
    Code:
    def FixBadFrames(clp, frames):
    	sup = core.mv.Super(clip=clp)
    	bv = core.mv.Analyse(super=sup, isb=True, delta=2)
    	fv = core.mv.Analyse(super=sup, isb=False, delta=2)
    	# interpolate, then shift by one frame so the interpolation of n-1/n+1 lands on frame n
    	candidates = core.mv.FlowInter(clip=clp, super=sup, mvbw=bv, mvfw=fv).std.DuplicateFrames(0)
    	# drop the extra frame added by DuplicateFrames so the clip lengths match for Rfs
    	candidates = candidates[0:candidates.num_frames-1]
    	return core.remap.Rfs(baseclip=clp, sourceclip=candidates, mappings=frames)
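    It's called the same way as the AviSynth version, e.g. (frame numbers straight from Gavino's usage comment, purely illustrative):
    Code:
    fixed = FixBadFrames(clip, "34 64 96")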
    And this is for using FreezeFrameMC on an arbitrary number of frames before and after a scene change:
    Code:
    def FFMC(clp, before=0, after=0, sc=0):
    	if not isinstance(clp, vs.VideoNode):
    		raise vs.Error('FreezeFrameMC: This is not a clip')
    	
    	if sc == 0:
    		raise vs.Error('FreezeFrameMC: Specify non zero scene-change frame')
    	
    	if before and after:
    		frz = core.std.FreezeFrames(clip=clp, first=[sc-before-1, sc], last=[sc-1, sc+after], replacement=[sc-before-1,sc+after])
    		fix = FixBadFrames(frz, str(sc-before-1)+ ' ' + str(sc+after))
    	elif before:
    		frz = core.std.FreezeFrames(clip=clp, first=[sc-before-1], last=[sc-1], replacement=[sc-before-1])
    		fix = FixBadFrames(frz, str(sc-before-1))
    	elif after:
    		frz = core.std.FreezeFrames(clip=clp, first=[sc], last=[sc+after], replacement=[sc+after])
    		fix = FixBadFrames(frz, str(sc+after))
    	else:
    		raise vs.Error('FreezeFrameMC: Specify \'before\' or \'after\'')
    	
    	return fix
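    So a blend pair like frames 19198/19199 in the screenshots above should be a single call, with sc pointing at the first frame of the new scene ('clip' stands for the deinterlaced clip, and the frame number is just for illustration):
    Code:
    fixed = FFMC(clip, before=1, after=1, sc=19199)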
    Last edited by nacho; 13th Jun 2020 at 21:12.
  17. For manual replacement you can use:

    Code:
    ######################################################
    
    function ReplaceFrameNext(clip Source, int N)
    {
      # Replace frame at N with frame at N+1
      # N is the frame to replace
      # with frame at N+1
    
      loop(Source, 0, N, N)
      loop(last, 2, N, N)
    }
    
    ######################################################
    
    function ReplaceFramePrev(clip Source, int N)
    {
      # Replace frame at N with frame at N-1
      # N is the frame to replace
      # with frame at N-1
    
      loop(Source, 0, N, N)
      loop(last, 2, N-1, N-1)
    }
    
    ######################################################
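    In VapourSynth the same thing can be done with plain clip splicing; a rough equivalent (a sketch, assuming 'clp' is the clip being filtered):
    Code:
    def ReplaceFrameNext(clp, n):
        # replace frame n with a copy of frame n+1
        return clp[:n] + clp[n+1:n+2] + clp[n+1:]

    def ReplaceFramePrev(clp, n):
        # replace frame n with a copy of frame n-1
        return clp[:n] + clp[n-1:n] + clp[n+1:]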
  18. That FreezeFrame thing sounds nice for certain things. With it replacing frames with a duplicate of the previous or next one, can't things become skippy-looking as it plays, from random longer pauses due to duplicate frames? I would think that instead of smooth motion, it will quickly look like it's buffering or freezing up, then continue to play, every time a replacement is put in that wasn't in the same position.
  19. Originally Posted by killerteengohan View Post
    That FreezeFrame thing sounds nice for certain things. With it replacing frames with a duplicate of the previous or next one, can't things become skippy-looking as it plays, from random longer pauses due to duplicate frames? I would think that instead of smooth motion, it will quickly look like it's buffering or freezing up, then continue to play, every time a replacement is put in that wasn't in the same position.
    Yes. But for single frames at scene changes it's not very noticeable. Especially with animated material which has lots of duplicates anyway.
  20. Originally Posted by killerteengohan View Post
    That FreezeFrame thing sounds nice for certain things. With it replacing frames with a duplicate of the previous or next one, can't things become skippy-looking as it plays, from random longer pauses due to duplicate frames? I would think that instead of smooth motion, it will quickly look like it's buffering or freezing up, then continue to play, every time a replacement is put in that wasn't in the same position.
    Doesn't manono's script interpolate a new frame to replace the duplicated one to eliminate this?

    And jagabo, this doesn't seem to have duplicate frames during motion. You're right, it's probably almost unnoticeable, but I'd still like to properly implement manono's interpolation thingo in VapourSynth.
    Last edited by nacho; 13th Jun 2020 at 09:40.
  21. Originally Posted by nacho View Post
    Originally Posted by killerteengohan View Post
    That FreezeFrame thing sounds nice for certain things. With it replacing frames with a duplicate of the previous or next one, can't things become skippy-looking as it plays, from random longer pauses due to duplicate frames? I would think that instead of smooth motion, it will quickly look like it's buffering or freezing up, then continue to play, every time a replacement is put in that wasn't in the same position.
    Doesn't manono's script interpolate a new frame to replace the duplicated one to eliminate this?
    That won't work at scene changes though. And it doesn't work well with animated material, except for panning shots.
  22. Why would it not work for scene changes? Manono even said he uses it daily for scene changes.

    Edit: Although I can see now the interpolation doesn't work well...
    Last edited by nacho; 13th Jun 2020 at 10:10.
  23. I meant motion interpolation through a scene change doesn't work. Manono's motion interpolation before or after a scene change might look ok, depending on the material.
  24. Hmm I just tried it: https://slow.pics/c/lXE3Ogw7

    So provided my implementation works as intended...this isn't really a viable option xD.
    Perhaps that call to DuplicateFrame is required to offset the interpolated clip by one frame for some reason? In that case I need to dupe the first frame and delete the last one, to keep the clip the same length. Or just grab the previous frame instead, which is easier...
    Last edited by nacho; 13th Jun 2020 at 21:03.
  25. I believe manono's method duplicates the previous frame over the blended frame, then interpolates motion between that duplicate and two (or more) frames before. So if N is the last frame of one shot, but with blending, it is replaced with a copy of N-1, then motion is interpolated between N-2 and the new N. The result is essentially slow (half-speed) motion for the last two frames. Then he has versions that cover a larger number of frames, and frames after rather than before.
  26. Your description of what it does is correct for FFA1 and FFB1, as was my earlier explanation. My explanation in the function itself is incorrect; I only noticed that after posting it. Used on live action for a single frame it's absolutely unnoticeable. Because, as you mentioned, the interpolation takes place at half speed, I very, very rarely notice any interpolation artifacts. It should be the same for animations because, as you mentioned, they have so many dupe frames that they play pretty jerkily to begin with.

    When I'm using it on multiple frames there will be dupe frames. FFB3, for example, will have 2 dupe frames. Gavino's FixBadFrames function used for the interpolation only works on a single frame.
    Originally Posted by nacho View Post
    Hmm I just tried it: https://slow.pics/c/lXE3Ogw7

    So provided my implementation works as intended...this isn't really a viable option xD.
    Are you writing of the white line around the middle picture? If so, I'd say whatever you did broke it.
    Last edited by manono; 13th Jun 2020 at 17:45.
  27. No, I am referring to the serious warping of stuff, like the corner of the box and the right side of the mother around the house. The colour-changing border is a 'feature' of slow.pics, I think.
    Anyway, that DuplicateFrame(0) call is required to offset the interpolated clip. I did it again and it works much better now; I'll fix my code above.
    https://slow.pics/c/1yW7Qsfx
    Also, manono, is FixBadFrames very slow if it is called many times? Would it be better to do the MVTools stuff globally once, then just call RemapFrames for the required frames?

    I see that doesn't work as you need to interpolate after the freezeframe/frame replacement.
    Last edited by nacho; 13th Jun 2020 at 21:48.
  28. Originally Posted by nacho View Post
    Also, manono, is FixBadFrames very slow if it is called many times?
    As far as I know, it's okay. However, FreezeFrameMC itself has a memory leak or something, as eventually you'll run out of memory if you call it enough times. At least I do, on 32-bit Windows 7. I call it about 75 times (partly depending on what else is in the script) and I'm done for that section. Once it happens, I encode that section losslessly (Lagarith), start on the next section, and rejoin the pieces when all done.

    But I work with long movies and you're working on episodes (?). So it shouldn't happen to you. I've never noticed a slowdown when working with that function, or at least nothing serious.
  29. What's the reason you use Lagarith? Is there anything wrong with a lossless x264 encode (CRF 0)?

    Also, some QTGMC settings cause srestore to miss or find more blends. I've tried a few different QTGMC settings, with differing success in srestore's blend detection. Is there any logic to choosing QTGMC settings that lets srestore pick up as many blends as possible?

    It seems that 'ChromaMotion' in QTGMC makes a fairly large difference: toggling it on and off with the same other settings causes srestore to miss and detect different blends. I have an entire episode encoded losslessly both with ChromaMotion on and off; how can I find the frames which are different? If I can do that, then I can just replace the blends that one missed with the frames from the other. This way I'm using 'good' frames where they exist. Finally, I can watch all the scene changes and use FreezeFrameMC on the blends that srestore still didn't get.

    There are a few deduplication filters for AviSynth, but I don't know if they can do what I want: interleave the two videos (which are mostly near-identical), remove BOTH frames when they're duplicates, and leave both when they differ. Then I can go through, look at all the differing pairs, and determine which is the cleaner one. Any ideas for how to do this?
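
    Edit: one idea I might try for finding the differing frames is comparing the two lossless encodes frame by frame with PlaneStats and just logging where they differ (a rough sketch; the source filter, filenames and threshold are placeholders):
    Code:
    import vapoursynth as vs
    core = vs.core

    # the two lossless encodes; filenames are placeholders
    a = core.lsmas.LWLibavSource('chromamotion_on.mkv')
    b = core.lsmas.LWLibavSource('chromamotion_off.mkv')

    # PlaneStatsDiff = normalized average absolute luma difference between the two clips per frame
    stats = core.std.PlaneStats(a, b)
    for n in range(stats.num_frames):
        if stats.get_frame(n).props['PlaneStatsDiff'] > 0.005:   # threshold is a guess; tune it
            print(n)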
    Last edited by nacho; 14th Jun 2020 at 20:21.
  30. In my opinion, the only thing wrong with a lossless CRF 0 encode is that the file size can easily be 5x+ the size of the source video. It's needlessly large. If you are willing to use up that much space, then I guess there's nothing wrong with CRF 0 encoding.


