VideoHelp Forum
  1. I'm trying to remove all the duplicates from this animated series, which is difficult because a large number of duplicates score higher than many legitimate non-duplicate frames, such as slow pans or drifting mist/clouds. The reason is probably that this cartoon is full of warping, which I cannot fix.

    But I noticed that this doesn't have to be so difficult, because a cartoon essentially has 3 kinds of framerates: full, half and 1/3. The dup scores for these are rather predictable on a graph. When there's a duplicate every second frame, the graph shows spikes: legit frames score many times higher than the duplicates, so the duplicates can be filtered on that basis, and legit frames with small values won't disappear. But I don't know how to do this for scenes where only every third frame is legit:

    36.6219 at -32,288 10.38388908
    3.5268 at (0,320) 0.541460045
    6.5135 at -224,384 0.188867174
    34.4872 at -160,320 7.103439753

    The fourth column is an Excel formula dividing each value by the next one. The threshold I set is to delete any frame with a ratio lower than 0.5, that is, any frame that's succeeded by one with at least twice its value. Frames 2 and 3 should be deleted, but only the third one will be using this.
    Can anyone think of an Excel formula that'll take 3 frames into account instead of 2?

    This is only a test run btw and I'm not sure what will result, I might have to change the threshold.
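    For what it's worth, the three-frame idea can also be expressed outside Excel: treat a frame as a duplicate when its score is small relative to the larger of the next two scores, so a legit frame sitting between two duplicates still survives. A minimal Python sketch using the four sample values above (the 0.5 threshold is from the post; the max-of-the-next-two rule is an assumption, not something DeDup itself provides):

```python
# Dup/DeDup difference scores for consecutive frames (sample values above).
scores = [36.6219, 3.5268, 6.5135, 34.4872]

THRESHOLD = 0.5  # same cutoff as the Excel test

# Compare each score to the LARGER of the next two scores, so that a
# single legit frame between two duplicates is not wiped out.
duplicates = []
for i, s in enumerate(scores):
    lookahead = scores[i + 1:i + 3]
    if not lookahead:
        break  # nothing to compare the last frame against
    if s / max(lookahead) < THRESHOLD:
        duplicates.append(i + 1)  # 1-based frame numbers, as in the post

print(duplicates)  # → [2, 3]
```

    On this sample it flags exactly frames 2 and 3, matching what the two-frame ratio alone cannot do.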
  2. Do you know about Dup() and DeDup()? Keep in mind that exact duplicate frames cost almost nothing in any inter-frame encoding.
  3. If those are typical numbers, that's a pretty big margin; you should be able to adjust the settings within DeDup (there should be no reason to use Excel). Or is this question more about Excel?
  4. jagabo, I have only used DeDup. And the claim that duplicates cost almost nothing is completely false. Removing 50% of the frames (all of them exact duplicates) decreased the bitrate by 25% in 10 different animated videos I tested.

    Originally Posted by poisondeathray
    If those are typical numbers, that's a pretty big margin; you should be able to adjust the settings within DeDup (there should be no reason to use Excel). Or is this question more about Excel?
    It's more a question about Excel or any other program/method that might help manage DeDup's data, because DeDup alone can't do anything here. Check this scene for example: every second frame is a duplicate, but the duplicates have values as high as 10.
    Setting the threshold that high would remove all these duplicates but then kill all the non-duplicates in other scenes with values far below that, so this is useless. What I need is a way to kill any small value consistently surrounded by big values, which is easily done by dividing the first number by the second. This solves this scene's problem and would leave alone slow panning scenes with small values that aren't surrounded by big ones.

    frm 1441 diff from frm 1442 = 35.9146 at "-256,288" 6.475182548
    frm 1442 diff from frm 1443 = 5.5465 at "(608,0)" 0.179759003
    frm 1443 diff from frm 1444 = 30.8552 at "-320,256" 10.61629507
    frm 1444 diff from frm 1445 = 2.9064 at "-256,160" 0.111036401
    frm 1445 diff from frm 1446 = 26.1752 at "(384,96)" 4.630484008
    frm 1446 diff from frm 1447 = 5.6528 at "(320,96)" 0.216141811
    frm 1447 diff from frm 1448 = 26.1532 at "(384,96)" 5.487452791
    frm 1448 diff from frm 1449 = 4.766 at "-512,320" 0.209058011
    frm 1449 diff from frm 1450 = 22.7975 at "-320,256" 7.026073289
    frm 1450 diff from frm 1451 = 3.2447 at "-64,288" 0.190551976
    frm 1451 diff from frm 1452 = 17.0279 at "(384,96)" 5.253254766
    frm 1452 diff from frm 1453 = 3.2414 at "-256,288" 0.200646248
    frm 1453 diff from frm 1454 = 16.1548 at "(384,96)" 5.420346262
    frm 1454 diff from frm 1455 = 2.9804 at "-320,256" 0.228211764
    frm 1455 diff from frm 1456 = 13.0598 at "-320,256" 11.73492677
    frm 1456 diff from frm 1457 = 1.1129 at "-320,192" 0.159372762
    frm 1457 diff from frm 1458 = 6.983 at "-256,160" 3.499373591
    frm 1458 diff from frm 1459 = 1.9955 at "(384,96)" 0.024428432
    frm 1459 diff from frm 1460 = 81.6876 at "-224,160" 29.42318914
    frm 1460 diff from frm 1461 = 2.7763 at "-320,256" 0.121364592
    frm 1461 diff from frm 1462 = 22.8757 at "-320,320" 18.77519698
    frm 1462 diff from frm 1463 = 1.2184 at "-160,192" 0.029882227
    frm 1463 diff from frm 1464 = 40.7734 at "-320,288" 7.880288359
    frm 1464 diff from frm 1465 = 5.1741 at "(0,352)" 0.136040238
    frm 1465 diff from frm 1466 = 38.0336 at "-352,224" 20.69743143
    frm 1466 diff from frm 1467 = 1.8376 at "(0,288)" 0.045935177
    frm 1467 diff from frm 1468 = 40.0042 at "-320,160" 17.44927157
    frm 1468 diff from frm 1469 = 2.2926 at "(0,352)" 0.039427047
    frm 1469 diff from frm 1470 = 58.1479 at "-320,128" 45.35717629
    frm 1470 diff from frm 1471 = 1.282 at "(192,32)" 0.021224533
    frm 1471 diff from frm 1472 = 60.4018 at "-352,160" 20.98960976
    frm 1472 diff from frm 1473 = 2.8777 at "(0,384)" 0.057400397
    frm 1473 diff from frm 1474 = 50.1338 at "(384,96)" 11.24858085
    frm 1474 diff from frm 1475 = 4.4569 at "(192,32)" 0.096459049
    frm 1475 diff from frm 1476 = 46.2051 at "-320,128" 22.12675989
    frm 1476 diff from frm 1477 = 2.0882 at "(352,64)" 0.052623223
    frm 1477 diff from frm 1478 = 39.6821 at "-320,128" 20.97362579

    But it's the parts with the 2-duplicates-1-legit pattern that I'm trying to figure out a way to fix, because this method won't work with those.
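    The every-second-frame pattern above is exactly what the divide-by-the-next-score trick catches. A hedged Python sketch that parses a few DeDup-style log lines in the format shown and flags any frame whose score is less than half of the next one (the regex and the 0.5 cutoff are assumptions based on this thread, not part of DeDup itself):

```python
import re

# A few DeDup-style log lines in the format shown above.
log = """\
frm 1441 diff from frm 1442 = 35.9146 at "-256,288"
frm 1442 diff from frm 1443 = 5.5465 at "(608,0)"
frm 1443 diff from frm 1444 = 30.8552 at "-320,256"
frm 1444 diff from frm 1445 = 2.9064 at "-256,160"
frm 1445 diff from frm 1446 = 26.1752 at "(384,96)"
"""

pattern = re.compile(r'frm (\d+) diff from frm \d+ = ([\d.]+)')
frames = [(int(m.group(1)), float(m.group(2)))
          for m in pattern.finditer(log)]

# Flag a frame's row as a duplicate when its score is less than half
# the score on the next row (ratio current/next < 0.5).
dupes = [frames[i][0]
         for i in range(len(frames) - 1)
         if frames[i][1] / frames[i + 1][1] < 0.5]

print(dupes)  # → [1442, 1444]
```

    On the full scene above this flags every second row, which is the behaviour described, without touching rows whose neighbours have similarly small scores.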
  5. Did you try the other usual approaches, manipulating the data before DeDup?

    e.g. run DeDup on a denoised or temporally stabilized version and apply the results to the original. For example, if noise or warping makes 2 duplicates "look different", denoising or heavy temporal stabilizing should increase the accuracy.
  6. Originally Posted by -Habanero-
    jagabo, I have only used DeDup. And the claim that duplicates cost almost nothing is completely false. Removing 50% of the frames (all of them exact duplicates) decreased the bitrate by 25% in 10 different animated videos I tested.
    You did not remove exact duplicate frames; you removed frames which differed by only a small bit. If you replace all the near duplicates with exact duplicates you will get a difference of maybe 1 percent.
  7. Originally Posted by poisondeathray
    Did you try the other usual approaches, manipulating the data before DeDup?

    e.g. run DeDup on a denoised or temporally stabilized version and apply the results to the original. For example, if noise or warping makes 2 duplicates "look different", denoising or heavy temporal stabilizing should increase the accuracy.
    That's what I usually do but it won't work here. Temporal denoising/smoothing won't fix the warping and in addition to that will just screw up legitimate frames in slow pans or cloud movements.

    Originally Posted by jagabo
    You did not remove exact duplicate frames; you removed frames which differed by only a small bit. If you replace all the near duplicates with exact duplicates you will get a difference of maybe 1 percent.
    You are completely wrong, I did remove exact duplicates. I know that because I used ExactDeDup instead of DeDup. It was an RGB32 video, which DeDup can't process.
  8. Not sure how to do it in Excel or if it will even be accurate enough. What about less than 0.6 for the Excel threshold? How are the other scenes?

    Can you describe the warping: pattern, frequency, areas of the frame affected?

    DeDup only checks 1 condition, but you might be able to set multiple conditions with WriteFileIf() and the runtime functions. You have to satisfy several logical conditions for it to write the frame number.
  9. Yeah, looks like I'll have to play around with Excel a bit more or ask Microsoft.

    What about less than 0.6 for the Excel threshold? How are the other scenes?
    You missed the point: the rightmost column is simply the dup value divided by the next frame's value. This nullifies the "spikes" in parts where every second frame is legit, while not touching scenes with no duplicates, even if they're all tiny values. 10 frames all with a dup value of 0.2 would each become 1.0, which is far above 0.5 or 0.6. But in scenes where only every third frame is unique, a run like 30 3 2.5 40 becomes 10 1.2 0.0625.
    A threshold of 0.5 would kill only the third frame (the second duplicate) but leave the first duplicate intact. Raising it to 1.2 would kill numerous legit frames.

    DeDup only checks 1 condition, but you might be able to set multiple conditions with WriteFileIf() and the runtime functions.
    But I imagine this would be much more complicated than fixing the DeDup log. Setting conditions such as brightness differences and the like would reintroduce the problem of a blurry threshold. I like DeDup's condition very much; I only wish that, instead of a threshold, it recognized patterns like a spike every second or third frame and deduplicated accordingly.

    Whatever, I'll figure something out.
  10. Yes, WriteFileIf() will be more complicated than adjusting a DeDup log, but setting multiple conditions will definitely make it much more accurate. Also, it will only give you the frame numbers; it won't give you the VFR timecodes.
  11. So I tried the =A1/(A2+A3) formula, which works on the 1/3-FPS parts and, surprisingly, on the other parts as well. This got rid of 30% of the frames and lowered the file size by 20%, jagabo.

    Unfortunately, this formula isn't effective enough. It always deletes the 2 frames before a scene change, which are often not duplicates, and I overlooked one serious problem: there are scenes where the background runs at full FPS while a large object moves at half FPS. This too produces "spikes", and the frames where only the mostly-covered background moves are deleted.
    I will need to rethink how much bigger a spike must be to qualify as one, after Microsoft customer service gives me a tip on how to better detect spikes.

    The good news is this method doesn't really affect scenes with little or no movement. Only like one legitimate frame is randomly removed per scene.
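    The =A1/(A2+A3) rule above can be checked outside Excel too: each score is divided by the sum of the next two, which stays low through a run of two consecutive duplicates and high on legit frames. A small Python sketch on made-up values following the 1/3-FPS pattern discussed earlier in the thread (the values and the 0.5 cutoff are illustrative assumptions, not measurements):

```python
# Made-up scores in the 1/3-FPS pattern: one legit frame (big score)
# followed by two duplicates (small scores), repeating.
values = [30.0, 3.0, 2.5, 40.0, 3.1, 2.6, 41.0]

# =A1/(A2+A3): divide each score by the sum of the next two scores.
flagged = [i for i in range(len(values) - 2)
           if values[i] / (values[i + 1] + values[i + 2]) < 0.5]

print(flagged)  # → [1, 2, 4]  (0-based indices of flagged duplicates)
```

    Both duplicates in each trio fall well below 0.5 while legit frames stay far above it, which is consistent with the formula also behaving on half-FPS scenes.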
  12. Animation is not my thing, so I suppose I can be educated on this.

    Why would anyone want to remove duplicates? Aren't they there to give the animation the pace the director wanted?

  13. Originally Posted by newpball
    Animation is not my thing, so I suppose I can be educated on this.

    Why would anyone want to remove duplicates? Aren't they there to give the animation the pace the director wanted?

    The video is converted to VFR (variable frame rate) after the duplicates are removed. The pace remains the same and the compression ratio is increased.
  14. Oops, wrong thread.
  15. Originally Posted by -Habanero-
    Originally Posted by newpball
    Animation is not my thing, so I suppose I can be educated on this.

    Why would anyone want to remove duplicates? Aren't they there to give the animation the pace the director wanted?

    The video is converted to VFR (variable frame rate) after the duplicates are removed. The pace remains the same and the compression ratio is increased.
    With motion-compensated prediction in inter-frame compression, you really gain no bitrate benefit of VFR over CFR (for the same quality), and you incur motion aliasing artifacts (because of changing timebase) and player incompatibilities. All downsides and no upsides. Why put yourself (and others) through that?

    Scott
  16. Originally Posted by Cornucopia
    With motion-compensated prediction in inter-frame compression, you really gain no bitrate benefit
    But there is a limit on how many frames can be referenced, which is 16. Content such as animation and 2D video games benefits linearly from more refs. When you double the frame rate, 16 refs effectively become 8. Removing duplicates is incredibly beneficial.

    of VFR over CFR (for the same quality), and you incur motion aliasing artifacts (because of changing timebase) and player incompatibilities. All downsides and no upsides. Why put yourself (and others) through that?

    Scott
    I don't care about player incompatibilities; this is probably the least relevant argument to ever invoke. MKVs still have no mainstream support and sure as hell didn't 10 years ago, when they were STILL widely popular. H264 took forever to become widely hardware-supported. So glad I didn't waste time waiting and using shitty formats in the meantime.
  17. Originally Posted by -Habanero-
    But there is a limit on how many frames can be referenced, which is 16. Content such as animation and 2D video games benefits linearly from more refs. When you double the frame rate, 16 refs effectively become 8. Removing duplicates is incredibly beneficial.
    Great!
    I applaud any effort to maintain or improve quality!

    I take it that you folks use pretty high bit rates with tons of I-frames to keep the quality a-ok?

  18. Originally Posted by -Habanero-
    When you double the frame rate, 16 refs effectively become 8.
    Even if/when you use longer and/or open GOPs

    You know, "10 times the framerate" is not a hard-coded upper limit to x264...
  19. Originally Posted by newpball
    I take it that you folks use pretty high bit rates with tons of I-frames to keep the quality a-ok?
    Who are "you folks"? And more I-frames decrease the quality, actually.

    El Heggunte, who's talking about the I-frame interval? That has no noticeable effect on quality or compression unless it's extremely low. A keyint larger than the default has little benefit, and whoever told you otherwise is a clueless moron.
    The reference frame buffer the refs control, on the other hand, which sets the range of arbitrary frame reordering, affects some content greatly.
    Doubling the frames will limit the number of frames that can be strategically reordered in the buffer.
  20. newpball is talking about very high bitrates, where having more I-frames would increase the quality
  21. Have you tried using the Compare function to compare adjacent frames? Its text output provides like 5 or 6 different metric numbers for analysis.
  22. Originally Posted by poisondeathray
    newpball is talking about very high bitrates, where having more I-frames would increase the quality
    I wouldn't use lossy compression at all in such a case.

    Khaver, no, I am trying something else at the moment. But I will put that on my list.
    Last edited by -Habanero-; 8th Jun 2015 at 00:58.
  23. @ El Habanero: thanks for NOT answering my question

    P.S.: I'm leaving this thread, since the O.P. has finally confirmed he's a troll.
    Last edited by El Heggunte; 8th Jun 2015 at 05:22. Reason: add P.S.
  24. Originally Posted by El Heggunte
    @ El Habanero: thanks for NOT answering my question

    P.S.: I'm leaving this thread, since the O.P. has finally confirmed he's a troll.
    You did not ask a question; you made a statement, which I believe I addressed.
    I was considering completely ignoring your post because I didn't feel like discussing the tiniest technical details of x264.
    And look what I get for not doing so...
  25. Originally Posted by -Habanero-
    MKVs still have no mainstream support and sure as hell didn't 10 years ago, when they were STILL widely popular.
    Almost every recent TV or Blu-Ray player from one of the major consumer electronics brands is able to play MKV files, with some restrictions. (These devices impose some restrictions on every container file format they play.) What more is needed to qualify as mainstream support?
    Last edited by usually_quiet; 8th Jun 2015 at 11:21.
  26. Originally Posted by usually_quiet
    Almost every recent TV or Blu-Ray player from one of the major consumer electronics brands is able to play MKV files, with some restrictions. (These devices impose some restrictions on every container file format they play.) What more is needed to qualify as mainstream support?
    Keyword: recent. How long has MKV been out? How long has it been incredibly popular before it became hardware-compliant?
    What would be your argument in 2006? "It will be supported soon!"?
  27. Okay, this is too weird. I tried a new Excel formula on the DeDup log to process it in both directions, and it left me with an FPS of exactly 17.5, which from my experience is the average FPS you end up with when removing duplicates from a 24p cartoon. But I'm serious, it cut this 23.976 fps video to exactly 17.500.
    When you're as math-challenged as me, you don't believe this can be just a coincidence.
    But for the most part, it is. This method fixes the problem of legit frames being removed before a scene change, and fewer legit frames are removed in general, but it still happens occasionally.
    I could live with this.

    Khaver, does the compare function output timecodes?
  28. As far as I can tell, just frame numbers.
  29. Originally Posted by -Habanero-
    Originally Posted by usually_quiet
    Almost every recent TV or Blu-Ray player from one of the major consumer electronics brands is able to play MKV files, with some restrictions. (These devices impose some restrictions on every container file format they play.) What more is needed to qualify as mainstream support?
    Keyword: recent. How long has MKV been out? How long has it been incredibly popular before it became hardware-compliant?
    What would be your argument in 2006? "It will be supported soon!"?
    The MKV project started in 2002, although as far as I can tell, it is still unfinished. In 2015 the menu feature still hasn't been implemented, and there is no universally accepted standard governing metadata.

    I wouldn't say MKV was incredibly popular in 2006. Going by the posts I remember seeing at VideoHelp in those days, SD XVid or DivX video in an AVI container was far more popular than MKV.

    In 2006, many things were different. Most people did not have an LCD TV or a Blu-Ray player, and if they did, those devices did not play media files. There was limited support for SD XVid or DivX video in an AVI container among DVD players.

    I'm really more concerned about what is true about MKV support for today's consumer electronics. I would say that today, MKV support there is mainstream, but if you want to encode your video so that it won't play properly except with a PC, that's your prerogative.
  30. Originally Posted by usually_quiet
    The MKV project started in 2002, although as far as I can tell, it is still unfinished. In 2015 the menu feature still hasn't been implemented, and there is no universally accepted standard governing metadata.

    I wouldn't say MKV was incredibly popular in 2006. Going by the posts I remember seeing at VideoHelp in those days, SD XVid or DivX video in an AVI container was far more popular than MKV.
    I meant for H264. Most did not use AVI or MP4 for AVC video. Hell, some put Xvid into MKV as well, because AVI didn't support Xvid without hacks either.
    But you answered your own question. Xvid in AVI was insanely popular, and nothing except a PC would play it.
    That should tell you that people couldn't care less about hardware compliance. Nobody watches TV anymore, and people use Blu-ray players to watch actual Blu-ray movies, not highly-compressed substandard crap from the internet.


