I recorded a TV show off my over-the-air receiver, and it's a 720p 59.940 fps capture instead of the 1080i ones I'm used to.

I used SelectEven() to cut the frame rate in half to 29.970 fps. I know TDecimate() is used to remove duplicate frames, and at its defaults it drops one frame out of every cycle of five (the one it judges most duplicate-like), which would give me 23.976 fps, but I couldn't tell by looking at the source whether I even needed to.
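
Here's roughly what my script looks like, as a minimal sketch. I'm assuming FFVideoSource and a placeholder filename here, so swap in whatever source filter you actually use, and TDecimate() needs the TIVTC plugin loaded:

FFVideoSource("capture.ts")  # placeholder name; the 720p 59.940 fps capture
SelectEven()                 # keep every other frame -> 29.970 fps
TDecimate()                  # TIVTC defaults (cycle=5, cycleR=1): drops the most
                             # duplicate-looking frame in each group of 5 -> 23.976 fps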

I've linked a 30-second sample if anyone has a moment to take a look. I record several shows that are supposed to be 29.970 and some that are not. Is there a simple way to tell when it's necessary to decimate if you're too blind to see the duplicate frames yourself? lol
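
For reference, the kind of check I was attempting (a rough sketch, same assumed source filter and placeholder filename as above): subtract each frame from the one before it and stretch the levels, so exact duplicates come out as flat gray frames while real motion shows outlines:

a = FFVideoSource("capture.ts").SelectEven()  # back to the 29.970 fps clip
b = a.Trim(1, 0)                              # the same clip shifted forward one frame
Subtract(a, b).Levels(112, 1.0, 144, 0, 255)  # boost contrast so any difference jumps out

I think TDecimate(display=true) will also print its per-frame metrics right on the video, which might be an easier way to see what it considers a duplicate.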

Sample: http://www.mediafire.com/?lknyn4ivnxc9a2p