Hi all,
this is more of an observation than a question, but you may enjoy it anyway.

While reading a lot of postings on compression I found that people reported
extreme differences in encoding run time on their machines when encoding an
MPEG-2 file. For example, yesterday I compressed the first 32 minutes of
End of Days into SVCD NTSC Film. It took TMPGEnc six and a half hours to do
that. I used CQ 2520 kbps @ 100% quality. Last night I started a cruel
compression experiment to fit the whole movie on just ONE 700 MB CD-R! This
means an average video data rate of a bit more than 600 kbps and an audio
rate of 160 kbps. BUT this time I used 2-pass VBR with a 2500 kbps maximum.
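
In case anyone wants to check that budget, here is a rough back-of-the-envelope
sketch in Python (the ~123-minute runtime is my own assumption, and it ignores
muxing overhead):

    # Rough bitrate budget for fitting the whole movie on one 700 MB CD-R.
    # Assumptions (mine): ~123 min runtime, 160 kbps audio, no muxing overhead.
    disc_bits = 700 * 1024 * 1024 * 8    # usable disc capacity in bits
    runtime_s = 123 * 60                 # movie length in seconds
    total_kbps = disc_bits / runtime_s / 1000
    video_kbps = total_kbps - 160        # subtract the audio stream
    print(f"total ~{total_kbps:.0f} kbps, video ~{video_kbps:.0f} kbps")
    # -> total ~796 kbps, video ~636 kbps, i.e. a bit more than 600 kbps
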
I know that 2-pass doubles conversion time. TMPGEnc estimates the total time
for this job at 52 hours, which is eight times as long as my previous
32-minute segment took. End of Days is just over 2 hrs long, so this time
span is almost exactly 4 * 2 * 6.5.
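
The same kind of quick check works for the 52-hour estimate (treating the full
movie as roughly four of my 32-minute segments):

    # Why 52 hours is consistent with the 6.5-hour test encode.
    segment_hours = 6.5                # 32 minutes of video took 6.5 hours
    length_factor = 4                  # full movie ~ 4 x 32 minutes (assumed)
    passes = 2                         # 2-pass VBR doubles the work
    print(segment_hours * length_factor * passes)   # -> 52.0 hours
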
It appears that total compression time does not depend on the data rate
chosen, but is essentially constant on the same machine. Has anyone else made
that observation?
Logic would normally tell me that more extreme compression (like 600 kbps
instead of 2500) requires more number crunching than just doing a "little"
compression on the signal, but that does not seem to be the case.
Regards,
Ralf