Hello there -
I'm wondering if some programming/scripting genius out there can help me.
I do a lot of TV show captures, and I edit the commercials out with Virtual Dub.
I save the processing settings as a Virtual Dub .vcf file.
I WAS frameserving each episode individually, because if I had two half-hour episodes
in the same .avi file, with 2 separate .vdr files, CCE would encode the 1st one twice if I tried to add it to its queue.
So what I have been doing recently is this:
The .vdr file has the values of my cut points in frames, like this:
VirtualDub.subset.AddRange(54289,4048);
VirtualDub.subset.AddRange(60149,13823); <--- i.e., start @ frame 60149 for a duration of 13823 frames
VirtualDub.subset.AddRange(81345,19980);

So what I have done is this:
I add the duration to the start frame (60149 + 13823) to get 73972,
and use these values in CCE as a range to encode (start @ 60149 / stop @ 73972).

Can anyone please write me some sort of script that will do the math and put it into a format that I can paste into a CCE .ecl file? Or maybe one of those cool script-converter type programs, command-line driven, like coolapp.exe input.vdr output.ecl

[file]
name=E:\70s.AVI
type=0
frame_first=0
frame_last=108511
encode_first=54289
encode_last=58337

[file]
name=E:\70s.AVI
type=0
frame_first=0
frame_last=108511
encode_first=60149 <--- see here?
encode_last=73972 <---

[file]
name=E:\70s.AVI
type=0
frame_first=0
frame_last=108511
encode_first=81345
encode_last=101325
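
In case it helps others reading along, here is a rough sketch in Python of what such a converter could look like. It pulls the start/duration pairs out of the AddRange() lines, adds them to get encode_last, and prints [file] blocks in the format above. The AVI path and frame_last value aren't stored in the .vdr, so this sketch takes them as extra command-line arguments; the name vdr_to_ecl and the argument layout are just made up for illustration.

```python
import re
import sys

# Matches lines like: VirtualDub.subset.AddRange(60149,13823);
ADD_RANGE = re.compile(r"VirtualDub\.subset\.AddRange\((\d+),\s*(\d+)\)")

def vdr_to_ecl(vdr_text, avi_name, frame_last):
    """Turn AddRange(start, duration) calls into CCE .ecl [file] sections."""
    sections = []
    for match in ADD_RANGE.finditer(vdr_text):
        start, duration = int(match.group(1)), int(match.group(2))
        end = start + duration  # the math from the post: start + duration
        sections.append(
            "[file]\n"
            f"name={avi_name}\n"
            "type=0\n"
            "frame_first=0\n"
            f"frame_last={frame_last}\n"
            f"encode_first={start}\n"
            f"encode_last={end}\n"
        )
    return "\n".join(sections)

if __name__ == "__main__":
    # usage: python vdr2ecl.py input.vdr output.ecl E:\70s.AVI 108511
    in_path, out_path, avi_name, frame_last = sys.argv[1:5]
    with open(in_path) as f:
        text = f.read()
    with open(out_path, "w") as f:
        f.write(vdr_to_ecl(text, avi_name, int(frame_last)))
```

Run against the three AddRange lines above with frame_last=108511, this produces the three [file] blocks shown, including encode_first=60149 / encode_last=73972.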

I would be very grateful if anyone could help me out with this.
thanks a bunch.
Mole