This is a bit of a rant, sorry.

It bugs me when people say "Analog is Analog" as a dismissive statement, because it adds no value to a discussion of digitizing an analog signal.

An NTSC analog picture is continuous along the horizontal (width). The active picture portion of each line is about 52.666 microseconds long. It is like a string, not a dotted line. It has a length, not a pixel count.
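To see how that length turns into a pixel count once you sample it, here is a small sketch. It assumes the ITU-R BT.601 luma sampling clock of 13.5 MHz, which is the common choice for standard-definition capture; other capture cards use other clocks and get other widths.

```python
# Sketch: how many pixels a ~52.666 us analog line becomes at a given
# sampling rate. 13.5 MHz is the ITU-R BT.601 luma sampling clock
# (an assumption here; other hardware samples at other rates).
active_line_us = 52.666            # active picture width, in microseconds
sample_rate_mhz = 13.5             # samples per microsecond

samples = active_line_us * sample_rate_mhz   # us * MHz = sample count
print(round(samples))              # roughly 711 samples across the line
```

This is why SD capture widths cluster around 704-720 pixels: the string has a fixed length, and the clock decides how many dots you chop it into.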

Fair enough.

But, if you are going to use a digital computer on the analog signal, you must digitize it. You must break it into pixels. That is what capture is, that is what this forum is about.

Can you digitize and not lose significant detail of the analog signal? Well, all of the digital media today suggest yes. (DVD, Digital Cable/Satellite, D1)
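The reason those formats get away with it is the Nyquist criterion: sample faster than twice the signal's highest frequency and you capture all the detail the band-limited signal contains (quantization noise aside). NTSC luma is band-limited to roughly 4.2 MHz, so this sketch just checks a few common sampling clocks against that bar. The specific rates listed are illustrative examples, not an exhaustive survey.

```python
# Sketch of the Nyquist argument: NTSC luma is band-limited to
# roughly 4.2 MHz, so any sampling clock above 2 * 4.2 = 8.4 MHz
# can, in principle, capture the full detail of the analog signal.
luma_bandwidth_mhz = 4.2
nyquist_rate_mhz = 2 * luma_bandwidth_mhz    # 8.4 MHz

for rate in (13.5, 12.27, 8.0):              # example capture clocks
    verdict = "sufficient" if rate > nyquist_rate_mhz else "too slow"
    print(f"{rate} MHz: {verdict}")
```

So "Analog is Analog" is true right up until you notice that the analog signal itself has a finite bandwidth, which is exactly what makes lossless-in-principle sampling possible.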

Conclusion: Analog may be Analog, but if you are going to follow a standard method to digitize it, it can be compared to digital formats. It can be discussed in terms of pixels.



----------------
At the risk of being long-winded, I wanted to include an analogy to make this clearer.

Fractions are a more precise representation of a continuous quantity than decimals. Some fractions cannot be represented exactly in decimal notation. Digital computers store numbers in a format closer to decimals than to fractions. Does it trouble you that your fractions are not fully there on your computer? :P