VideoHelp Forum




  1. I was tired from a long day when I posted earlier, so I apologize for the unfocused writing, as well as some easily misunderstood wording.

    edDV brought up the resolution chart, which was where I got my (admittedly guesstimated) maximum resolution of 1000x700 for videotape. Let me try to clear up some confusion and hopefully add something worthwhile to this thread.

    The OP asked why film can be improved and used to create beautiful new digital editions, while videotaped sources show artifacts and bad quality. Let's return to that.

    Film is limited by grain size, no more, no less. All other factors (ISO, chemicals, etc.) relate directly to the grain in the film stock. In general, faster film has larger grain and slower film has smaller grain. Realize that the grain is NOT consistent in size and placement from frame to frame. Because of this, detail from one frame can be used to reconstruct lost detail in the following frame where a larger grain hid it. (This is on the order of 1-3 pixels out of 4000, so it is a minute improvement, but it DOES exist.)
    Through the use of filters, the original film source can be cleaned and resolution improved toward the maximum possible for the finest grain found on the film stock. (For example, if one frame resolves 1200 dpi, the next 1000 dpi, a third 1500, and a fourth 800, the overall film can be filtered and cleaned to a realistic 1300-1400 dpi per frame, allowing for a lowest common resolution across the entire film.)
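    The frame-to-frame grain idea can be sketched numerically. This is a minimal numpy illustration with made-up data (a synthetic gradient "scene" plus random Gaussian "grain" per frame), not any real restoration pipeline: because the grain lands in different places each frame, a temporal median across frames recovers a cleaner image than any single frame.

```python
import numpy as np

rng = np.random.default_rng(0)

# A "true" static scene, plus per-frame random grain (noise) in different places.
scene = np.tile(np.linspace(0.0, 1.0, 64), (64, 1))
frames = [scene + rng.normal(0.0, 0.1, scene.shape) for _ in range(8)]

# Because the grain differs frame to frame, a temporal median recovers
# detail that any single frame hides under its own grain.
restored = np.median(np.stack(frames), axis=0)

single_frame_error = np.abs(frames[0] - scene).mean()
restored_error = np.abs(restored - scene).mean()
assert restored_error < single_frame_error  # multi-frame estimate is cleaner
```

    The same principle (differently placed noise cancels across aligned frames) underlies real temporal grain-reduction filters, which add motion compensation on top.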

    This means that the average film has an attainable resolution of 2K to 4K. Now add the use of fractal scaling technology. Genuine Fractals ( http://www.ononesoftware.com/products/genuine_fractals.php ) is a photographic version I use in Photoshop on a regular basis. It allows you to scale UP images and actually gain resolution. (NOT detail, resolution. Check the site for examples to see the difference.)

    With that said, film can be used to generate a final resolution of at least 5K and up to 40K with NO artifacting, banding, etc. (If you couldn't read the text on a sign in the background, you STILL won't. But, the visible details will be appropriately scaled and look ideal.)

    So, film is an ideal source for incredibly beautiful and detailed HD products.

    Videotape, on the other hand, has very specific limitations, as described very well above by others. I have seen factory-produced videotapes where the perceived resolution was such that I could capture images with 700 vertical pixels and the image looked correct, not resized or stair-stepped (home video is severely limited due to cheaper hardware, etc.). I agree that 480-576 is a realistic videotape limit. I just used 700 as a MAXIMUM possible resolution.

    If we try to use fractal technology on a videotape source, the severe LACK of resolution limits the potential increase to about 2K at most. (Fractals use lines, curves, gradients, etc. to extrapolate the information required to fill in the expanded resolution. With only 480 pixels available, there isn't enough data to extrapolate accurately, resulting in artifacts and errors.) Think of it this way: at 480 lines, a rope in the frame is 1.5 pixels wide. It runs from the middle bottom to the upper right, shifting one pixel to the right for every 2 pixels it moves up. The poor recording quality does not provide a smooth, straight line, but a stair-stepped diagonal. The fractals see this as a straight line in one frame, getting it right and extrapolating correctly. But in the next frames, part of the rope is obscured, leaving only 1/3 visible. The fractals incorrectly interpret this as a staircase-shaped pattern and generate a series of much larger, sharply defined stair-steps. When we watch the result, the stair-steps are seen as artifacts due to the poor source.
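    The rope example can be reproduced in miniature. In the numpy sketch below (invented numbers; a naive nearest-neighbour upscale stands in for the failure mode, not for any fractal algorithm), a thin diagonal rasterized on a coarse grid is blown up 4x: every one-pixel jag in the source becomes a sharply defined four-pixel step in the output, which is exactly the stair-step artifact described above.

```python
import numpy as np

# Rasterize a thin diagonal "rope" on a coarse 24x24 grid: one pixel per row,
# shifting one column to the right every two rows (as in the example above).
coarse = np.zeros((24, 24))
for y in range(24):
    coarse[y, y // 2] = 1.0

# Naive 4x upscale (nearest neighbour): the jagged 1-pixel steps in the
# source become sharply defined 4-pixel steps in the output.
up = coarse.repeat(4, axis=0).repeat(4, axis=1)

assert up.shape == (96, 96)
# The horizontal run of the line in an output row is 4x the source's:
# the upscale invented no smoothness, it only enlarged the stair-steps.
assert up[0].sum() == 4 * coarse[0].sum()
```

    A smarter upscaler would have to decide whether those jags are a diagonal to smooth or a genuine staircase to keep, and with only 480 lines of noisy data it will sometimes guess wrong, as described above.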

    So, realistically: film can be scaled up to incredible limits, while videotape has an intrinsic limitation due to the minuscule amount of detail available.
  2. Member edDV (Northern California, USA)
    Originally Posted by 2Bdecided
    ...
    You've got me wondering now though - maybe the series that I always assumed were edited on film (because they look fine in the UK - better than they do in the USA, because our digital SD isn't usually so overcompressed, and of course our SD analogue was always better ) - maybe they were dual edited on NTSC and PAL video.

    I can certainly remember series shot on video (through the 1970s, 1980s, and 1990s) that looked soft and juddery when shown over here. I have a feeling though that at least some series shot on film have had the same problem. The standards conversion used at the time (1980s / 1990s) wasn't very good at all.

    Cheers,
    David.
    I've been talking from a "Hollywood" perspective and for major internationally syndicated US TV series and made for TV movies.

    Prior to the early 80's most PAL export distribution was done on film prints or film transfers to PAL 2" Quad tape or later to 1" Type B/C tape. Increasingly through the 80's NTSC video masters were standards converted. Usually the NTSC tape was ordered and standards converted in Europe. These were the bad years for PAL watchers unless the show was also natively edited for a PAL release.

    From the mid 70's and through the 80's most Hollywood post production went electronic, although most series were still shot on film.

    Local European TV production stayed on film (origination and editing) much later than Hollywood. Most of this was due to government ownership of the main networks and inflexible union rules. Electronic production (PAL native) didn't take hold until the mid-to-late 80's, beginning in independent production houses. The BBC was an interesting case. I was surprised how little major production was done on video. Major productions were cut on film. Almost all BBC video edit suites were doing local news/sports or instructional programming during the 80's. This explains why the BBC's major exports looked so good in the USA, since the edit masters existed on film. These can easily be remastered to HD.

    Back in Hollywood in the early 90's, it was laserdisc and anticipation of DVD and HD that caused them to get serious about edit release quality. There was a slow move away from D2* digital composite NTSC tape to D1 or 10 bit DigiBeta 4:2:2 digital component tape. As they made the conversion, the quality of PAL standards conversion also improved. It wasn't until the late 90's that full component 4:2:2 HDCAM edit suites upped the quality again. These suites could be dual standard.

    So for consumers of DVD and Blu-ray, this means quick access to movies and TV series that have film or HDCAM edit masters. The quick route for PAL Blu-ray is from film or transcoded HDCAM edit masters. If they have to go back to camera source film, a major remastering investment is needed to get to Blu-ray.


    * Star Trek TNG and early episodes of "Friends" would have been edited to composite NTSC D2 masters. "Friends" would have moved to component DigiBeta some time around 1995.
    Recommends: Kiva.org - Loans that change lives.
    http://www.kiva.org/about
  3. Member 2Bdecided (United Kingdom)
    Thank you edDV - very informative!

    Do/did you work in the industry over there?


    @mpiper,

    I think the output of Genuine Fractals looks quite artificial, as do all the other upscaling algorithms I'm aware of. You need to take more care with video - a lot of these algorithms are temporally unstable, and while they might look OK on a single frame, they look downright strange in motion.

    I'm not convinced there's much difference between film and video in the percentage that it can be successfully blown up. 4k is at or beyond the point of diminishing returns for most 35mm film IMO. Native 4k digital images are far far sharper, though good 4k video is rather rare for now.

    The amount to which video can be sharply upscaled depends on the amount of aliasing in the original. Aliasing is "bad", but super resolution techniques can take advantage of it to make genuinely sharper higher resolution versions. Whereas an original without aliasing is always going to upscale to a soft or artificial looking larger version of itself.
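    The point about aliasing carrying usable information can be shown in one dimension. Below is an idealized toy sketch (numpy; exact half-sample offsets and no noise, so it is a best case, not a real super-resolution algorithm): two aliased half-rate samplings of the same signal, taken with a sub-sample offset, are individually ambiguous but together reconstruct the full-rate signal exactly.

```python
import numpy as np

# A high-resolution 1-D "scene" with detail above the half-rate Nyquist limit.
n = 256
x = np.arange(n)
scene = np.sin(2 * np.pi * 70 * x / n)  # 70 cycles: aliases at 128 samples

# Two aliased low-res "frames": same scene, sampled with a half-pixel offset.
# Each frame alone misrepresents the detail, but the offset means they
# carry complementary samples.
frame_a = scene[0::2]
frame_b = scene[1::2]

# Multi-frame super-resolution in miniature: interleave the offset samples
# to rebuild the full-rate signal exactly.
rebuilt = np.empty(n)
rebuilt[0::2] = frame_a
rebuilt[1::2] = frame_b
assert np.allclose(rebuilt, scene)
```

    Real super-resolution must first estimate the sub-pixel offsets from the footage and cope with noise, which is why an anti-aliased (pre-filtered) original gives it nothing to work with: the high-frequency content was removed before sampling.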

    Cheers,
    David.
  4. Member edDV (Northern California, USA)
    Originally Posted by 2Bdecided
    Thank you edDV - very informative!

    Do/did you work in the industry over there?
    I've been in the broadcast video equipment business for many years with several companies.
  5. Why is analog scanning so much better than digital?

    One thing that popped into my mind. Would it be better to do a direct upconvert of a standard-definition signal to full-HD resolution and quality, or would it be better to somehow transfer it to analog film and proceed with optical scanning using the classic zooming methods (the likes of taking the 35mm and making it 4K in resolution)?
  6. Member edDV (Northern California, USA)
    Originally Posted by therock003
    Why is analog scanning so much better than digital?

    One thing that popped into my mind. Would it be better to do a direct upconvert of a standard-definition signal to full-HD resolution and quality, or would it be better to somehow transfer it to analog film and proceed with optical scanning using the classic zooming methods (the likes of taking the 35mm and making it 4K in resolution)?
    I'm not following? Analog scanning what?

    If you transfer 720x480 video to film, it won't scan with any more detail.
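    This point can be made concrete in a toy numpy sketch (ideal 4x linear interpolation standing in for the film transfer and rescan; real film adds grain and loss on top): sampling the enlargement back at the original positions returns the SD line unchanged, so the blow-up carried no detail that wasn't already there.

```python
import numpy as np

# A 1-D stand-in for one video line at SD resolution.
rng = np.random.default_rng(1)
sd_line = rng.random(480)

# "Transfer to film and rescan at 4x": modeled as ideal 4x linear interpolation.
hi = np.interp(np.arange(480 * 4) / 4.0, np.arange(480), sd_line)

# Reading the blow-up back at the original sample positions recovers the
# SD line exactly: enlargement added samples, not information.
assert hi.shape == (1920,)
assert np.allclose(hi[::4], sd_line)
```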
  7. Films dating back to before the talkies are very high resolution. Film itself is much, much higher resolution than video. There are now automated programs that can get rid of scratches and noise in films. There's a lot of manual labor involved too, which keeps the industry happy and employed!

    But film itself from the 1930's on up was ALWAYS better than today's high-definition TV, though MPEG-2 HDTV is already capable of 4K. There are variations in the quality of film.

    It is sort of like the problem you get with digital versus analog reel-to-reel tape in recording. Analog tape has some noise, but the resolution is about 4 times that of the Compact Disc. Audio from the 50's through the early 80's can be improved because of the analog sources. With digital, you are stuck with the sample rate and resolution of the source material, whereas analog, even when not properly calibrated, is still higher than CD's 44.1 kHz.
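    For reference, the CD-side numbers in that comparison follow from standard formulas (the "4 times" figure for analog tape is the poster's estimate, not computed here):

```python
import math

sample_rate = 44_100   # Hz, Red Book CD sampling rate
bit_depth = 16         # bits per sample

# Highest representable frequency (Nyquist) and theoretical dynamic range.
nyquist_hz = sample_rate / 2
dynamic_range_db = 20 * math.log10(2 ** bit_depth)

assert nyquist_hz == 22_050.0
assert round(dynamic_range_db, 1) == 96.3
```

    Whatever one thinks of analog tape's "resolution", these two numbers (22.05 kHz bandwidth, ~96 dB dynamic range) are the hard ceiling a CD source is stuck with, which is the poster's underlying point.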

    Film was designed to be blown up and displayed on a big screen, thus you needed higher resolution. In television, there was a time when editing was EASIER to do with film than with tape. In broadcast, the move to videotape was made for some programs partly because of production time - film needs to be developed and copied, and that can't be done for tonight's newscast! Film equipment was also very costly to manufacture, maintain and operate. It also takes a lot of space to store.


    But there were lots of syndicated shows and TV series that were shot on film and then converted to video. Sadly, some of the original films were lost or destroyed. In the 1980's everyone in the industry knew that Japan had high-def analog, and thus the films of TV shows were maintained and archived.

    You will see some Seinfeld episodes sourced from film, some from standard-def video, and some from high-def video.

    For both sound and video, the worst era in history is the 1990's, with lossy compression being used for standard-def video and the unrestorable MP3 file in audio.

    Film is and always was future-proof. And yes, companies like Kodak and Fujifilm made "High Definition" film, having more, smaller particles than standard film. And they did this in the 1970's!!

    I knew about this in the 1990's, and avoided the DVD altogether. Early DVDs were often just copies of Laser Discs or tapes, and the encoding quality just wasn't there.

    Basically, the answer is that film always was a high-resolution format, from the beginning. Sadly, in 1949, RCA demonstrated the first high-resolution TV set, which consisted of a rear-projection set with a camera set up to demonstrate their system. But it was never implemented because of the stupid, technology-killing FCC, which prevented analog TV in the US from being broadcast in high def. In 1984, the Summer Olympics were broadcast in France in HD, which was analog. In 1986, the Super Bowl had two cameras that recorded in 1280x720 on a Sony digital lossless format. Lossy video was first designed with MPEG-1 in 1988.

    But, looking back on history, the worst video was certainly the 1990's.
  8. Originally Posted by Neil Schubert
    Films dating back to before the talkies are very high resolution. Film itself is much much higher resolution than video. ...
    Analog: very high quality which degrades slowly over time.

    Digital: fixed quality and never degrades but the MOMENT it does you lose it ALL.
  9. Member netmask56 (Sydney, Australia)
    ...and then there was 3-strip Technicolor
  10. Member bendixG15 (United States)
    Happy New Year ... Let's resolve history's mysteries ... dig up more old stuff