Why is it that we can't greatly improve videos with artifacts and bad quality coming from VHS or analog sources, yet companies take old movies and turn them into full HD with great quality? What methods do they use that are not known to the public?
-
The quality of the old movies is actually quite good. What you see are just the limitations of DVD, and especially VHS.
-
Casablanca has recently been released on Blu-ray in full 1080p wonder. How did they do it?
They started with original 35mm prints. They scanned them frame by frame at 4K x 4K resolution. They then visited every frame by hand to paint out scratches and marks. Then they rebalanced and graded the footage. Finally they resized it down to 1080p resolution and encoded it for Blu-ray.
Believe me, if the studios started out with a clip from youtube, it wouldn't look any better than what you or I could do.
The studios have better quality source material - i.e. actual prints - not low resolution VHS tapes. They have high quality scanners that can scan a 35mm frame so it is over 4000 x 4000 pixels - not 720 x 576. They have teams of trained people hand-painting and repairing frame by frame - not one impatient person using a handful of VirtualDub filters. They have access to records, and to people who were there when these films were created. They understand film stock, and how certain techniques produce certain outcomes. And they have money. The restoration of Casablanca cost somewhere north of USD $1,000,000.
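For a flavor of what that frame-by-frame repair involves, here is a minimal sketch using OpenCV inpainting as a crude automated stand-in. To be clear: the studios do this by hand with specialist tools; the filename and the bright-speck mask heuristic below are my assumptions, purely for illustration.

```python
# Illustrative only: a crude automated stand-in for hand repair.
import cv2
import numpy as np

def repair_frame(frame_bgr):
    """Paint out bright, scratch-like defects in a scanned film frame."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    # Assumed heuristic: treat very bright specks as dust/scratches.
    _, mask = cv2.threshold(gray, 240, 255, cv2.THRESH_BINARY)
    mask = cv2.dilate(mask, np.ones((3, 3), np.uint8))
    # Fill the masked regions from the surrounding picture.
    return cv2.inpaint(frame_bgr, mask, 3, cv2.INPAINT_TELEA)

frame = cv2.imread("scan_4k_frame_000001.png")  # hypothetical filename
cv2.imwrite("repaired_000001.png", repair_frame(frame))
```

A real restoration repeats something like this (plus a human eye) over a few hundred thousand frames, which is where the money goes.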
There is no secret in how to do it properly.
-
That was a fine answer; you said exactly what I needed to hear. Even the YouTube statement answered something that puzzled me, because I was wondering if they could take a YouTube video and make it 1080p, but you answered even that.
Thanx man. -
What is quite interesting is that a film from the 40's, or a TV series from the 60's or 70's, being shot and edited on film, is available in very high quality and thus perfectly suitable for high def. Whereas late 80's/90's TV series, edited on standard-def video, are low-res crap quality, complete with colour wash and every other kind of video artefacting.
Kind of paradoxical, innit? -
raffie - Actually recording to video tape for TV shows was done by some in the 1970s. Some were filmed, some were videotaped. All In The Family, for example, was the first show that I know of to be videotaped. A lot of 30 minute sitcoms (situation comedies) from the 70s were videotaped. As guns1inger says, I think videotape became popular because it lowered costs.
-
Originally Posted by guns1inger
But yes, of course it reduced production costs.
Today video is shot digitally in high definition. Still pretty amazing there will be this 'gap' in production quality.
I'm not sure which ones off the top of my head, but many TV shows I recall from the eighties clearly showed film artefacting (A-Team, Star Trek TNG, ...) -
What's amazing is that in another 20 years, they'll be able to take the raw files and/or 35mm prints that remain in good condition and make these old films look even better on the next generation of video technology. There is still some quality to squeeze out of those 35mm frames.
-
Image capture on film does not automatically mean high quality.
If you examined a frame from a pristine negative of an unrestored Casablanca, I think most people would be surprised at the amount of grain and general softness of the image.
Shows from the 50's through the 90's shot on film are even worse. When you were shooting a half-hour show every week, you didn't have time to get much of anything perfect.
Videotape was first used in studio-based set pieces (sitcoms). Film was (and still is, to a lesser degree) more portable and less technologically intensive than videotape (cheaper). ENG is the exception here.
Most of the savings that producers were to realize from HD Video evaporated when increased costs for lighting were factored in.
I like to watch old movies on some of the throwaway HD channels on Dish. The studios obviously just transferred the best release print they could find. They really have the film feel - warts and all. -
The Twilight Zone was one of, if not the, first to employ primitive videotape for entertainment. Serling's amazing stories were suddenly confined to indoor plays, complete with rehearsal and camera blocking. Needless to say, the harsh, shallow B&W result was found wanting and quickly abandoned for his purposes...
-
One item everyone seems to be sliding past is that film does NOT have a resolution issue, while Videotape does. Yes, everyone seems to be recognizing the qualitative difference, but the fact is that film is not a set number of pixels in width and height.
Film can be scanned at any resolution the user desires. Just as a comparative example, 3D graphics designed to be shown as part of a Feature film are rendered at resolutions around 4000 X 3000(as opposed to DV video or DVD which is 720X480).
So, imagine scanning film at 4000x2400, and then running filters to remove grain (2D and 3D noise filtering). You end up with an image that can be scaled down to 1920x1080 (which also tends to blend out noise), resulting in a pristine image. The image can also be sharpened at the 4K resolution, providing better clarity when scaled down.
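As a loose illustration of that degrain-sharpen-downscale order (my own minimal sketch with OpenCV; the filter choices and strengths are assumptions, not anyone's actual pipeline):

```python
# Sketch: denoise and sharpen at scan resolution, then downscale to 1080p.
# Filter strengths are illustrative assumptions, not a studio recipe.
import cv2

scan = cv2.imread("film_scan_4000x2400.png")  # hypothetical 4K scan

# Spatial denoise stands in for the 2D/3D grain filtering described above.
clean = cv2.fastNlMeansDenoisingColored(scan, None, 4, 4, 7, 21)

# Unsharp mask at 4K, where any halos are tiny relative to the frame...
blur = cv2.GaussianBlur(clean, (0, 0), 2.0)
sharp = cv2.addWeighted(clean, 1.5, blur, -0.5, 0)

# ...then area-average down to 1080p, which blends out residual noise.
hd = cv2.resize(sharp, (1920, 1080), interpolation=cv2.INTER_AREA)
cv2.imwrite("restored_1080p.png", hd)
```

The point is simply that every destructive operation happens with resolution to spare, so the artifacts shrink along with the image.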
Videotape, on the other hand, has a resolution of 640(720) x 480. Now, pro grade has a higher resolution, since it IS analog, not digital, but the smallest resolvable details set the realistic limit at around 1000x700. Take something with that resolution, capture it at 1920x1080 (which is a resizing in the first place), and you get an image that is already at the limit of its resolution, and any cleaning/filtering will have to be done at that limit. So any sharpening must be done at 1920, resulting in harsher ringing/haloing than sharpening film. So you either don't sharpen, or do the best you can with what you have, knowing it will never be as clear as film.
Videotape shows DO look better at HD than SD, since the master videotape was better. But it will never match the sheer visible details of film. -
Originally Posted by mpiper
As a practical matter, film definitely has limits as to resolution. Aperture, shutter speed, lens, lighting, film stock, developing and printing all affect resolution.
I've read an article about scanning film with electrons (or some other particle / sub-particle) which revealed that normally exposed film emulsion unexpectedly contains 3D information. Not widely available or terribly practical for our purposes, but interesting.
The film scanning that I'm most familiar with is 2K (2048×1536 pixels) and 4K (4096×3072 pixels).
When you use the term "videotape", it seems you are referring only to analog videotape. Digibeta and other formats are recorded on videotape digitally.
We are supposed to compare resolution based on "pixels", yet it is a term that explicitly refers to digital images. Analog images (film or video) become pixels when they are digitized.
Lastly, videotaped shows do not necessarily look better at HD resolutions. It depends on the era, equipment used and tape condition, among other factors.
Can you clarify?
To zoobie: I didn't know that Rod Serling had experimented with videotape in the first Twilight Zone series - cool! -
Originally Posted by raffie
Variety shows (e.g. Carol Burnett, Rowan and Martin, Saturday Night Live, etc.) and some sitcoms were shot in multi-camera video. These shows are upscaled and image processed for HD release. Shows with strong international export demand were shot on film for better multi-format distribution.
HD video production began in the late 1990s but didn't become popular until the mid 2000's. Most TV series are still shot on film. 1080p/24 HD production began a few years ago. This allows multi format distribution same as shooting film.
The newest live production trucks for events are multiformat, allowing HD or SD production at 23.976, 24, 25, 29.97, 50 or 59.94 frames per second. -
Just to add my $0.02...
Analog videotape has resolution that is determined by:
A. Scanlines (525/480 and 625/576) for vertical, but that is the theoretical maximum; it is further limited by playback head bandwidth, interchange, processing electronics, and whether the connection to the A/D converter is component/Y-C/composite, so the general effective resolution is probably 60-80% of that (although old VHS is MUCH worse).
B. Bandwidth, for horizontal (usually ~4-6 MHz for broadcast, less for older consumer tape); this roughly corresponds to what we generally regard as 4/3rds of the vertical (hey, those numbers look familiar!), or ~640-768. It is further limited the same way as the vertical.
That's all for SD (there is analog HD, but it was mainly only used in Japan).
Digital, we all know the numbers for.
Film -DOES- have an effective resolution based on its grain size (which is determined by its ISO/speed and chemical technology), but it is possible to surpass that limit with "de-graining" (which can increase perceivable resolution, similar to the way dither can increase perceivable audio dynamic range). It turns out, though, that the grain is often appreciated, so it's often kept in (or added back).
Scott -
There's so much speculation in this thread that's completely divorced from reality!
Analogue video has a fixed number of active lines. No more (not 700, mpiper!), no less (not 60-80% of that, Cornucopia). The camera and electronics might not have delivered a clean image, but if they did, there's no inter-line interference further down the chain. The lines are separate (unless you convert standards), and that's that.
Originally Posted by edDV
Star Trek TNG is a well known difficult case, because it mixes 24p film with 60i effects. The PAL conversion isn't pretty!
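To see why that mix is awkward, here's a toy sketch (my illustration, not any converter's actual logic) of how 24p film rides in a 60i stream via 2:3 pulldown. The film cadence can be unpicked again (IVTC), but the native 60i effects shots have no 24p frames to recover, so a PAL conversion has to compromise somewhere.

```python
# Toy 2:3 pulldown: four film frames (A-D) become ten interlaced fields.
def pulldown_23(frames):
    fields, top = [], True
    for i, f in enumerate(frames):
        n = 2 if i % 2 == 0 else 3  # 2 fields, then 3, alternating
        for _ in range(n):
            fields.append(f + ("t" if top else "b"))
            top = not top
    return fields

print(pulldown_23(["A", "B", "C", "D"]))
# ['At', 'Ab', 'Bt', 'Bb', 'Bt', 'Cb', 'Ct', 'Db', 'Dt', 'Db']
```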
Cheers,
David. -
Originally Posted by 2Bdecided
Analog resolution is normally measured perceptually. A test pattern with alternating black and white wedges is shot by a camera (or optical scanner) and displayed on a high resolution monitor. The point where the wedges merge to gray determines the perceived resolution of the transmission system.
Analog video has discrete vertical sampling (480-486 lines NTSC, 576 lines PAL); however, that is not the resolution of the stored or transmitted video. Assuming the source camera/film transfer and processing equipment was capable of 576-line vertical resolution, interlaced video suffers vertical resolution loss as explained by Kell (the Kell factor). Actual display resolution is factored by 0.64 to 0.90 depending on the type of display, or on the method of deinterlacing and scaling for progressive display. This applies to both display and A/D capture to progressive frames. The only way to maintain full vertical resolution on analog capture is to maintain interlace.
http://en.wikipedia.org/wiki/Kell_factor
Horizontal resolution of analog video is a bandwidth issue. Assuming a good source camera can deliver 5 to 7 MHz of luminance bandwidth, the recording device and transmitter will limit luminance resolution. Typical maximum horizontal storage resolutions are:
VHS - 3MHz LPF ~240 perceived lines of horizontal resolution
1" Type C - ~5MHz ~ 400 perceived lines of horizontal resolution
Betacam SP ~5MHz ~ 400 perceived lines of horizontal resolution
An NTSC transmitter limits luminance bandwidth to 4.2 MHz (~330 lines of resolution).
A PAL transmitter limits luminance bandwidth to between 4.2 MHz (PAL-N/M) and 6 MHz (PAL-D).
High end PAL and NTSC studio cameras can have analog luminance bandwidths as high as 9 MHz (~700 perceived lines of horizontal resolution), but there was only one analog VTR capable of recording near that bandwidth, the 2" helical IVC-9000. Wide bandwidth SD studio cameras can be upscaled for HD recording and broadcast.
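All of those figures fall out of one rule of thumb: lines of horizontal resolution per picture height ≈ 2 × bandwidth × active line time ÷ aspect ratio. A quick sketch of the arithmetic (the formula is standard; the line-time constant is the usual NTSC value):

```python
# Horizontal TV lines (per picture height) from luminance bandwidth:
# TVL ~= 2 * bandwidth * active_line_time / aspect_ratio
def tvl(bandwidth_hz, active_line_us=52.7, aspect=4 / 3):
    return 2 * bandwidth_hz * active_line_us * 1e-6 / aspect

for name, bw in [("VHS", 3e6), ("NTSC broadcast", 4.2e6),
                 ("Betacam SP", 5e6), ("9 MHz studio camera", 9e6)]:
    print(f"{name:20s} ~{tvl(bw):.0f} lines")
# Prints ~237, ~332, ~395 and ~711 - matching the ~240/~330/~400/~700
# figures above. Vertical resolution is the active line count scaled by
# the Kell factor, e.g. 480 * 0.7 ~= 336 perceived lines.
```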
-
Originally Posted by 2Bdecided
Prior to the early 80's major export TV series were either edited off-line for film release or received the dual standard on-line treatment. The 80's to 90's saw the rise of the digital standards converter. Mid 90's producers moved to 16:9 D1 or DigiBeta transfers with the goal of a future DVD release and HD re-edit. Those with the budget did the separate NTSC and PAL SD wide digital edit master. They were planning for future digital SD and HD syndication. These series go decades in syndication. HD releases could be upscaled from SD wide DigiBeta or retransferred from the original film, then edited HD.
By the 2000's most were transferring and editing HDCAM edit masters.
As for "Friends", they probably went cheap for the PAL DVD release but I'll bet they have a plan for a full 24p Blu-Ray HD remaster from the original film. Once they have the HD 24p master, they can convert that to any world broadcast standard. They'll spend the money when more of the PAL market is digital and HD capable.Recommends: Kiva.org - Loans that change lives.
http://www.kiva.org/about -
Originally Posted by raffie
The HD resolution of today will be the VHS quality of the very near future.
All those "HD" resolutions will be insufficient crap for making future SuperDuperExtraHighDefinition releases, while those good old 35mm prints from the 50's/60's/70's will still be good.
That's the paradox of the shortsightedness and overconfidence so many people have in current digital video technologies.
Just looking at this videohelp website's history over a mere 10 years, you can tell how fast digital video (merely for the consumer market) has changed.
VCDhelp.com expanded to include SVCDs, then became dvdhelp.com, and with the advent of HDTV it was changed to videohelp.com, which has finally stuck for a while...
All in less than 10 years LOL -
Originally Posted by DereX888
Now we could have DirecTV transmit one channel at HDCAM quality, or >16 channels of MPEG-4 at ~6-9 Mb/s. We have spoken. We want more channels at less quality. -
Originally Posted by edDV
-
I keep trying to make the point that bit rate is more important than resolution for picture quality. You don't have to wait 20 years.
But you have to remember, we have tons of members here (and those that lurk for a day or two just to leech some how-tos) whose only goal is to cram as much video as possible onto the tiniest disc, be it 1 movie or 10 movies, or 2 hours to 10 hours, etc. This is the large group that still believes that DivX compressed to hell (tiny bitrate) is more than enough and (what's that?) "looks great on my 32" flat panel" or whatever. I don't know, but I don't think we've seen the last of that group anytime soon. I mean, the larger they make the discs (now at Blu-ray 25GB/50GB), the more they want to put on them; or is it, the lower they make the bitrate so it all fits? I don't know.
But what's probably even worse (if it hasn't already started) is going the HDD route instead of disc. You'll never get rid of 'em. So there'll always be confusion in terms of "perceived quality" every now and then as these comments float around during forum discussions.
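A quick bits-per-pixel comparison makes that point concrete (my own back-of-envelope figures, purely illustrative):

```python
# Rough bits-per-pixel: why a "crammed" encode looks worse than broadcast
# even at the same resolution. All figures are illustrative assumptions.
def bits_per_pixel(bitrate_bps, width, height, fps):
    return bitrate_bps / (width * height * fps)

examples = [
    ("DVD (MPEG-2)",            6_000_000,  720,  480, 29.97),
    ("DivX crammed onto 1 CD",    900_000,  640,  480, 29.97),
    ("Blu-ray (H.264/VC-1)",   25_000_000, 1920, 1080, 23.976),
]
for name, rate, w, h, fps in examples:
    print(f"{name:24s} {bits_per_pixel(rate, w, h, fps):.3f} bits/pixel")
# ~0.58, ~0.09 and ~0.50 respectively: the crammed disc has roughly a
# sixth of the bits per pixel of a plain DVD.
```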
. . .
1. I thought that Casablanca was B/W. Did they colorize it?
2. I wish they would start putting TV series on BR. I'm curious what bitrate strategy will be employed as the norm,
or will they use H.264 for TV series and compress them special or something? I don't know. But I wish they would hurry it up already.
. . .
Speaking of transfers, I've noticed that Gilligan's Island was film, at least the 3rd season DVD set. (I also have the 1st (B/W) season, and it includes the pilot episode; I wonder if those B/W episodes were shot on film too.) The set I recently purchased can be cleanly restored back to 24p, most episodes anyway. I thought they were interlaced for TV, but then again, I never really reviewed any recently. I saw a couple that were telecined (they exhibited the interlacing during software play, even one that had a 'brighter' color transfer), but I haven't had time to examine whether that was an EDITED episode that slipped in or just a bad transfer for that one. I'll have to find the time to review it and see for myself. As for the quality, they are actually very good: no noise, and the bitrate, or at least the encoding, was very good. I was watching them on my 19" LCD monitor, you know, the one with the bad gamma nonsense.
I have Get Smart season 2 also. I have to review those when I get the chance. I wonder if they were film, too. Hope they were and can be cleanly restored back to 24p. All these TV series could make a great low-resolution HTPC jukebox archive, with the right menu system of course.
-vhelp 5073 -
Originally Posted by jagabo
I agree broadcast is bit-rate starved for a given resolution, but common practice for many is to recode to a much lower bit rate and still expect "HD quality". -
Also, the MPEG encoder has to be up to the task: an exact-looking copy of the source video, but at the proposed lower bitrate. And there aren't very many MPEG encoders clever enough to meet this criterion. To my eyes, I can always see the flaws (MPEG errors; there are many different types) after re-encoding. I mean, there is no purpose in re-encoding if all you can do is use the same bitrate as the original, assuming the same specs: resolution, etc.
H.264 is where it comes in to change that. But most competitors live by the basics: the same script or profile. Everyone copies the others' specs or profiles, and no wonder the results are almost always far from the mark. But at least (thank goodness) there is the x264 CLI; it is the best thing out there and can be fine-tuned/tweaked beyond the competitors' top-$$$ encoders, and with superior results. No, I'm not a salesman, just speaking the truth.
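For a flavor of that kind of tweaking, here's a minimal sketch driving the x264 CLI from a script. The flag values are illustrative starting points to tune per source, not a magic profile - which is exactly the point being made above.

```python
# Sketch: scripting the x264 CLI. Flag values are illustrative starting
# points, meant to be tuned per source, not a one-size-fits-all profile.
import subprocess

def encode(src, dst, crf=18):
    subprocess.run([
        "x264",
        "--crf", str(crf),    # constant quality instead of a fixed bitrate
        "--ref", "4",         # more reference frames
        "--bframes", "3",
        "--subme", "7",       # slower, more thorough motion estimation
        "-o", dst, src,
    ], check=True)

encode("episode01.avs", "episode01.264")  # hypothetical filenames
```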
The only way you are going to hit the mark is to become knowledgeable in these encoders: MPEG and H.264.
Also, tooling. You have to develop the skills to 'tool' around with the encodes in an effort to perfect them, or achieve a near-likeness of the original source without copying the same bitrate, else your efforts are in vain. Don't go only by what you hear and read on these boards as you attempt to copy/transfer video through re-encoding; there is a good chance you are receiving user-perceived quality hype. Video sources vary from content to content, provider to provider, TV show/movie/series to show/movie/series, air time to air time, and so on and so forth, such that no one single template or profile or what-have-you will guarantee better results. At least this is what I have learnt over the years. And HD/HDTV is no different, only a bigger picture and more starved bitrates, etc.
-vhelp 5074 -
Casablanca is Black and White.
They are releasing TV series to BD. Band of Brothers looks fantastic at 1080, but you only get two episodes to a disc. I haven't looked at any other TV series, although I have seen them on the shelves (24/Prison Break/Supernatural are all available). I don't know how much they are squeezing on to them.
-
Originally Posted by vhelp
2. Most will use H.264 or VC-1 to put a full SD season on one 50GB disc. To do this with space efficiency they would re-edit to 24p. If they use the telecined version, H.264 interlace support needs further development, or VC-1 needs more interlaced bit-rate efficiency. HD series will need those 4-layer discs to get cost competitive, but most of those have an easier film-to-24p video path (less film restoration).
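A rough capacity check shows why that works (episode count and runtime are my assumed figures for a typical US season):

```python
# Back-of-envelope: average bitrate available for a full SD season on one
# BD-50, ignoring audio and overhead. Episode figures are assumptions.
DISC_BITS = 50e9 * 8          # 50 GB dual-layer Blu-ray, in bits
episodes, minutes = 24, 42    # assumed: typical US one-hour drama season
seconds = episodes * minutes * 60

print(f"~{DISC_BITS / seconds / 1e6:.1f} Mb/s average")
# ~6.6 Mb/s - comfortable for SD H.264, tight for anything HD.
```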
Originally Posted by vhelp
Originally Posted by vhelp
Top end Quad Tape player from those days. Ampex VR-2000
-
edDV - I know all this stuff!
I didn't mention horizontal resolution at all, but vertically, you do have 576 lines in the digital sense - in the optical (traditional) sense, yes, less. No argument.
Originally Posted by edDV
The loss of resolution due to interlacing (which is debatable - it's more the requirement to pre-filter to prevent inter-line twitter on an interlaced system) is an extra factor on top.
Yes, I know, you can find 100 books and 10,000 websites that say Kell factor = resolution loss due to interlacing.
Oh look - amazingly, Wikipedia is correct: "Kell factor is sometimes incorrectly stated to exist to account for the effects of interlacing".
Cheers,
David. -
Originally Posted by edDV
...major export TV series were either edited off-line for film release or received the dual standard on-line treatment. The 80's to 90's saw the rise of the digital standards converter. Mid 90's producers moved to 16:9 D1 or DigiBeta transfers with the goal of a future DVD release and HD re-edit. Those with the budget did the separate NTSC and PAL SD wide digital edit master. They were planning for future digital SD and HD syndication. These series go decades in syndication. HD releases could be upscaled from SD wide DigiBeta or retransferred from the original film, then edited HD.
By the 2000's most were transferring and editing HDCAM edit masters.
As for "Friends", they probably went cheap for the PAL DVD release but I'll bet they have a plan for a full 24p Blu-Ray HD remaster from the original film. Once they have the HD 24p master, they can convert that to any world broadcast standard. They'll spend the money when more of the PAL market is digital and HD capable.
It's obvious watching in the UK that they took the cheap route - despite it being a very successful series. Digital TV or not (and millions have been watching digital TV in the UK since 1999 - with the majority watching digitally since the early 2000s), it looks really crappy in PAL. It's not even a good conversion.
You've got me wondering now though - maybe the series that I always assumed were edited on film (because they look fine in the UK - better than they do in the USA, because our digital SD isn't usually so overcompressed, and of course our SD analogue was always better) - maybe they were dual edited on NTSC and PAL video.
I can certainly remember series shot on video (through the 1970s, 1980s, and 1990s) that looked soft and juddery when shown over here. I have a feeling though that at least some series shot on film have had the same problem. The standards conversion used at the time (1980s / 1990s) wasn't very good at all.
Cheers,
David. -
Originally Posted by 2Bdecided
Many computer display cards don't low-pass filter interlaced (S-Video and analog component) outputs, and the result is interfield flicker in picture areas with low motion.