VideoHelp Forum
Thread
  1. Why is it that we can't greatly improve videos with artifacts and bad quality coming from VHS or analog sources, while companies take old movies and turn them into full HD with great quality? What methods do they use that are not known to the public?
  2. Member craigarta's Avatar
    Join Date
    Jun 2001
    Location
    Cascade Mountains
    The quality of the old movies is actually quite good. What you see are the limitations of DVD, and especially of VHS.
  3. Always Watching guns1inger's Avatar
    Join Date
    Apr 2004
    Location
    Miskatonic U
    Casablanca has recently been released on Blu-ray in full 1080p wonder. How did they do it?

    They started with original 35mm prints. They scanned them frame by frame at 4k x 4k resolution. They then visited every frame by hand to paint out scratches and marks. Then they rebalanced and graded the footage. Finally, they resized it down to 1080p resolution and encoded it for Blu-ray.

    Believe me, if the studios started out with a clip from youtube, it wouldn't look any better than what you or I could do.

    The studios have better quality source material - i.e. actual prints - not low resolution VHS tapes. They have high quality scanners that can scan a 35mm frame so it is over 4000 x 4000 pixels - not 720 x 576. They have teams of trained people hand-painting and repairing frame by frame - not one impatient person using a handful of VirtualDub filters. They have access to records, and to people who were there when these films were created. They understand film stock, and how certain techniques produce certain outcomes. And they have money. The restoration of Casablanca cost somewhere north of USD $1,000,000.

    There is no secret in how to do it properly.
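None of these steps is magic, and the pipeline can even be caricatured in a few lines of code. The sketch below is only a toy numpy illustration of the scan -> retouch -> downscale idea: the `despeckle` threshold filter is a crude stand-in for the hand retouching the studios actually do, and the synthetic 8x8 "frame" stands in for a 4K scan.

```python
import numpy as np

def despeckle(frame, threshold=60):
    """Replace pixels that differ sharply from their 3x3 neighborhood
    mean with that mean. A crude stand-in for hand retouching of
    dust and scratches."""
    h, w = frame.shape
    padded = np.pad(frame, 1, mode="edge")
    acc = np.zeros((h, w), dtype=np.float64)
    for dy in range(3):          # sum the 3x3 neighborhood via shifts
        for dx in range(3):
            acc += padded[dy:dy + h, dx:dx + w]
    mean = acc / 9.0
    out = frame.astype(np.float64).copy()
    spikes = np.abs(out - mean) > threshold
    out[spikes] = mean[spikes]   # pull outliers back to the local mean
    return out

def downscale_2x(frame):
    """Box-filter 2x downscale: averaging blocks of high-resolution
    samples is part of why a 4K scan reduced to 1080p looks so clean."""
    h, w = frame.shape
    h, w = h // 2 * 2, w // 2 * 2
    return frame[:h, :w].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# A flat synthetic "scan" with one bright dust speck on it.
frame = np.full((8, 8), 100.0)
frame[3, 3] = 255.0
clean = despeckle(frame)      # speck pulled back toward its neighborhood
small = downscale_2x(clean)   # 8x8 -> 4x4
```

Real restoration obviously involves far more (wet-gate scanning, grain management, grading), but the structure - clean up at high resolution, then downscale - is exactly what the post describes.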
    Read my blog here.
  4. That was a fine answer; you said exactly what I needed to hear. Even the YouTube statement answered something that puzzled me, because I was wondering whether they could take a YouTube video and make it 1080p, but you answered even that.

    Thanx man.
  5. What is quite interesting is that a film from the 40's or a TV series from the 60's or 70's, being shot and edited on film, is available in very high quality and thus perfectly suitable for High Def. Whereas late 80's/90's TV-series were edited on standard def video, being low-res crap quality complete with colour-wash, and every and all other video artefacting.

    Kind of paradoxical, innit?
  6. Always Watching guns1inger's Avatar
    Join Date
    Apr 2004
    Location
    Miskatonic U
    Originally Posted by raffie
    What is quite interesting is that a film from the 40's or a TV series from the 60's or 70's, being shot and edited on film, is available in very high quality and thus perfectly suitable for High Def. Whereas late 80's/90's TV-series were edited on standard def video, being low-res crap quality complete with colour-wash, and every and all other video artefacting.

    Kind off paradoxical innit?
    The cost of mass production.
  7. Banned
    Join Date
    Oct 2004
    Location
    Freedonia
    raffie - Actually recording to video tape for TV shows was done by some in the 1970s. Some were filmed, some were videotaped. All In The Family, for example, was the first show that I know of to be videotaped. A lot of 30 minute sitcoms (situation comedies) from the 70s were videotaped. As guns1inger says, I think videotape became popular because it lowered costs.
  8. Originally Posted by guns1inger
    Originally Posted by raffie
    What is quite interesting is that a film from the 40's or a TV series from the 60's or 70's, being shot and edited on film, is available in very high quality and thus perfectly suitable for High Def. Whereas late 80's/90's TV-series were edited on standard def video, being low-res crap quality complete with colour-wash, and every and all other video artefacting.

    Kind off paradoxical innit?
    The cost of mass production.
    Eh, I'm not sure what you mean by this? I am talking about TV shows filmed/edited/mastered on video versus film. Mass production of what?

    But yes, of course it reduced production costs.

    Today video is shot digitally in high definition. Still pretty amazing there will be this 'gap' in production quality.
    I'm not sure which ones by heart, but many TV shows I recall from the eighties clearly showed film artefacting (A-Team, Star Trek TNG, ...)
  9. What's amazing is that in another 20 years they'll be able to take the raw files and/or 35mm prints that remain in good condition and make these old films look even better on the next generation of video technology. There is still some quality to squeeze out of those 35mm frames.
  10. Member
    Join Date
    Jan 2006
    Location
    Northern Pacific SW
    Image capture on film does not automatically mean high quality.

    If you examined a frame from a pristine negative of an unrestored Casablanca, I think most people would be surprised at the amount of grain and general softness of the image.

    50's thru 90's shows shot on film are even worse. When you're shooting a half-hour show every week, you don't have time to get much of anything perfect.

    Videotape was first used in studio-based set pieces (sitcoms). Film was (and still is, to a lesser degree) more portable and less technologically intensive than videotape (cheaper). ENG is the exception here.

    Most of the savings that producers were to realize from HD Video evaporated when increased costs for lighting were factored in.

    I like to watch old movies on some of the throwaway HD channels on Dish. The studios obviously just transferred the best release print they could find. They really have the film feel - warts and all.
  11. Member zoobie's Avatar
    Join Date
    Feb 2005
    Location
    Florida
    The Twilight Zone was one of, if not the, first to employ primitive videotape for entertainment. Serling's amazing stories were suddenly confined to indoor plays, complete with rehearsal and camera blocking. Needless to say, the harsh, shallow, B&W result was found wanting and quickly abandoned for his purposes...
  12. One item everyone seems to be sliding past is that film does NOT have a resolution issue, while Videotape does. Yes, everyone seems to be recognizing the qualitative difference, but the fact is that film is not a set number of pixels in width and height.

    Film can be scanned at any resolution the user desires. Just as a comparative example, 3D graphics designed to be shown as part of a Feature film are rendered at resolutions around 4000 X 3000(as opposed to DV video or DVD which is 720X480).

    So, imagine scanning film at 4000X2400, and then running filters to remove grain (2D and 3D noisefiltering). You end up with an image that can be scaled down to 1920X1080 (which also tends to blend out noise), resulting in a pristine image. This can also be sharpened at the 4K resolution, providing better clarity when scaled down.

    Videotape, on the other hand, has a resolution of 640(720) x 480. Now, pro grade has a higher resolution, since it IS analog, not digital, but the smallest resolvable details set the realistic limit to around 1000x700. Take something with that resolution, capture it at 1920x1080 (which is a resizing in the first place), and you get an image that is already at the limit of its resolution, and any cleaning/filtering will have to be done at that limit. So any sharpening must be done at 1920, resulting in harsher ringing/haloing than sharpening film. So you either don't sharpen, or do the best you can with what you have, knowing it will never be as clear as film.

    Videotape shows DO look better at HD than SD, since the master videotape was better. But it will never match the sheer visible details of film.
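The "scaling down tends to blend out noise" point above is easy to demonstrate: averaging each 2x2 block of independent noisy samples halves the noise standard deviation (it drops roughly with the square root of the number of samples averaged). A small numpy check using synthetic Gaussian "grain":

```python
import numpy as np

def box_downscale(img, factor):
    """Average factor x factor blocks (a simple box-filter downscale)."""
    h, w = img.shape
    h, w = h // factor * factor, w // factor * factor
    blocks = img[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3))

rng = np.random.default_rng(0)
flat = np.full((512, 512), 128.0)
grainy = flat + rng.normal(0.0, 10.0, flat.shape)  # sigma = 10 "grain"

noise_before = grainy.std()                   # close to 10
noise_after = box_downscale(grainy, 2).std()  # close to 5: four samples averaged per output pixel
```

This is only the statistical half of the story; a real 4K-to-1080p workflow would also denoise before scaling, but the square-root noise reduction from the resample itself comes for free.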
  13. Member
    Join Date
    Jan 2006
    Location
    Northern Pacific SW
    Originally Posted by mpiper
    One item everyone seems to be sliding past is that film does NOT have a resolution issue, while Videotape does. Yes, everyone seems to be recognizing the qualitative difference, but the fact is that film is not a set number of pixels in width and height.

    Film can be scanned at any resolution the user desires. Just as a comparative example, 3D graphics designed to be shown as part of a Feature film are rendered at resolutions around 4000 X 3000(as opposed to DV video or DVD which is 720X480).

    So, imagine scanning film at 4000X2400, and then running filters to remove grain (2D and 3D noisefiltering). You end up with an image that can be scaled down to 1920X1080 (which also tends to blend out noise), resulting in a pristine image. This can also be sharpened at the 4K resolution, providing better clarity when scaled down.

    Videotape, on the other hand, has a resolution of 640(720) X 480. Now pro grade has a higher resolution, since it IS analog, not digital, but the smallest details resolvable set the realistic limit to around 1000X700. Take something with that resolution, capture it at 1920X1080 (which is a resizing in the first place), and you get an image that is already at the limit of resolution and any cleaning/filtering will have to be done at that limit. So any sharpening must be done at 1920, resulting in harsher ringing/haloing than sharpening film. So you either don't sharpen, or do the best you can with what you have; knowing it will never be as clear as film.

    Videotape shows DO look better at HD than SD, since the master videotape was better. But it will never match the sheer visible details of film.
    I found this post confusing.

    As a practical matter, film definitely has limits as to resolution. Aperture, shutter speed, lens, lighting, film stock, developing and printing all affect resolution.

    I've read an article about scanning film with electrons (or some other particle / sub-particle) which revealed that normally exposed film emulsion unexpectedly contains 3D information. Not widely available or terribly practical for our purposes, but interesting.

    The film scanning that I'm most familiar with is 2K (2048×1536 pixels) and 4K (4096×3072 pixels).

    When you use the term "videotape", it seems you are referring only to analog videotape. Digibeta and other formats are recorded on videotape digitally.

    We are supposed to compare resolution based on "pixels", yet it is a term that explicitly refers to digital images. Analog images (film or video) become pixels when they are digitized.

    Lastly, videotaped shows do not necessarily look better at HD resolutions. It depends on the era, equipment used and tape condition, among other factors.

    Can you clarify?

    To zoobie: I didn't know that Rod Serling had experimented with videotape in the first Twilight Zone series - cool!
  14. Member edDV's Avatar
    Join Date
    Mar 2004
    Location
    Northern California, USA
    Originally Posted by raffie
    Originally Posted by guns1inger
    Originally Posted by raffie
    What is quite interesting is that a film from the 40's or a TV series from the 60's or 70's, being shot and edited on film, is available in very high quality and thus perfectly suitable for High Def. Whereas late 80's/90's TV-series were edited on standard def video, being low-res crap quality complete with colour-wash, and every and all other video artefacting.

    Kind off paradoxical innit?
    The cost of mass production.
    Eh, I'm not sure what u mean by this? I am talking about TV-shows filming/editing/mastering on video versus film. Mass production of what?

    But yes, offcourse it reduced costs of the production.

    Today video is shot digitally in high definition. Still pretty amazing there will be this 'gap' in production quality.
    I'm not sure of wich ones by heart but many TV-shows I recall from the eighties clearly showed film artefacting (A-Team, Star Trek TNG, ...)
    The vast majority of TV series were shot on film, transferred and edited twice to video for SD NTSC and SD PAL markets. In most cases effects were created during video editing (e.g. Star Trek TNG). For HD release, the original film must be scanned then effects and editing redone at HD resolution.

    Variety shows (e.g. Carol Burnett, Rowan and Martin, Saturday Night Live, etc.) and some sitcoms were shot in multi-camera video. These shows are upscaled and image processed for HD release. Shows with strong international export demand were shot on film for better multi-format distribution.

    HD video production began in the late 1990s but didn't become popular until the mid 2000's. Most TV series are still shot on film. 1080p/24 HD production began a few years ago. This allows multi format distribution same as shooting film.

    The newest live production trucks for events are multiformat, allowing HD or SD production at 23.976, 24, 25, 29.97, 50 or 59.94 frames per second.
    Recommends: Kiva.org - Loans that change lives.
    http://www.kiva.org/about
  15. Member Cornucopia's Avatar
    Join Date
    Oct 2001
    Location
    Deep in the Heart of Texas
    Just to add my $0.02...

    Analog videotape has resolution that is determined by:
    A. Scanlines (525/480 and 625/576) for vertical - but that is the theoretical maximum; it is further limited by playback head bandwidth, interchange, processing electronics, and whether the connection to the A/D converter is component/Y/C/composite, so the general effective resolution is probably 60-80% of that (although old VHS is MUCH worse).
    B. Bandwidth, for horizontal (usually ~4-6MHz for broadcast, less for older consumer tape) - this roughly corresponds to what we generally regard as 4/3rds of the vertical (hey, those numbers look familiar!), or ~640-768; it is further limited in the same ways as the vertical.

    That's all for SD (there is analog HD, but it was mainly only used in Japan).

    Digital, we all know the numbers for.

    Film -DOES- have an effective resolution based on its grain size (which is determined by its ISO/speed and chemical technology), but it is possible to surpass that limit with "de-graining" (it can increase perceivable resolution similar to the way dither can increase perceivable audio dynamic range). It turns out, though, that the "grain" is often appreciated, so it's often kept in (or added back).

    Scott
  16. Member 2Bdecided's Avatar
    Join Date
    Nov 2007
    Location
    United Kingdom
    There's so much speculation in this thread that's completely divorced from reality!

    Analogue video has a fixed number of active lines. No more (not 700 mpiper!), no less (not 60-80% of that Cornucopia). The camera and electronics might not have delivered a clean image, but if it did, there's no inter-line interference further down the chain. The lines are separate (unless you convert standards), and that's that.


    Originally Posted by edDV
    The vast majority of TV series were shot on film, transferred and edited twice to video for SD NTSC and SD PAL markets.
    No no no no! Shows that were edited as NTSC video are run through a standards converter to convert NTSC video to PAL video. There's (almost) no double editing. The PAL versions of Star Trek TNG (and even "Friends") are visibly softer than "proper" PAL shows, and visibly converted from NTSC video.

    Star Trek TNG is a well known difficult case, because it mixes 24p film with 60i effects. The PAL conversion isn't pretty!

    Cheers,
    David.
  17. Member edDV's Avatar
    Join Date
    Mar 2004
    Location
    Northern California, USA
    Originally Posted by 2Bdecided
    There's so much speculation in this thread that's completely divorced from reality!

    Analogue video has a fixed number of active lines. No more (not 700 mpiper!), no less (not 60-80% of that Cornucopia). The camera and electronics might not have delivered a clean image, but if it did, there's no inter-line interference further down the chain. The lines are separate (unless you convert standards), and that's that.
    The thread is unfocused but if you want to dig in on analog resolution let's go. It has been discussed in detail in these threads several times.

    Analog resolution is normally measured perceptually, using perception psychology. A test pattern with alternating black and white wedges is shot by a camera (or optical scanner) and displayed on a high resolution monitor. The point where the wedges merge to gray determines the perceived resolution of the transmission system.



    Analog video has discrete vertical sampling (480-486 lines NTSC, 576 lines PAL), however that is not the resolution of the stored or transmitted video. Assuming the source camera/film transfer and processing equipment was capable of 576-line vertical resolution, interlaced video suffers vertical resolution loss as explained by Kell (the Kell factor). Actual display resolution is factored by 0.64 to 0.90 depending on the type of display or the method of deinterlace and scaling for progressive display. This applies to both display and A/D capture to progressive frames. The only way to maintain full vertical resolution on analog capture is to maintain interlace.
    http://en.wikipedia.org/wiki/Kell_factor

    Horizontal resolution of analog video is a bandwidth issue. Assuming a good source camera can deliver 5 to 7 MHz of luminance bandwidth, the recording device and transmitter will limit luminance resolution. Typical maximum horizontal storage resolutions are:

    VHS - 3MHz LPF ~240 perceived lines of horizontal resolution
    1" Type C - ~5MHz ~ 400 perceived lines of horizontal resolution
    Betacam SP ~5MHz ~ 400 perceived lines of horizontal resolution

    An NTSC transmitter limits luminance bandwidth to 4.2 MHz (~330 lines of resolution).
    A PAL transmitter limits luminance bandwidth to between 4.2 MHz (PAL-N/M) and 6 MHz (PAL-D).

    High end PAL and NTSC studio cameras can have analog luminance bandwidths as high as 9MHz (~700 perceived lines of horizontal resolution) but there was only one analog VTR capable of recording near that bandwidth, the 2" helical IVC-9000. Wide bandwidth SD studio cameras can be upscaled for HD recording and broadcast.
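The "lines of resolution" figures above follow from a simple calculation: the highest video frequency fits a certain number of cycles into the active line time (~52.66 µs for NTSC here - the exact constant varies slightly between sources), each cycle resolves one light/dark line pair, and the result is conventionally normalized per picture height by dividing by the 4:3 aspect ratio. A quick sanity check in Python, landing near the ~240/~330/~400 figures quoted above:

```python
def tvl_per_picture_height(bandwidth_hz, active_line_us=52.66, aspect=4 / 3):
    """TV lines (per picture height) resolvable at a given luminance
    bandwidth: 2 lines per cycle over the active line time, normalized
    by the display aspect ratio."""
    cycles = bandwidth_hz * active_line_us * 1e-6
    return 2 * cycles / aspect

def effective_vertical_lines(active_lines, kell=0.7):
    """Perceived vertical resolution after applying a Kell factor
    (roughly 0.64-0.90 depending on display and deinterlacing)."""
    return active_lines * kell

print(round(tvl_per_picture_height(3.0e6)))  # VHS-class ~3 MHz   -> 237
print(round(tvl_per_picture_height(4.2e6)))  # NTSC broadcast     -> 332
print(round(tvl_per_picture_height(5.0e6)))  # Betacam SP / 1" C  -> 395
print(round(effective_vertical_lines(480)))  # NTSC vertical      -> 336
```

The 0.7 Kell value is just a mid-range pick from the 0.64-0.90 span mentioned above, not a measured figure.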
  18. Member edDV's Avatar
    Join Date
    Mar 2004
    Location
    Northern California, USA
    Originally Posted by 2Bdecided
    Originally Posted by edDV
    The vast majority of TV series were shot on film, transferred and edited twice to video for SD NTSC and SD PAL markets.
    No no no no! Shows that were edited as NTSC video are run through a standards converter to convert NTSC video to PAL video. There's (almost) no double editing. The PAL versions of Star Trek TNG (and even "Friends") are visibly softer than "proper" PAL shows, and visibly converted from NTSC video.

    Star Trek TNG is a well known difficult case, because it mixes 24p film with 60i effects. The PAL conversion isn't pretty!

    Cheers,
    David.
    We are mixing decades and workflows but yes, Star Trek TNG didn't have the budget for a separate PAL edit so it was standards converted.

    Prior to the early 80's major export TV series were either edited off-line for film release or received the dual standard on-line treatment. The 80's to 90's saw the rise of the digital standards converter. Mid 90's producers moved to 16:9 D1 or DigiBeta transfers with the goal of a future DVD release and HD re-edit. Those with the budget did the separate NTSC and PAL SD wide digital edit master. They were planning for future digital SD and HD syndication. These series go decades in syndication. HD releases could be upscaled from SD wide DigiBeta or retransferred from the original film, then edited HD.

    By the 2000's most were transferring and editing HDCAM edit masters.

    As for "Friends", they probably went cheap for the PAL DVD release but I'll bet they have a plan for a full 24p Blu-Ray HD remaster from the original film. Once they have the HD 24p master, they can convert that to any world broadcast standard. They'll spend the money when more of the PAL market is digital and HD capable.
  19. Banned
    Join Date
    Aug 2002
    Location
    beautiful
    Originally Posted by raffie
    Today video is shot digitally in high definition. Still pretty amazing there will be this 'gap' in production quality.
    and in 20 years people will look at them the same way we look at videotaped shows from the 80's and 90's.
    The HD resolution of today will be the VHS quality of the very near future.
    All those "HD" resolutions will be insufficient crap for making future SuperDuperExtraHighDefinition releases, while those good old 35mm prints from the 50's/60's/70's will still be good.

    That's a paradox of the shortsightedness and overconfidence so many people have in current digital video technologies.

    Just looking at this videohelp website's history over a mere 10 years, you can tell how fast digital video (merely for the consumer market) has changed.
    VCDhelp.com expanded to include SVCDs, then became dvdhelp.com, and with the advent of HDTV it was changed to videohelp.com, which has finally stuck for a while...
    All in less than 10 years LOL
  20. Member edDV's Avatar
    Join Date
    Mar 2004
    Location
    Northern California, USA
    Originally Posted by DereX888
    Originally Posted by raffie
    Today video is shot digitally in high definition. Still pretty amazing there will be this 'gap' in production quality.
    and in 20 years people will look at them same as we look at videotaped shows from 80's and 90's.
    The HD resolution of today will be VHS quality of very soon future.
    I keep trying to make the point that bit rate is more important than resolution for picture quality. You don't have to wait 20 years; just view CSI or another program from a native 144Mb/s HDCAM playback on a good monitor vs. the ~16Mb/s MPEG-2 as broadcast. Blu-ray is somewhat better at ~25Mb/s (35Mb/s max). Digital Cinema 2kx1k HDCAM-SR records 10-bit RGB at 440 or 880 Mb/s.

    Now we could have DirecTV transfer one channel at HDCAM quality, or >16 channels of MPEG-4 @ ~6-9Mb/s. We have spoken. We want more channels at less quality.
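One way to make the bit-rate point concrete is bits per pixel: divide each format's bit rate by the number of pixels delivered per second. The figures below treat every format as a full 1920x1080 raster at 29.97 frames/s purely for comparability (an approximation - HDCAM, for instance, actually records a subsampled 1440x1080 raster):

```python
def bits_per_pixel(bitrate_mbit, width=1920, height=1080, fps=29.97):
    """Average coded bits available per delivered pixel."""
    return bitrate_mbit * 1e6 / (width * height * fps)

hdcam = bits_per_pixel(144)     # ~2.32 bits/pixel
bluray = bits_per_pixel(25)     # ~0.40 bits/pixel
broadcast = bits_per_pixel(16)  # ~0.26 bits/pixel
# Broadcast HD describes each pixel with roughly a ninth of the
# information the HDCAM master carried -- same resolution on paper,
# very different picture.
```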
  21. Originally Posted by edDV
    I keep trying to make the point that bit rate is more important than resolution for picture quality.
    Neither is more important. They need to be balanced. But yes, broadcast, cable, and satellite delivery is usually bitrate starved.
  22. Always Watching guns1inger's Avatar
    Join Date
    Apr 2004
    Location
    Miskatonic U
    I want less channels and more quality.

    However you need more channels just to cater for all the CSI variants
  23. Member vhelp's Avatar
    Join Date
    Mar 2001
    Location
    New York
    I keep trying to make the point that bit rate is more important than resolution for picture quality. You don't have to wait 20 years,
    I've been saying the same thing for many years on this forum board.

    But, you have to remember, we have tons of members here (and those that lurk for a day or two just for a leech or a how-to) whose only goal is to cram as much video as possible onto the tiniest disc...be it 1 movie or 10 movies, or 2 hours to 10 hours, etc. This is the large group that still believes that DivX compressed to hell (tiny bitrate) is more than enough and (what's that?) looks great on my 32" flat panel or whatever. I don't know. But I don't think we've seen the last of that group anytime soon. I mean, the larger they make the discs (now at Blu-ray 25GB/50GB), the more they want to put on them - or is it, the lower the bitrate they use to make things fit. I don't know.

    But what's probably even worse (if it hasn't already started) is going the HDD route instead of disc. You'll never get rid of 'em. So there'll always be confusion in terms of "perceived quality" every now and then as these comments float around during forum discussions.

    . . .

    1. I thought that Casablanca was B/W. Did they colorize it?
    2. I wish they would start putting TV series on BR. I'm curious what bitrate strategy will be employed as the norm,
    or whether they will use H.264 for TV series and compress them specially or something. I don't know. But I wish they would hurry it up already.

    . . .

    Speaking of transfers, I've noticed that Gilligan's Island was film - 3rd season DVD set. (I also have the 1st (B/W) season, and it includes the pilot episode...wonder if those B/W's were shot on film.) The DVD set I recently purchased can be cleanly restored back to 24p...most episodes. I thought they were interlaced for TV, but then again, I never really reviewed any recently. I saw a couple that were telecined (exhibited the interlacing during software play, even one that had a 'brighter' color transfer), but I hadn't time to examine whether it was an EDITED episode that slipped in or just a bad transfer for that one. I'll have to find the time to review it and see for myself. As for the quality, they are actually very good. No noise, and the bitrate - at least the encoding - was very good. I was watching them on my 19" LCD monitor...you know, the one with the bad gamma nonsense.

    I have Get Smart season 2 also. I have to review those when I get the chance. I wonder if they were film, too. Hope they were and can be cleanly restored back to 24p. All these TV series could make a great low-htpc-resolution jukebox archive - with the right menu system, of course.

    -vhelp 5073
  24. Member edDV's Avatar
    Join Date
    Mar 2004
    Location
    Northern California, USA
    Originally Posted by jagabo
    Originally Posted by edDV
    I keep trying to make the point that bit rate is more important than resolution for picture quality.
    Neither is more important. They need to be balanced. But yes, broadcast, cable, and satellite delivery is usually bitrate starved.
    Broadcast resolution is more than adequate. This was done by design. They decided to keep 6MHz channels with bit-starved 1920x1080 and let codecs improve over time.

    I agree broadcast is bit rate starved for a given resolution but common practice for many is to recode it to much lower bit rate and still expect "HD quality".
  25. Member vhelp's Avatar
    Join Date
    Mar 2001
    Location
    New York
    Also, the MPEG encoder has to be a match, in terms of producing an exact copy of the source video but at the proposed lower bitrate. And there aren't very many clever MPEG encoders that can meet this criterion. To my eyes, I can always see the flaws (MPEG errors: there are many different types) after re-encoding. I mean, there is no purpose in re-encoding if all you can do is use the same bitrate as the original, assuming the same specs: resolution, etc.

    Why is it that we cant greatly improve videos with artifacts and bad quality coming from vhs or analog sources, and companies take old movies and turn them to full-hd with great quality? What methods do they use that are not know to public?
    This is where the encoder comes in. That is the key element to your experience, in terms of lack of quality.

    H.264 is where it comes in to change that. But most competitors live by the same basics: the same script or profile. Everyone copies the other's specs or profile, and no wonder the results are almost always far from the mark. But at least (thank goodness) there is the x264 CLI; it is the best thing out there and can be fine-tuned/tweaked beyond the competitors' top-$$$ encoders, and with superior results. No, I'm not a salesman, just speaking the truth.

    The only way you are going to hit the mark is to become knowledgeable about these encoders: MPEG and H.264.

    Also, tooling. You have to develop the skills to 'tool' around with the encodes in an effort to perfect them, or achieve a near likeness of the original source, but without copying the same bitrate, else your efforts are in vain. Don't go by what you hear and read on these boards as you attempt to (copy) transfer video through re-encoding. There is a greater chance that you are receiving user-perceived quality hype. Video sources vary from content to content, provider to provider, TV show/movie/series to show/movie/series, air time to air time, and so on and so forth, such that no single template or profile or what-have-you will guarantee better results. At least this is what I have learnt over the years. And HD / HDTV is no different. Only a bigger picture and more starved bitrates, etc.

    -vhelp 5074
  26. Always Watching guns1inger's Avatar
    Join Date
    Apr 2004
    Location
    Miskatonic U
    Casablanca is Black and White.

    They are releasing TV series to BD. Band of Brothers looks fantastic at 1080, but you only get two episodes to a disc. I haven't looked at any other TV series, although I have seen them on the shelves (24/Prison Break/Supernatural are all available). I don't know how much they are squeezing on to them.
  27. Member edDV's Avatar
    Join Date
    Mar 2004
    Location
    Northern California, USA
    Originally Posted by vhelp

    1. I thought that Casablanca was B/W. Did they colorize it ?
    2. I wish they would start putting tv series on BR. I'm curious what bitrate strategy will be employeed as the norm.
    or will they use tv series for h264 and compress them special or something. I don't know. But I wish they would hurry it up already.
    1. Casablanca was B/W and 1.37:1 aspect and only 102 min. That means they can devote 33% more bit rate to luminance vs. 4:2:0 and push the upper 35Mb/s limits for Blu-Ray. That makes great quality.

    2. Most will use H.264 or VC-1 to put a full SD season on one 50GB disc. To do this with space efficiency they would re-edit to 24p. If they use the telecined version, H.264 interlace needs further development, or VC-1 needs more interlaced bit-rate efficiency. HD series will need those 4-layer discs to get cost competitive, but most of those have an easier film-to-24p video path (less film restoration).
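The "restore back to 24p" step discussed here and earlier in the thread is inverse telecine. A toy model of the 3:2 pulldown cadence (ignoring real-world complications like field order, cadence breaks at edit points, and matching actual field content) shows why it is reversible when the material is pure film:

```python
def telecine_32(film_frames):
    """3:2 pulldown: spread film frames A B C D over 10 fields,
    paired into 5 interlaced frames: AA BB BC CD DD. The BC and CD
    frames mix fields from adjacent film frames -- the combing you
    see during software playback of a telecined transfer."""
    fields = []
    for frame, count in zip(film_frames, [2, 3, 2, 3]):
        fields.extend([frame] * count)
    # pair consecutive fields into interlaced frames
    return [(fields[i], fields[i + 1]) for i in range(0, len(fields), 2)]

def inverse_telecine(video_frames):
    """Recover the original film frames by keeping each distinct
    field once and discarding the duplicates (works here because the
    toy frames are uniquely labeled; real IVTC compares field data)."""
    seen, film = set(), []
    for top, bottom in video_frames:
        for f in (top, bottom):
            if f not in seen:
                seen.add(f)
                film.append(f)
    return film

video = telecine_32(["A", "B", "C", "D"])
# video == [("A","A"), ("B","B"), ("B","C"), ("C","D"), ("D","D")]
# inverse_telecine(video) == ["A", "B", "C", "D"]
```

When video-rate edits or effects are cut into the telecined stream (the Star Trek TNG problem mentioned earlier), the cadence breaks and this clean reversal no longer exists.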

    Originally Posted by vhelp
    Speaking of transfers, I've noticed that Gilligan's Island was Film, 3rd season dvd set. (I also have the 1st (B/W) season., and it includes the Pilot episode..wonder if these b/w's were Film shot) The dvd set I recently purchased can be clean-ly restored back to 24p..most episodes. I thought they were interlaced for tv, but then again, I never really reviewed any recently. I saw a couple that were telecine (exhibited the interlace during software play, even one that had 'brighter' color transfer) but I hadn't time to examine it to see if it was an EDITED episode that slipped in or just a bad transfer for that one. I'll have to find the time to review it and see for myself. As for the quality, they are actaully very good. No noise and the bitrate, at least the encoding was very good. I was watching them on my 19" LCD monitor..you know, the one with the bad gamma nonsense.
    These were all shot and edited on film in those days but telecine transferred to Quad video tape after editing. They can get HD if they go back to the original film. Otherwise they will use the old Quad tape masters from 1964-67.
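    The reason film-shot episodes can be "cleanly restored back to 24p" is that telecine applies a strictly regular 3:2 pulldown cadence, which inverse telecine (IVTC) can undo. A toy sketch of the cadence (frame labels are illustrative, not any real tool's API):

    ```python
    # Sketch of the 3:2 pulldown that telecine applies (film 24p -> NTSC 60i),
    # illustrating why the original 24p frames remain recoverable.
    film = ["A", "B", "C", "D"]            # four consecutive film frames

    # Pulldown repeats fields in a 2-3 cadence: A gets 2 fields, B gets 3, ...
    cadence = [2, 3, 2, 3]
    fields = []
    for frame, count in zip(film, cadence):
        fields.extend([frame] * count)     # emit that many fields of the frame

    # Interlaced video then pairs consecutive fields into frames:
    video_frames = [(fields[i], fields[i + 1]) for i in range(0, len(fields), 2)]
    print(video_frames)
    # -> [('A','A'), ('B','B'), ('B','C'), ('C','D'), ('D','D')]
    ```

    The ('B','C') and ('C','D') frames are the ones that "exhibit the interlace during software play"; IVTC detects the cadence, drops the repeated fields, and re-pairs matching ones to get A B C D back at 24p.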


    Originally Posted by vhelp
    I have Getsmart season 2 also. I have to review those when I get the chance. I wonder if they were Film, too. Hope they were and can be clearly restored back to 24p. All these tv series could make greate low-htpc-resolution juke box archive--with the right menu system of course.

    -vhelp 5073
    Get Smart would be ~1966 so probably cut on film. There may be some extras from TV variety/talk shows. Those would be 486i off Quad tape.

    Top end Quad Tape player from those days. Ampex VR-2000
    Recommends: Kiva.org - Loans that change lives.
    http://www.kiva.org/about
    Quote Quote  
  28. Member 2Bdecided's Avatar
    Join Date
    Nov 2007
    Location
    United Kingdom
    Search Comp PM
    edDV - I know all this stuff! I didn't mention horizontal resolution at all, but vertically, you do have 576 lines in the digital sense - in the optical (traditional) sense, yes, less. No argument.

    Originally Posted by edDV
    ...interlaced video suffers vertical resolution loss as explained by Kell (Kell factor)...
    You know that's a common misconception. The Kell factor applies to all video captured with finite sensors - it's down to the sinc frequency roll off which stops you approaching the Nyquist limit cleanly.

    The loss of resolution due to interlacing (which is debatable - it's more the requirement to pre-filter to prevent inter-line twitter on an interlaced system) is an extra factor on top.

    Yes, I know, you can find 100 books and 10,000 websites that say Kell factor = resolution loss due to interlacing.

    Oh look - amazingly, Wikipedia is correct! "Kell factor is sometimes incorrectly stated to exist to account for the effects of interlacing".
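    The sinc roll-off David describes can be checked numerically. A minimal sketch, assuming a sampling aperture that integrates over a full sample pitch (a 100%-fill idealization, not any particular sensor):

    ```python
    # Numeric look at the aperture sinc roll-off: detail approaching the
    # Nyquist limit is attenuated regardless of whether the scan is interlaced.
    import math

    def aperture_mtf(f_rel):
        """MTF of a 100%-fill sampling aperture; f_rel = frequency / sampling rate."""
        x = math.pi * f_rel
        return 1.0 if x == 0 else math.sin(x) / x

    for f in (0.1, 0.25, 0.5):          # 0.5 = the Nyquist limit
        print(f"f/fs = {f}: response = {aperture_mtf(f):.3f}")
    # At Nyquist the response is 2/pi ~= 0.64
    ```

    That ~0.64 at Nyquist is close to commonly quoted Kell factors, which is why the two get conflated even though interlace pre-filtering is a separate loss on top.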

    Cheers,
    David.
    Quote Quote  
  29. Member 2Bdecided's Avatar
    Join Date
    Nov 2007
    Location
    United Kingdom
    Search Comp PM
    Originally Posted by edDV
    We are mixing decades and workflows but yes, Star Trek TNG didn't have the budget for a separate PAL edit so it was standards converted.

    Prior to the early 80's...
    Ah, you're going back almost before I was born! Still, I've seen some of this stuff repeated in the UK...

    ...major export TV series were either edited off-line for film release or received the dual-standard on-line treatment. The '80s to '90s saw the rise of the digital standards converter. By the mid '90s producers moved to 16:9 D1 or DigiBeta transfers with the goal of a future DVD release and HD re-edit. Those with the budget did separate NTSC and PAL SD widescreen digital edit masters. They were planning for future digital SD and HD syndication; these series run for decades in syndication. HD releases could be upscaled from SD widescreen DigiBeta or retransferred from the original film, then edited in HD.

    By the 2000's most were transferring and editing HDCAM edit masters.

    As for "Friends", they probably went cheap for the PAL DVD release but I'll bet they have a plan for a full 24p Blu-Ray HD remaster from the original film. Once they have the HD 24p master, they can convert that to any world broadcast standard. They'll spend the money when more of the PAL market is digital and HD capable.
    I'm sure they could do better with Friends, since it was shot on film. It was edited on video though, and they archive the video, EDL, and film.

    It's obvious watching in the UK that they took the cheap route - despite it being a very successful series. Digital TV or not (and millions have been watching digital TV in the UK since 1999, with the majority watching digitally since the early 2000s) it looks really crappy in PAL. It's not even a good conversion.


    You've got me wondering now though - maybe the series that I always assumed were edited on film (because they look fine in the UK - better than they do in the USA, because our digital SD isn't usually so overcompressed, and of course our SD analogue was always better) - maybe they were dual edited on NTSC and PAL video.

    I can certainly remember series shot on video (through the 1970s, 1980s, and 1990s) that looked soft and juddery when shown over here. I have a feeling though that at least some series shot on film have had the same problem. The standards conversion used at the time (1980s / 1990s) wasn't very good at all.

    Cheers,
    David.
    Quote Quote  
  30. Member edDV's Avatar
    Join Date
    Mar 2004
    Location
    Northern California, USA
    Search Comp PM
    Originally Posted by 2Bdecided
    edDV - I know all this stuff! I didn't mention horizontal resolution at all, but vertically, you do have 576 lines in the digital sense - in the optical (traditional) sense, yes, less. No argument.

    Originally Posted by edDV
    ...interlaced video suffers vertical resolution loss as explained by Kell (Kell factor)...
    You know that's a common misconception. The Kell factor applies to all video captured with finite sensors - it's down to the sinc frequency roll off which stops you approaching the Nyquist limit cleanly.

    The loss of resolution due to interlacing (which is debatable - it's more the requirement to pre-filter to prevent inter-line twitter on an interlaced system) is an extra factor on top.

    Yes, I know, you can find 100 books and 10,000 websites that say Kell factor = resolution loss due to interlacing.

    Oh look - amazingly, Wikipedia is correct! "Kell factor is sometimes incorrectly stated to exist to account for the effects of interlacing".

    Cheers,
    David.
    What you said was directly from my reference and I know all that too. The "interlace" loss occurs from bandpass filtering (before the transmitter matrix) prior to transmission or in device NTSC/PAL encoders. For progressive digital displays it comes down to input deinterlace motion detection limits and the vertical scaling algorithm.

    Many computer display cards don't low pass filter interlace (S-Video and analog component) outputs and the result is interfield flicker in picture areas with low motion.
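    The low-pass that those cards skip can be sketched as a simple vertical kernel applied to the frame before splitting it into fields. A toy example - the [1, 2, 1]/4 kernel is a common flicker-filter choice, not any specific card's implementation:

    ```python
    # Minimal sketch of a vertical flicker filter for interlaced output:
    # blend each line with its neighbours so one-line-high detail no longer
    # lives entirely in one field and blinks at field rate.
    def flicker_filter(lines):
        """Apply a [1, 2, 1]/4 vertical kernel to a list of line intensities."""
        out = []
        for i, v in enumerate(lines):
            above = lines[i - 1] if i > 0 else v      # clamp at frame edges
            below = lines[i + 1] if i < len(lines) - 1 else v
            out.append((above + 2 * v + below) / 4)
        return out

    # A one-line-high white detail: the classic interline-twitter case.
    frame = [0, 0, 255, 0, 0]
    print(flicker_filter(frame))   # -> [0.0, 63.75, 127.5, 63.75, 0.0]
    ```

    After filtering, the detail spans lines in both fields at reduced amplitude, trading a little vertical sharpness for a stable picture - exactly the trade the unfiltered S-Video outputs fail to make.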
    Recommends: Kiva.org - Loans that change lives.
    http://www.kiva.org/about
    Quote Quote  


