VideoHelp Forum
  1. Member (Joined Feb 2013, United States)
    How can TV shows shot on film be a higher resolution than those shot on videotape?

    If the show was shot on film, wouldn't it end up the same resolution, since it had to be telecined to videotape in order to be broadcast anyway?
  2. Member (Joined Sep 2012, Australia)
    Two possibilities.

    1: The actual original film would have a MUCH higher resolution than SD video. If you can recover that, you can make a Blu-ray from it.

    2: Film is 24 fps. Video is either 60 or 50 fields per second. When converted to video, a single film frame has to occupy at least two fields, which means AN ENTIRE frame exists in the stream. Video, however, is constructed from fields, each comprising only the even or odd lines of any particular instant in time. HALF of the vertical resolution of any single instant is lost in a video stream, and no deinterlacing scheme can fully recover that, whereas a full frame can be recovered from a telecined stream.
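
    To make that cadence concrete, here is a toy Python sketch of standard NTSC 2:3 pulldown (the frame letters and top/bottom labels are purely illustrative, not any tool's actual output):

    Code:
    # Toy 2:3 ("3:2") pulldown: 4 film frames -> 10 video fields (24 fps -> 60 fields/s).
    def pulldown_23(frames):
        fields = []
        for i, frame in enumerate(frames):
            repeats = 2 if i % 2 == 0 else 3   # frames alternately span 2 and 3 fields
            for _ in range(repeats):
                parity = "t" if len(fields) % 2 == 0 else "b"  # alternate top/bottom
                fields.append(frame + parity)
        return fields

    print(pulldown_23(["A", "B", "C", "D"]))
    # ['At', 'Ab', 'Bt', 'Bb', 'Bt', 'Cb', 'Ct', 'Db', 'Dt', 'Db']
    # Every film frame survives as at least one matching top/bottom pair, which
    # is why a full progressive frame can be pulled back out of a telecined stream.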
  3. Member awgie (Joined Sep 2008, Lanarkshire, Scotland)
    The word "resolution" is a misnomer. Since film is a photographic medium, it does not have a "resolution", in the sense of pixels. It may have different densities of the emulsion used in the film, and may have several different layers of emulsions for greater colour clarity. But it only has resolution (pixels) once it has been converted into a digital signal and stored on some form of electronic media, such as videotape.
    Do or do not. There is no "try." - Yoda
  4. Shooting TV shows on film was very common in the early days of television. They were often shot on 35mm movie film because Hollywood had the infrastructure to shoot, develop, and edit this film. When the show was ready to air, the film was put onto a film chain which did the conversion to video. In the early days this was done in real time because videotape had not yet been invented.

    Film of this size can capture far more detail than NTSC or PAL video. However, the temporal "resolution" of film is far less (24 events per second rather than 50 or 60). So, you don't get the feel of video.

    Now that we have ubiquitous HD video, these old films can be scanned at HD or 4K resolution and, when processed with the proper software, can look like they were shot with modern equipment. The recent broadcast of two "Dick Van Dyke Show" episodes from the early 1960s shows what can be done. These were also colorized (which I did not like at all), but if you saw them, you will appreciate how sharp and crisp they looked compared to what they looked like before. I saw many of them when they were originally broadcast, and it is quite a thrill to see all the new detail, and to see all the old film artifacts (dirt and grain) removed.

    "Seinfeld," "Charlies Angels" and many other TV shows from the more recent past have also been scanned in HD or 4K, resulting in far more detail than anything we saw back when they originally aired. In addition, many of these shows were shot using equipment that was closer to current-day widescreen. The edges of the film were cropped when broadcast in 4:3 aspect ratio. When scanned for HD, the film is scanned all the way to the edge. A little cropping is then applied to the top and bottom, and the result is video that has the current-day 16:9 aspect ratio. For those of use who saw these shows in the old square format, it is a little disconcerting to see them in widescreen.

    https://www.youtube.com/watch?v=PFIrsitJW5M
    Last edited by johnmeyer; 5th Jan 2017 at 02:26. Reason: added video link
  5. Member (Joined Aug 2006, United States)
    I have read explanations here at VideoHelp which said something like...

    Video stored on tape in analog form contains a discrete number of lines carrying picture information, so that translates to its vertical resolution. ...and although it doesn't have a true horizontal resolution, there is a point at which additional sampling in the horizontal dimension for conversion to digital format fails to capture more picture detail. That determines the horizontal resolution, but additional sampling may be done in order to meet a standard.

    For film, there is a point at which additional sampling in both the horizontal dimension and the vertical dimension for conversion to digital format fails to capture more detail. Those limits determine the horizontal resolution and vertical resolution, but additional sampling may be done in order to meet a standard.
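
    A back-of-the-envelope Python version of that "no more detail" point for NTSC, using textbook-style figures rather than measurements:

    Code:
    # Sampling an analog scan line faster than twice its highest frequency (the
    # Nyquist rate) adds samples but captures no additional picture detail.
    f_max = 4.2e6       # approximate top luma frequency of NTSC, in Hz
    active = 52.7e-6    # approximate active (visible) time of one scan line, in s

    min_samples = 2 * f_max * active   # Nyquist-rate samples per active line
    print(round(min_samples))          # ~443: all the detail the signal carries
    print(round(13.5e6 * active))      # ~711: ITU-R BT.601's 13.5 MHz sampling
    # BT.601 standardized 720 samples per line - "additional sampling done in
    # order to meet a standard", as described above.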

    The frames in the type of film used for theatrical movies and TV are large enough and fine-grained enough to allow capturing digital picture information at a higher resolution than is possible for videotape. Also (as I think was already mentioned), film is captured in progressive format, while the video on analog tapes is interlaced, so it is captured in interlaced format.
    Last edited by usually_quiet; 5th Jan 2017 at 02:32.
    Ignore list: hello_hello, tried, TechLord, Snoopy329
  6. Member (Joined Feb 2013, United States)
    But what about the idea that if they were shot on film and then telecined to videotape, that tape would somehow be higher resolution than if it was just shot on videotape? Is that just because the 24 frames per second from the film would result in a number of fields matching up for a true frame when broadcast as the telecined 60 fields per second (I know they also create 12 fake fields somehow in addition to the 48 half frames from the film), as opposed to the 60 fields being 1/60th of a second apart and never matching up into a true frame when it was shot on video?
    Last edited by 90sTV; 5th Jan 2017 at 04:34.
  7. Member (Joined Sep 2012, Australia)
    Depends on what you call resolution.

    If the mechanisms in the video camera are inferior to the components in the telecine device then the filmed video will look better, but that has nothing to do with resolution.
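
    A toy Python comparison of the field-pairing idea from the question above (the labels are hypothetical):

    Code:
    # In a telecined stream, neighbouring fields often come from the same film
    # frame, so whole progressive frames can be rebuilt (inverse telecine). In
    # true 60i video, every field is a distinct instant and never pairs up.
    telecined = ['At', 'Ab', 'Bt', 'Bb', 'Bt', 'Cb', 'Ct', 'Db', 'Dt', 'Db']
    true_video = [f"F{i}{'t' if i % 2 == 0 else 'b'}" for i in range(10)]

    def matchable_pairs(fields):
        # Count adjacent fields that share a source frame (label minus parity).
        return sum(a[:-1] == b[:-1] for a, b in zip(fields, fields[1:]))

    print(matchable_pairs(telecined))   # 6 -> full frames are recoverable
    print(matchable_pairs(true_video))  # 0 -> only deinterlacing guesses remain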
  8. Originally Posted by awgie
    The word "resolution" is a misnomer. Since film is a photographic medium, it does not have a "resolution", in the sense of pixels. It may have different densities of the emulsion used in the film, and may have several different layers of emulsions for greater colour clarity. But it only has resolution (pixels) once it has been converted into a digital signal and stored on some form of electronic media, such as videotape.
    Film and other optical devices (like telescopes, microscopes, etc.) have a resolution -- the limit of the film's (or other devices') ability to resolve detail. And that term was in use many years before computer graphics were invented. It's in the digital realm where resolution is misused to mean the frame size of a graphic image. Which is related to resolution, but it's not truly resolution.
  9. Member awgie (Joined Sep 2008, Lanarkshire, Scotland)
    Originally Posted by jagabo
    Originally Posted by awgie
    The word "resolution" is a misnomer. Since film is a photographic medium, it does not have a "resolution", in the sense of pixels. It may have different densities of the emulsion used in the film, and may have several different layers of emulsions for greater colour clarity. But it only has resolution (pixels) once it has been converted into a digital signal and stored on some form of electronic media, such as videotape.
    Film and other optical devices (like telescopes, microscopes, etc.) have a resolution -- the limit of the film's (or other devices') ability to resolve detail. And that term was in use many years before computer graphics were invented. It's in the digital realm where resolution is misused to mean the frame size of a graphic image. Which is related to resolution, but it's not truly resolution.
    That's what I was trying to say. Maybe I would have been clearer to simply say that both digital and film photography use the term "resolution", but the word has different meanings in each.
    Do or do not. There is no "try." - Yoda
  10. Member (Joined Sep 2012, Australia)
    I think we've already debated this once before; it's just the word "resolution" that's throwing everything off.

    I believe the conclusion went something like:

    Film cameras have been in constant development in the cinema world for over a century, and the techniques and equipment for their use have been far and away more advanced than those of their video brethren. Therefore, for the most part, filmed sequences are visually superior to video-recorded sequences, and digital video camera development has only recently begun to catch up...

    Or something to that effect...
  11. Member (Joined Aug 2010, San Francisco, California)
    Originally Posted by johnmeyer
    Shooting TV shows on film was very common in the early days of television. They were often shot on 35mm movie film because Hollywood had the infrastructure to shoot, develop, and edit this film.
    More often shot on 16 mm to save money and because they knew television would not benefit from the larger film.
  12. Originally Posted by 90sTV
    But what about the idea that if they were shot on film and then telecined to videotape, that tape would somehow be higher resolution than if it was just shot on videotape? Is that just because the 24 frames per second from the film would result in a number of fields matching up for a true frame when broadcast as the telecined 60 fields per second (I know they also create 12 fake fields somehow in addition to the 48 half frames from the film), as opposed to the 60 fields being 1/60th of a second apart and never matching up into a true frame when it was shot on video?
    To add to what ndjamena already said in response to this: it all depends on the camera. For instance, my first video camera used a Saticon tube, and even when watching a live feed from that camera on my Sony TV set (thus removing the VHS tape recording from the equation), it looked pretty awful compared to watching live TV, like a football game (thus also removing videotape from the mix).

    So, that broadcast camera was capable of producing far more detail than my simple consumer camera. However, even when a broadcast video camera was the source, it was not capable of producing a higher resolution than the 525 lines of the NTSC broadcast standard, because the mechanism of the standard -- 525 scan lines, and a 4.5 MHz analog bandwidth for the luma and chroma -- constrained how much detail could be resolved. Resolving detail, while not the only attribute that leads to a sharper, better picture, is certainly the main spec.
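
    To put rough numbers on that constraint (standard reference values, worked out in Python):

    Code:
    # Horizontal detail an analog NTSC channel can carry, from bandwidth alone.
    bandwidth = 4.2e6       # usable luma bandwidth in Hz (of the ~4.5 MHz total)
    active_line = 52.7e-6   # active picture time per scan line, in seconds

    cycles = bandwidth * active_line   # finest light/dark cycles per line
    lines = 2 * cycles                 # alternating lines resolvable across the width
    tvl = lines / (4 / 3)              # "TV lines per picture height" on a 4:3 screen
    print(round(lines), round(tvl))    # ~443 lines across, ~332 TVL
    # No camera feeding this channel can deliver more, however good its tube.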

    Perhaps some of the confusion is that in today's world, especially the past few years, many of the broadcast cameras used for sports events and TV shows have the ability to shoot in 4K (or beyond), and at higher frame rates, even though the result is only sent to your home as 1080i or 720p (I think 1080p is available via cable & satellite, but I don't know if it is sent OTA). You see this all the time in sporting events where they can zoom in on something in replay, and it doesn't seem to get reduced in resolution. They can also slow down the playback because the original was recorded at far higher than 30 frames per second.

    However, back when all we had was analog SD TV, none of that was the case.

    The simplest way to think about this is to imagine a good old-fashioned TV test card (we saw a lot of these on the air back in the 1950s).
    [image: TV test card]
    If you shot this with a film camera, you would be able to resolve far more lines as the lines in the chart get closer together (towards the center of the "bullseye") than you could resolve if you shot the same chart with a broadcast NTSC 525-line analog TV camera.
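
    A one-dimensional toy of that thought experiment in Python, with a simple box blur standing in for the channel's limited bandwidth:

    Code:
    import numpy as np

    # Fine alternating lines survive a high-resolution capture, but merge toward
    # uniform grey when pushed through a band-limited "NTSC-like" channel.
    chart = np.tile([0.0, 1.0], 50)                    # 100 samples of fine lines
    ntsc_like = np.convolve(chart, np.ones(4) / 4, mode="same")

    print(chart.std())                        # 0.5  -> full black/white contrast
    print(round(float(ntsc_like.std()), 2))   # ~0.04 -> the lines are mostly gone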
  13. Originally Posted by 90sTV
    But what about the idea that if they were shot on film and then telecined to videotape, that tape would somehow be higher resolution than if it was just shot on videotape?
    This appears to be what you're actually asking, instead of the more general way you presented it in the thread title. The answer depends on the date the material was shot, or more precisely what gear it was shot with. Prior to approx 1999, well-made film, properly telecined (or these days, digitized perfectly) would usually look better than TV material produced directly on analog, standard-def tape. Partly this was due to the more mature skillset and hardware available for film, partly due to inherent limitations of standard-def video cameras and tape.

    Early black-and-white videotape had a distinct, absolutely horrible look to it. Resolution was dismal, everything was soft, and highlights were always blown out in a weird reverse-negative tone. The perfect demo example is "The Twilight Zone" series of the early 1960s. Of the five seasons, four were shot on film, one was shot directly to videotape. To this day, the filmed seasons rank among the most stunning examples of picture quality ever done for television: the episodes look almost 3D, with a pristine, jaw-dropping tonal depth and sharpness. The season shot directly on videotape looks dismal, like a 1950s soap opera: no visual pizzazz whatsoever. So in this SPECIFIC example, a telecine tape made from the film episodes looks dramatically better than the episodes shot directly to tape.

    Moving forward to the 1970s and 1980s, the dichotomy largely held true. That era of TV shows shot on film can be restored and digitally scanned to at least 1080 HDTV level - check the commercial disc releases of things like Ghost Story-Circle Of Fear, Star Trek, Charlie's Angels, Bewitched, Bob Newhart, The Odd Couple, etc. The transfers are crisp and detailed, with good tonal depth. OTOH, legendary shows that were shot direct to video, like All In The Family or Soap, look atrocious today (even restored): total mush, blasted with klieg lights, no realistic tonal variation at all. In time, the analog broadcast cameras and VTRs got better, and by the late 1980s direct-tape productions were much improved. But telecined film shows generally still had the edge, until digital and HDTV studio gear became widely available.

    Today I'd say the two formats run neck-and-neck in most cases. Resolution-wise, scanned film and direct HDTV video productions appear comparable unless the HDTV production exploits some of its specific advantages with certain material (or the film project is hampered by its limitations with certain material). Going forward, film is essentially on life support anyway and will soon become a non-factor: aside from a handful of influential directors with the clout to resist, the Hollywood studios want film dead and out of their hair. Vintage films will be preserved as sources for new digital format transfers, but new film productions will likely end by 2020 or so.
    Last edited by orsetto; 5th Jan 2017 at 18:49.
  14. Originally Posted by JVRaines
    Originally Posted by johnmeyer
    Shooting TV shows on film was very common in the early days of television. They were often shot on 35mm movie film because Hollywood had the infrastructure to shoot, develop, and edit this film.
    More often shot on 16 mm to save money and because they knew television would not benefit from the larger film.
    I couldn't find a list of film formats used for each old TV show. "I Love Lucy" was definitely shot in 35mm, as were many other shows that followed Desilu's lead in how to shoot a live TV show for later broadcast.

    Most articles I just scanned seemed to suggest that 35mm was most often used, for the reasons I already listed. 16mm was used as a backup and for distribution during syndication (so what you actually saw when your local affiliate re-aired these shows in strip format might very well have been from a 16mm print).

    This list, while not authoritative, and not covering older shows, lists only two shows filmed in 16mm out of twenty-six shows listed:

    Recent Television Shows Shot on Film

    Twenty-four out of twenty-six were shot in 35mm.

    It is interesting to note that some of these shows were shot well after HD video acquisition was widely available. Film still has attributes (guaranteed longevity, for one) that make it attractive. Also, while some always argue (and have done so in this thread) that 24p is a horrible cadence, it most definitely imparts a "once-removed" feel (read Marshall McLuhan's works) that is artistically very desirable for many projects.
  15. Originally Posted by JVRaines
    More often shot on 16 mm to save money and because they knew television would not benefit from the larger film.
    No. Usually 35mm. Once television was being recorded on video (All In The Family era) 16mm was sometimes used for inserts or backgrounds for titles. But most TV programs originating on film were 35mm.
  16. I know of some old BBC series that were shot on 16mm film -- usually a mix of film and video. Doctor Who, Survivors, and Dad's Army, for example.
  17. Originally Posted by orsetto
    Early black-and-white videotape had a distinct, absolutely horrible look to it. Resolution was dismal, everything was soft, and highlights were always blown out in a weird reverse-negative tone. The perfect demo example is "The Twilight Zone" series of the early 1960s. Of the five seasons, four were shot on film, one was shot directly to videotape. To this day, the filmed seasons rank among the most stunning examples of picture quality ever done for television: the episodes look almost 3D, with a pristine, jaw-dropping tonal depth and sharpness. The season shot directly on videotape looks dismal, like a 1950s soap opera: no visual pizzazz whatsoever. So in this SPECIFIC example, a telecine tape made from the film episodes looks dramatically better than the episodes shot directly to tape.
    I've done a lot of work with Kinescopes and also, on rare occasions, material from 1960s videotape.

    What you describe sounds more like a Kinescope. I tried to find information on whether any of the original Quadruplex videotapes for the six TZ episodes that were shot on videotape have actually survived. I could not find an answer, but my own experience in doing restorations from this era is that they may not have survived, because networks were notorious for re-using the gigantic two-inch-wide reels of tape: not only was the tape itself expensive, but storage space cost a lot, and kept having to expand as more and more shows were produced.

    Having been fortunate enough to do restoration work with a first-generation dub from a 1968 Quadruplex tape, I can tell you that those "super-hot" highlights are actually NOT normal for early videotape. Broadcast engineers were extraordinarily skilled and well-trained back then, and obsessed about everything on their vectorscopes. Hot levels simply were not allowed.

    I've also done quite a bit of work on restoring Kinescopes and figuring out how to make them look again like the original video, and one of the first steps in a multi-part process is to create a rather "strong" gamma curve that tries to eliminate that blown-out look. This is then followed by steps to remove tell-tale gate weave, dirt, film grain and other artifacts that make it obvious that film was involved. The final step is the one that everyone who hasn't actually done this work thinks of first, namely the 24p --> 60i conversion using motion estimation. This is actually the easiest step in the process.
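
    For anyone curious what that first gamma step might look like numerically, here is a toy numpy sketch (the 0.6 figure is my own stand-in, not johnmeyer's actual recipe):

    Code:
    import numpy as np

    # Apply a strong gamma curve to pull a Kinescope's blown-out highlights back
    # toward mid-grey. Real restorations use hand-tuned curves per title.
    def strong_gamma(frame, gamma=0.6):
        x = frame.astype(np.float32) / 255.0    # normalize 8-bit luma to 0..1
        y = x ** (1.0 / gamma)                  # exponent > 1 darkens the image
        return np.clip(y * 255.0, 0, 255).astype(np.uint8)

    hot = np.full((480, 640), 230, dtype=np.uint8)   # stand-in "blown-out" frame
    print(strong_gamma(hot)[0, 0])                   # prints 214: pulled down from 230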

    I just watched on Netflix about five minutes of one of the TZ videotaped episodes, "The Lateness of the Hour," and I actually am not sure whether it is straight from the videotape or not. Whoever worked on it was brilliant, and the results are spectacular.
  18. Originally Posted by johnmeyer
    What you describe sounds more like a Kinescope.
    SMH now in embarrassment - you are of course correct that the Kinescope copies magnified many of the perceived early videotape ills. In focusing on the telecine question, I momentarily forgot that ass-backwards process of preserving videotape-originated shows as Kinescope dupes. What a bucket of horrors that was.

    Nonetheless, within the limits of the TZ universe, I still feel the videotaped episodes lose quite a bit of their polish vs the filmed eps (Serling himself agreed, and lobbied mightily for CBS to switch back). I just took a quick look at my remastered discs: despite the improvements in these versions, the taped episodes still look like poor relations to the filmed episodes. It isn't a knock against the technicians, but a limit of the technology at that time. Using "Lateness Of The Hour" as an example, the flaws have been remarkably tamed from earlier syndication prints, but there is still an unintended, disconcerting feel to it (some sort of temporal motion weirdness unique to the six taped episodes). Compared to filmed eps like "After Hours" and "Perchance To Dream", "Lateness Of The Hour" looks flat, unnatural and pretty bad. It isn't so much the Ampex VTRs but the Orthicon camera tubes causing the negative highlights, which got exacerbated by the Kinescope duping. I'll grant you, the remastered, cleaned DVD versions of these videotaped eps look miles better than the 16mm Kinescope prints that circulated for 30 years: those had almost no detail and the black halos were exaggerated.

    Luckily the writing, acting and direction were able to overcome the technical drawbacks, at least with some of these eps. "22" is one of the most memorable TZs of all time, what with the scary morgue nurse repeatedly sneering "room for one more, honey" at the terrified Brooklyn stripper. And "Lateness Of The Hour" has one of the few preserved performances of Inger Stevens, whose life was cut way too short.

    The ensuing color videotape-camera broadcast systems were surprisingly variable, right up thru the 1980s. Consistent parity between film and (SD) video origination was a long time coming: perhaps not until the '90s.
    Last edited by orsetto; 5th Jan 2017 at 19:17.
  19. Member (Joined Feb 2013, United States)
    Well, the reason I ask is I had read that low budget shows were shot on videotape (such as soap operas), and supposedly they made heavy use of close ups to compensate for lower resolution than if the show was shot on film. But that didn't make sense to me, since I would assume that they would have to do the same on shows shot on film, like Seinfeld or cartoons, since they would all end up on videotape anyway and I presume take on the resolution of the videotape.

    Would it be correct to say that "lower resolution" wasn't really the issue, but rather just overall picture quality as a result of other aspects that weren't as good when something was shot on video, as opposed to telecined?
  20. Originally Posted by 90sTV
    Would it be correct to say that "lower resolution" wasn't really the issue, but rather just overall picture quality as a result of other aspects that weren't as good when something was shot on video, as opposed to telecined?
    Yes and no. At the time, film origination absolutely had higher apparent resolution than videotape origination. But that advantage could easily be nullified by poor telecine or other broadcast errors: in such cases, the main advantage of film was shooting and production flexibility (staging, locations, lighting, editing). Video origination often entailed a stagebound feel and flat, dull lighting. During those years, it could often be hard to tell the difference on an average 19" CRT TV with average to poor reception. Today, with our revealing, merciless flat screens and digital remastering, scanned film regains its "resolution advantage" against most video-originated productions from the same era. Heck, even the "non-remastered" initial dvd release of (filmed) "Star Trek" looks amazeballs compared to any (videotaped) Norman Lear sitcom.
    Last edited by orsetto; 5th Jan 2017 at 19:15.
  21. Member (Joined Feb 2013, United States)
    Wouldn't telecined SD content look better on a CRT, though, since it doesn't have to scale?
  22. Originally Posted by orsetto
    there is still an unintended, disconcerting feel to it
    Yes, this is entirely due to 60 temporal events per second rather than 24. With or without telecine (i.e., even if you watch the filmed TZ episodes using IVTC'd source material), the difference in how it "feels" is enormous. I mentioned, in a previous post, Marshall McLuhan, whose most famous book was "The Medium Is The Message." He wrote extensively, beyond just that book, about how our perceptions of content are influenced by the medium on which it is delivered. Like most intellectuals, 95% of what he wrote was BS, and he also was actually not that articulate. However, he did recognize that there were some major differences between movies and TV, labeling them hot and cool respectively.

    The other reason those six TZ episodes feel so different is that you couldn't cut videotape with scissors, and editing it by copying back and forth between multiple decks was extremely difficult and resulted in visual degradation. So, most productions were done live, in a studio, and cuts were done by switching between cameras live, like a sports event. This meant that all the videotaped TZ episodes feel like you are watching them on a stage which, in fact, you are. In that sense they are very similar to much more recent productions like Carrie Underwood's "Sound of Music," which was basically a stage production filmed live.

    And "Lateness Of The Hour" has one of the few preserved performances of Inger Stevens, whose life was cut way too short.
    She was one messed up lady, AFAICT. Committed suicide.

    I did enjoy watching her in "The Farmer's Daughter."

    Originally Posted by 90sTV
    Well, the reason I ask is I had read that low budget shows were shot on videotape (such as soap operas), and supposedly they made heavy use of close ups to compensate for lower resolution than if the show was shot on film. But that didn't make sense to me, since I would assume that they would have to do the same on shows shot on film, like Seinfeld or cartoons, since they would all end up on videotape anyway and I presume take on the resolution of the videotape.
    All TV shows, up until HD, emphasized close ups. It really didn't matter whether it was broadcast live; recorded to videotape and then broadcast; or recorded on 35mm film and then broadcast. The main issue was not the manner of acquisition, but the resolution (as already discussed extensively in previous posts) coupled with screen size. You have to remember that a 27" "console" TV was as big as you could buy, until big-screen projection TVs started to show up in the late 1970s, eventually taking hold a decade later. I think the biggest direct-view CRT was not much bigger than 32", but I can't remember right now.

    The sweet spot for inexpensive TVs was the ubiquitous 19" TV, so that's what most people watched, and that's what the TV producers had to produce for. I still have one, along with a 13", 9" and 31". I watch TV on them every day (usually in the background). Believe me, even with modern HD video, downconverted to SD NTSC, without closeups, it is hard to tell what the heck is going on. (The only reason I still have them is that they still work, and are perfectly adequate for business news, etc.).

    The bottom line on all of this is that the assumption and premise of the original post are probably not correct. As I've already stated, I've seen quite a few really good transfers from some of the Quadruplex tapes that did manage to survive from the 1960s, and they can be really, really good. A good example, accessible to everyone, is the short-lived 1969 TV show "The Music Scene." I have the DVDs and they are remarkable. This YouTube clip does a really lousy job preserving detail (the DVD is much sharper), but it does a very good job showing how the brightness levels were maintained just fine (i.e., no hot, glowing highlights):

    The Music Scene

    [edit]Don't be put off by the temporary highlight issues in the opening 15 seconds (like the glare off Crosby's guitar). They are the result of taking a long shot of a scene lit by spots. When the camera dollies in, the exposure is perfect (note Young's near-white shirt, and how nothing is blown out). I wish there were some way I could show you the DVD version, because it is much sharper.

    I agree with everything in Orsetto's #20 post.

    P.S. If you want to see some really great video from the 1960s, take a look at some of the BBC concerts from that era. They are absolutely amazing.
    Last edited by johnmeyer; 5th Jan 2017 at 21:04. Reason: Edited video link; corrected typos
  23. Member (Joined Sep 2012, Australia)
    To me the video portions of Doctor Who, Yes Minister and Fawlty Towers look better (crisper) than the film portions, even after being deinterlaced. Although, since the video portions are all shot inside within controlled environments while the film portions are all shot outside, it may not be a fair comparison.
  24. Originally Posted by ndjamena
    To me the video portions of Doctor Who, Yes Minister and Fawlty Towers look better (crisper) than the film portions
    This was very common with shows that were primarily done on tape, with occasional filmed inserts. For whatever reasons, the filmed inserts were jarringly mismatched to the video portions. I think it was more a question of those inserts being shot as throwaway segments with little to no effort at slick production (esp Fawlty Towers: the exteriors seemed shot on a spring-wound Bell & Howell Standard 8mm). Also, the inserts were usually shots that would have been difficult or impossible with the video tech of the time (night scenes, rural locations). So the mismatch was that much more noticeable. This continued well into the 1980s.
  25. Originally Posted by ndjamena
    To me the video portions of Doctor Who, Yes Minister and Fawlty Towers look better (crisper) than the film portions, even after being deinterlaced. Although, since the video portions are all shot inside within controlled environments while the film portions are all shot outside, it may not be a fair comparison.
    Monty Python used to mix film (the exterior nonsense) with video (the interior nonsense). They sure looked different, but I never thought either of them looked "better" than the other. By that I mean that one did not seem to me to be sharper. I have the complete DVD set, so I should pop a disc in the player and take a look.

    Come to think of it, I never did finish watching every episode, so maybe this is a good excuse to finish it off. I also still have the last two discs of the complete Diana Rigg Avengers episodes. That was another series that alternated between video and film. The early episodes with Honor Blackman were done on video, and I did not like them. The constraints of the set, coupled with the 50i video, completely changed (for the worse) the "feel." The later episodes with Rigg (and then Thorson) were filmed and therefore free to go anywhere, at any time of day, often with unusual lighting conditions.
  26. Member (Joined Feb 2013, United States)
    Wouldn't telecined SD content look better on a CRT, though, since it doesn't have to scale?
  27. Originally Posted by 90sTV
    Wouldn't telecined SD content look better on a CRT, though, since it doesn't have to scale?
    Yes. Standard definition pretty much always looks better on a good CRT display than on LCD HDTVs. The newest OLED screens come closer to CRT for SD compatibility, but they're still rather pricey. The remastered TZ discs I was discussing with johnmeyer look extraordinarily beautiful on a CRT Sony Trinitron, but merely "good" on a Sony Bravia LCD.

    Originally Posted by johnmeyer
    Monty Python used to mix film (the exterior nonsense) with video (the interior nonsense). They sure looked different, but I never thought either of them looked "better" than the other.
    The Python production team was quite good at making the look and feel of their video and film segments mesh fairly well. Perhaps because there were so many short scenes of characters moving between the street and their homes, keeping a consistent style would bolster the gags. Hope you get a chance to finish viewing your disc set: some legendary comedy gems in there. Along with the dvds, I still have my VHS recordings of the last full airing on PBS during the late '80s (which I should probably discard: other than to see some old fundraising interruptions, they're completely redundant to the BBC dvd release).
    Last edited by orsetto; 6th Jan 2017 at 01:55.
  28. That's why I and a lot of other film lovers respect film more. That's how Kodak was able to save its film manufacturing division, and it's making a nice profit again.

    https://www.imdb.com/list/ls064832514/
    Last edited by P888; 18th May 2018 at 22:32.
  29. Member netmask56 (Joined Sep 2005, Sydney, Australia)
    Whether 16mm or 35mm was used depended a lot on the local market - in Australia it was very rare for TV productions to use 35mm due to its relative cost. The major exception was drama: 35mm was the choice, especially if overseas sales were possible. For local productions, 16mm was used, as it was for documentaries and current affairs programs not shot in the studio. Post audio production, prior to digital, was also a lot easier using established film techniques.
    SONY 75" Full array 200Hz LED TV, Yamaha A1070 amp, Zidoo UHD3000, BeyonWiz PVR V2 (Enigma2 clone), Chromecast, Windows 11 Professional, QNAP NAS TS851
  30. Member Cornucopia (Joined Oct 2001, Deep in the Heart of Texas)
    The difference is not just about native (or converted) resolution, but also about contrast.

    Film has, until quite recently, been better able to capture (and "compress") the high dynamic range of nature, and to represent those differences more faithfully within its less-capable-than-reality medium, than video sensors could.
    And even squeezing the >100,000:1 differences in brightness down to a range of about 4000:1, and then down to video's ~100:1 range, looked better in those days than trying to squeeze all of the original directly down to 100:1.
    Of course, both film and electronic sensors have greatly improved, but electronic sensors have improved many times over.
    It is well known that the perception of sharpness is tied to local contrast, and this intermediate form of dynamic range compression is good at retaining local contrast.
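
    Expressed in photographic stops (doublings of brightness), that two-stage squeeze looks like this; a quick Python check using the same illustrative ratios:

    Code:
    import math

    # Contrast ratios from the post, converted to photographic stops (doublings).
    scene, film, video = 100_000, 4_000, 100

    def stops(ratio):
        return math.log2(ratio)

    print(f"{stops(scene):.1f} -> {stops(film):.1f} -> {stops(video):.1f}")
    # 16.6 -> 12.0 -> 6.6: film's shoulder and toe absorb ~4.6 stops gently, so
    # the transfer to video only has to map ~12 stops into ~6.6, rather than
    # cramming all ~16.6 stops straight down into 6.6.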

    Scott


