VideoHelp Forum

  1. Member
    Join Date
    Nov 2021
    Location
    Melbourne, Victoria, Australia
    Originally Posted by ndjamena View Post
    film is 24 fps. Video is either 60 or 50 fields per second. When converted to video it's necessary for a single film frame to occupy at least two fields, which means AN ENTIRE frame exists in the stream. Video, however, is constructed from fields, each comprising only the even or odd lines of any particular time frame. HALF of the resolution of a single time frame is lost in a video stream and no deinterlacing scheme can fully recover that, whereas a full frame can be recovered from a telecined stream.
    Film can be any frame rate; it would be 25 fps when used for television that is 50 fields per second. When shot on film at 25 fps and transferred to 50i video (say 576i50), the vertical resolution of each time frame of video is twice what it is when shot on tape with the same line count and field rate, also interlaced.
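    To make the field/frame point concrete, here's a minimal Python sketch (a hypothetical "frame" is just a list of line strings, nothing from any real video API): a telecined film frame contributes both of its fields from the same instant, so weaving them back together recovers the complete frame.

```python
# Minimal sketch (hypothetical data): a "frame" is just a list of scan lines.

def split_fields(frame):
    """Split a progressive frame into its two fields (even/odd lines)."""
    return frame[0::2], frame[1::2]

def weave(top, bottom):
    """Interleave two fields back into a full frame."""
    woven = []
    for t, b in zip(top, bottom):
        woven.extend([t, b])
    return woven

film_frame = ["line%d" % i for i in range(6)]   # one film frame -> two fields
top, bottom = split_fields(film_frame)
# Both fields sample the SAME instant, so the full frame is recoverable:
assert weave(top, bottom) == film_frame
```

    With camera-shot interlaced video, by contrast, the two fields are captured 1/50th or 1/60th of a second apart, so weaving moving content produces combing rather than the original picture - that is the half-resolution loss being discussed.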
    Originally Posted by awgie View Post
    The word "resolution" is a misnomer. Since film is a photographic medium, it does not have a "resolution", in the sense of pixels. It may have different densities of the emulsion used in the film, and may have several different layers of emulsions for greater colour clarity. But it only has resolution (pixels) once it has been converted into a digital signal and stored on some form of electronic media, such as videotape.
    No it isn't, because film does have grain, but the grains are of random size. By the way, the level of detail depends on the film format and film speed. Slower films give better definition for a given film format.
    Originally Posted by johnmeyer View Post
    Shooting TV shows on film was very common in the early days of television. They were often shot on 35mm movie film because Hollywood had the infrastructure to shoot, develop, and edit this film. When the show was ready to air, the film was put onto a film chain which did the conversion to video. In the early days this was done in real time because videotape had not yet been invented.
    It was very common until the arrival of high definition video, so as recently as the early 2000s.
    Originally Posted by johnmeyer View Post
    Now that we have ubiquitous HD video, these old films can be scanned at HD or 4K resolution and, when then processed with the proper software, can look like they were shot with modern equipment. The recent broadcast of two "Dick Van Dyke Show" episodes from the early 1960s shows what can be done. These were also colorized (which I did not like at all), but if you saw them, you will appreciate how sharp and crisp they looked compared to what they looked like before. I saw many of them when they were originally broadcast, and it is quite a thrill to see all the new detail, and to see all the old film artifacts (dirt and grain) removed.
    Since the U.S. already had color TV at the time of the Dick Van Dyke Show, why wasn't it shot on color film? Color film was also available quite some time before Europe got color TV.
    Originally Posted by usually_quiet View Post
    The frames in the type of film used for theatrical movies and TV are large enough and fine-grained enough to allow capturing digital picture information at a higher resolution than is possible for video tape. Also, (as I think was already mentioned) film is captured in progressive format, and the video on analog tapes is interlaced, so it is captured in interlaced format.
    Film frames are not scanned into lines at all. But any electrical representation of a picture will at least be divided into lines. I do believe the progressive format is more film-like.

    Analog video does exist in quite high definition, such as the Sony HDVS system (1035 scan lines but still interlaced) but this has always been quite rare.

    The thing is that 35mm movie film is said to have a resolution similar to 4K. I don't know whether it is faster or slower 35mm films that have this resolution. Anyway, I have never heard of any 4K analog video.
    Originally Posted by jagabo View Post
    Film and other optical devices (like telescopes, microscopes, etc.) have a resolution -- the limit of the film's (or other devices') ability to resolve detail. And that term was in use many years before computer graphics were invented. It's in the digital realm where resolution is misused to mean the frame size of a graphic image. Which is related to resolution, but it's not truly resolution.
    Any picture transmitted electrically (whether broadcast over the air or recorded as an electrical signal) will necessarily have a defined resolution in numerical terms, in one or both dimensions. Film's resolution is only an average number of grains per picture height and picture width.
    Originally Posted by johnmeyer View Post
    So, that broadcast camera was capable of producing far more detail than my simple consumer camera. However, even when a broadcast video camera was the source, it was not capable of producing a higher resolution than the 525 lines of the NTSC broadcast standard because the mechanism of the standard -- 525 scan lines, and a 4.5 MHz analog bandwidth for the luma and chroma -- constrained how much detail could be resolved. Resolving detail, while not the only attribute that leads to a sharper, better picture, is certainly the main spec.
    Not only that, but it also doesn't lend itself to conversion to any other technical standard, such as a higher or lower number of lines.
    Last edited by Xanthipos; 25th Nov 2021 at 08:31. Reason: question mark
  2. Since the U.S. already had color TV at the time of the Dick Van Dyke Show, why wasn't it shot on color film? Color film was also available quite some time before Europe got color TV.
    Because b&w 16mm was cheaper
  3. Member
    Join Date
    Nov 2021
    Location
    Melbourne, Victoria, Australia
    Seriously, why would black and white film be used for television in a country that already had TV in color?
  4. Member DB83's Avatar
    Join Date
    Jul 2007
    Location
    United Kingdom
    Well, read the Wiki on the original series. Nothing to do with the availability, or not, of color TV.


    An additional $7,000 per episode was hardly cost-effective.


    And since I am here, your initial remark, having 'dug up' an old thread, about film being shot at different speeds is sheer nonsense.
  5. Member Cornucopia's Avatar
    Join Date
    Oct 2001
    Location
    Deep in the Heart of Texas
    Seriously, because it was cheaper to produce that way. Could be the difference between something being profitable, and not profitable.
    And you have to remember, just because TV broadcasting in the US started to use color in the mid-60s doesn't mean it could be seen that way. Consumer uptake of color TVs didn't overtake B/W ones until the mid-70s. The (shortsighted) thinking was: why spend roughly 3x the price to make something in color when almost nobody would be watching it that way?
    Don't forget, RERUNS, syndication and home video recording did not exist back then, so once something was broadcast, they thought it was done, ready to archive or throw away.

    One thing that none of us mentioned back then about the difference was the aspect ratio's effect.
    A 4:3 ratio is, especially when coupled with smaller size & resolution, much more conducive to close up shots, talking heads being the most common type. Widescreen opened up the canvas to more expansive shots, and that was originally one of its marketing points. Heck, you can even see a statistical difference in shot type in film projected in cinemas with 4:3 vs widescreen of various ratios.

    Btw, given the age of threads like this it is usually more appropriate to just start a new thread.


    Scott
    Last edited by Cornucopia; 25th Nov 2021 at 09:08.
  6. Member awgie's Avatar
    Join Date
    Sep 2008
    Location
    Lanarkshire, Scotland
    Originally Posted by Xanthipos View Post
    Seriously, why would black and white film be used for television in a country that already had T.V in color?
    Seriously, while the US technically "already had T.V. in color", the actual color television sets were expensive, so only the very rich owned them. And the only things that were broadcast in color were special programs. It wasn't until around mid-1965 - just a few months before the Dick Van Dyke Show ended - that the networks started airing more programs in color, because color TVs started to become more commonplace.

    So why would a studio spend a lot of extra money to produce a show in color, when it was only going to be broadcast in black & white anyway?
    Do or do not. There is no "try." - Yoda
  7. Member awgie's Avatar
    Join Date
    Sep 2008
    Location
    Lanarkshire, Scotland
    Originally Posted by Cornucopia View Post
    Btw, given the age of threads like this it is usually more appropriate to just start a new thread.


    Scott
    Unless you want to quote a half-dozen old comments, and "correct" them. On the same day that you joined the forums.
    Do or do not. There is no "try." - Yoda
  8. Member
    Join Date
    Nov 2021
    Location
    Melbourne, Victoria, Australia
    Did color film really cost three times as much as black and white at that time?

    Originally Posted by Cornucopia View Post
    Don't forget, RERUNS, syndication and home video recording did not exist back then, so once something was broadcast, they thought it was done, ready to archive or throw away.
    Reruns didn't exist? I know home video recording first appeared in the 1970s, but weren't there reruns before then?

    Originally Posted by awgie View Post
    So why would a studio spend a lot of extra money to produce a show in color, when it was only going to be broadcast in black & white anyway?
    Well, if the film footage was to be archived once it was shown, a reason to shoot it in color was so it could be remastered, and maybe rebroadcast in color, at a later time.

    If making it in black and white for television that was mostly in black and white, why even use film, which holds a higher level of detail than could possibly be displayed on the television sets the audience was watching it on?

    The big question here is: if film exceeded the specs of the television of the time (which had 480 lines of resolution), and color TV already existed at the time, then why not use color film?

    The producers of shows like Mister Ed and the Dick Van Dyke Show decided to use black and white film, and it may have been their loss in the long term.
    While we can scan the original film negatives at a higher resolution (say 2K) and watch the show in higher definition than on the tube sets of the time it was broadcast, it will still be in black and white unless colorized, which someone here said he didn't like.
    Last edited by Xanthipos; 26th Nov 2021 at 00:26. Reason: More to add
  9. Member Cornucopia's Avatar
    Join Date
    Oct 2001
    Location
    Deep in the Heart of Texas
    1. It wasn't just the cost of the film, but the cost of the processing (due to economies of scale the reverse is true now). And in the 50s and early 60s, there were basically only 2 color negative options (as beautiful Kodachrome was a reversal/slide format): Technicolor and Eastmancolor. Technicolor was much more expensive as it had to be shot with the special three-strip camera (three separate film strips), timed and filtered to exacting specs; the equipment could only be rented out, and it required hiring a Technicolor consultant on set.
    Eastmancolor used only one strip, but until the mid or late 60s it was sorely lacking in contrast and density and didn't really have slow/fine-grain varieties.
    So yes, it would have been more expensive by a big margin, and no, it wouldn't have been as sharp as doing it in black and white.

    2. Reruns only started when Lucille Ball & Desi Arnaz got the idea rolling in the late 50s, and it didn't pick up steam until the late 60s/early 70s. (Btw, a good read is how Lucy is fully responsible for the establishment of both the Star Trek and Mission Impossible franchises, though it kind of bankrupted their company Desilu).

    3. It wasn't until the invention of timecode in 67 and the development of electronic A/B roll editing in the early 70s that it became common and inexpensive to edit studio tv productions electronically (in post). Prior to this, one could attempt to do physical splicing of videotape (yes, they did do this, but with great difficulty being diagonal and all), and counting control track pulses, but that was prone to much error, which becomes costly. Or one could do a live, on-set edit, but have no backup. Editing film on a Steenbeck was always the cheapest, quickest and most accurate way to edit up until the advent of non-linear editing in the 90s.

    Yes, they were shortsighted. The long tail of home video didn't exist yet, so they had nothing else to compare it to.
    It was a business, and they were trying to do it quickly, accurately and cheaply, so BW film made sense for a long time.
    You are just seeing this through 20-20 hindsight, modern tech assumptions, and bittersweet nostalgia, sorry to say.


    Scott
  10. Member Kakujitsu's Avatar
    Join Date
    Jun 2011
    Location
    United States
    Very interesting read, a lot of knowledge in these posts...

    https://www.youtube.com/watch?v=gVnaL_MbBGk
  11. Member
    Join Date
    Jul 2007
    Location
    United States
    Originally Posted by Cornucopia View Post

    2. Reruns only started when Lucille Ball & Desi Arnaz got the idea rolling in the late 50s, and it didn't pick up steam until the late 60s/early 70s. (Btw, a good read is how Lucy is fully responsible for the establishment of both the Star Trek and Mission Impossible franchises, though it kind of bankrupted their company Desilu).

    Scott
    There's a difference between reruns, which were common during the summer while you waited for the new TV season to start in September, and syndication, which I believe I Love Lucy was the first to be offered in, in 1967.

    Broadcast history
    I Love Lucy aired Mondays from 9:00 to 9:30 PM ET on CBS for its entire first run. Each year during its summer hiatus its timeslot was occupied by various summer replacement series. Beginning in April 1955 CBS added reruns from the show's early years to its early evening weekend schedule. This would be the first of several occasions when I Love Lucy reruns would become part of CBS's evening, prime time, and (later on) daytime schedules.[4]

    In fall 1967, CBS began offering the series in off-network syndication; as of August 2017, the reruns air on the Hallmark Channel and MeTV networks, and scores of television stations in the U.S. and around the world, including Fox's KTTV/KCOP in Los Angeles until December 31, 2018.


    Source: https://ilovelucyandricky.fandom.com/wiki/I_Love_Lucy

    Prior to this, TV shows were the property of the four, later three, networks: DuMont, ABC, CBS and NBC. They were often broadcast live on the East Coast and transferred to film or videotape (which was very expensive into the 70's and often reused) for rebroadcast on the West Coast. In these pre-satellite days, here in Hawaii, we'd get TV shows a week or two later because the show had to be physically flown to our local broadcasters.

    Desi Arnaz had the foresight to own the rights to the show (and others later), as well as to use film and three cameras to make a quality show that he could rerun on CBS as well as sell into syndication, which is the main reason I Love Lucy is still well known today. As unthinkable as it is in today's digital age, a great number of TV show films were destroyed for lack of storage space, and videotapes (2" and later 1") were erased because of their high cost.

    At least, that is, after the formation of Desilu Productions. CBS wasn’t sure the country was ready for an interracial television sitcom about a fiery American redhead and a Cuban fellow. To quell the studio’s concerns, Ball and Arnaz formed their own company and became their own bosses, producing the I Love Lucy show on their own and selling it to CBS. They also refused to leave Los Angeles and move to New York, where almost all television shows of the time were shot. As it was, with New York in the Eastern time zone, shows were shot and broadcast live on the east coast (where there were more TV owners) and recorded on kinescopes for broadcast later in the west. Kinescopes were a terrible method of literally pointing a camera at a television as it played something, resulting in a shoddy, blurry picture. Ball and Arnaz didn’t want their show looking like that and expected the same quality picture be shown nationwide. The solution: shoot the series on 35mm film and play that film in all national networks, with the same quality. Not only did this decision up the ante for the production of television programming, but it meant all I Love Lucy episodes were recorded on a permanent, re-playable, high-quality medium. It made the rerun possible.


    Filming I Love Lucy with a studio audience

    Shooting on film cost a lot more, so Lucy and Desi cut their salaries to compensate, with the caveat that they would retain all rights to the show. CBS agreed, and the couple inked a deal that would change everything about television from that day forward. They just earned the ability to sell their series into syndication and make incredible money. (They would eventually re-sell the show’s syndication rights to CBS for a grand sum.)


    Source: https://the-take.com/read/how-did-ai-love-lucya-invent-the-rerun-and-syndication
  12. Member
    Join Date
    Jul 2007
    Location
    United States
    @Xanthipos

    A few things you have to understand.

    Color TV shows were a rarity until the 70's. We got a color TV in 1966 (23"!!!) and most of the shows were still in B/W. It was a big thing for a TV show to be in color, proudly announced in the opening credits. Long before In Living Color was a TV show, NBC proudly used the phrase to announce which shows were "in living color." And not only were we a week or two behind the mainland in seeing the latest TV episode, but despite a show being done in color, we often got the B/W version, because if it was on film, that film was intended to be destroyed after broadcast rather than being flown back to the mainland.

    We had a local children's show, Checkers and Pogo, that was mostly live, but few, if any, of these episodes are available because when it was filmed, the film would be cut into strips and given as gifts in The Goody Bag given to the children appearing on the show. Notice how few segments of the actual show (some later ones from videotape) are shown in this retrospective: https://www.youtube.com/watch?v=5R6CWVJ6C4E

    In the days before cable, broadcast TV shows in the U.S. strictly adhered to seasons, running roughly September to May. The three networks' seasons started and ended within a week or so of each other. The summer months were mostly reruns and short-term summer replacement shows that were sometimes used as tests for what might become a regular season show. If you missed a show during its initial showing, you hoped that it would be shown during the summer, though there was no guarantee.

    Unlike now, when you can watch just about anything at any time, even if it's not On Demand, daytime TV was primarily soap operas and game shows. Soap operas were notably done live or on videotape, giving rise to the term "soap opera effect," used to describe the usually undesired flat, overly sharp, overly saturated look that videophiles hate. Again, because videotape was expensive and the shows weren't intended to be rerun, being ongoing daily series, many soap opera episodes from the 60's aren't available.
  13. Member
    Join Date
    Nov 2021
    Location
    Melbourne, Victoria, Australia
    Originally Posted by lingyi View Post
    Color TV shows were a rarity until the 70's. We got a color TV in 1966 (23" !!!) and most of the shows were still in B/W. It was big thing for a TV show to be in color. Proudly announced the opening credits. Long before In Living Color was a TV show, NBC proudly used it to announce which shows were In Living Color. And not only were we a week or two behind the mainland in seeing the latest TV episode, but despite it being done in color, we often got the B/W version because if it was on film, that film was intended to be destroyed after broadcast rather than being flown back to the mainland.
    Why would the film be destroyed after broadcast rather than archived? Film cannot be reused, and even 16mm film may have a higher level of detail than 576-line video; I think it depends on film speed. If the film is to be destroyed after broadcast, not archived, then the show might as well be shot on tape; more below.

    Originally Posted by lingyi View Post
    Unlike now, where you could watch just about anything at anytime, even if it's not On Demand. Daytime TV was primarily soap operas and gameshows during the day. Soap Opera's were notably done live or on videotape, giving rise to the term Soap Opera Effect used to describe the usually undesired flat, overly sharp, overly saturated look that videophiles hate. Again, because videotape was expensive and the shows weren't intended to be rerun, being an ongoing daily series, many Soap Opera episodes from the 60's aren't available.
    Well, magnetic tape is reusable, and that's what would have been done. There is some aspect of the soap opera effect that applies to anything shot on tape. If something is shot on film at even 30 frames per second (as opposed to any less), it will differ in motion from anything shot on tape, even when the filmed footage is viewed on television by running the film through a telecine; this includes filmed footage that has been transferred to and seen from tape.

    Regarding film, there was also Agfacolor. Technicolor (which used three black and white negative strips) was not only more expensive but also needed brighter lighting, and it declined in use with the arrival of single-strip color film.
  14. Member
    Join Date
    Jul 2007
    Location
    United States
    All films and videotapes take storage space, and all storage space costs money. Videotapes weren't cassettes; they were 2" tape on large reels, close to the size of a film reel. In addition, film deteriorates even when kept under proper storage conditions, and early film was highly flammable. It's not only TV shows that were destroyed, but numerous theatrical films, sometimes for the small amount of silver that could be recovered from them. Read this for more: https://en.wikipedia.org/wiki/Lost_film
  15. Originally Posted by ndjamena View Post
    To me the video portions of Doctor Who, Yes Minister and Fawlty Towers look better (crisper) than the film portions, even after being deinterlaced. Although, since the video portions are all shot inside within controlled environments while the film portions are all shot outside, it may not be a fair comparison.
    I agree if you are talking about the original edits as broadcast from 2" and 1" tape, where the 16mm location film is largely compromised by the telecine technology of the day and has a very distinctive '1970s/80s UK 16mm telecine' look.

    However have you seen the Blu-ray releases of the 1970s and 1980s Classic Doctor Who episodes where they have been able to re-telecine the original location film sequences in HD? That changes it entirely - as the 16mm stuff looks brilliant having been re-transferred, graded and matched better to the studio stuff.

    The same is true of other 16mm film series made in the 80s like Miss Marple that have been retransferred (the BBC recently rebroadcast some of the Miss Marple series from the HD transfers and they looked fantastic)

    I agree with others that the distinctive difference between UK location 16mm film and UK studio video content isn't totally straightforward. The BBC had 4-tube Plumbicon EMI 2001 cameras that were REALLY sharp for that era of production, and in the studio they were able to pour light onto scenes (whilst somehow not managing to make them look horrifically over-lit like some ITV soaps of the era), whereas film exteriors were often more crudely lit, and the grading and telecine tech of the day didn't deliver the same crisp and colourful pictures as the studio.

    As for the original question - lots of US series (Dynasty, Dallas initially but not throughout, The Colbys, Knight Rider, the original Star Trek, Mission Impossible etc.) were shot AND edited on 35mm film - and were only telecined for transmission, or to make a transmission videotape copy. If the original film edits remain, then remastering in HD or UHD is a case of cleaning and retransferring.

    Where the original series were shot on 35mm film, but the film elements were telecined to videotape prior to editing, things are much more difficult (as you have to source the film elements - if they were retained - re-transfer them, and then re-edit these new transfers to match the original SD edits).

    If you only have SD videotape copies of content shot on film - you can't really improve on these in the same way - as the artefacts from the 60s-80s telecine and grading are 'baked in', as is the SD resolution.

    As for 16mm and 35mm SD telecine resolution vs contemporary camera resolution - that's a whole different story. The two systems have very different looks too - as almost all material shot on film was shot at 24/25fps with only 24/25Hz motion capture (Max Headroom and a few other series shot at 30fps but that was niche). Video cameras shoot at 50 or 60 (*) images per second, thus capture with 50 or 60Hz motion (the two fields in a 25/30(*) fps video frame are captured 1/50th or 1/60th of a second apart) - which is the clever thing about interlace. It gives you the equivalent motion resolution of film shot at 50 or 60fps (which is why 60s-90s video usually looks 'smoother' than film) but only gives you full vertical resolution (**) on static and very slow-moving content, with the vertical resolution dropping on moving content.

    (*) In reality 59.94Hz/29.97fps because of the NTSC subcarrier vs sound buzz decision made in the 50s.
    (**) Interlaced content is usually vertically pre-filtered to reduce flicker at 25/30Hz (frame rate), so the full 480 or 576 line resolution isn't fully present, but it's still a lot higher than the 240/288 line resolution that fast-moving content is captured at.
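    For anyone who wants the exact numbers behind the (*) footnote, here's a quick Python sketch using the standard NTSC constants (nothing specific to this thread): the field rate is 60000/1001 ≈ 59.94 Hz, and each field is a separate motion sample.

```python
from fractions import Fraction

# Standard NTSC rates (the 1000/1001 adjustment mentioned in the footnote):
field_rate = Fraction(60000, 1001)   # ~59.94 fields per second
frame_rate = field_rate / 2          # ~29.97 interlaced frames per second

# Each field is a distinct temporal sample, so interlaced video carries
# ~59.94 motion samples/s, while telecined 24fps film still has only 24.
field_interval = 1 / field_rate      # gap between the two fields of a frame
print(float(field_rate))             # 59.9400599...
print(float(field_interval))         # 0.0166833... (~1/60th of a second)
```

    Using exact fractions rather than 59.94 avoids the rounding drift that creeps in when these rates are stored as floats.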
  16. Member
    Join Date
    Nov 2021
    Location
    Melbourne, Victoria, Australia
    Originally Posted by lingyi View Post
    All films and videotapes take storage space and all storage space costs money. Videotapes weren't cassettes, they were 2' tape on large reels, close to the size of a film reel. in addition, film deteriorates even when kept under proper storage conditions and early film was highly flammable. It's not only TV shows that were destroyed, but numerous theatrical films, sometimes for the small amount of silver nitrate that could be recovered from them.
    But tape can be reused. Also, nitrate film (the film that self-ignited when it got too hot) was phased out in the early 1950s, so I'm not aware of such film being used for television.
    Originally Posted by nogginvid View Post
    As for 16mm and 35mm SD telecine resolution vs contemporary camera resolution - that's a whole different story. The two systems have very different looks too - as almost all material shot on film was shot at 24/25fps with only 24/25Hz motion capture (Max Headroom and a few other series shot at 30fps but that was niche). Video cameras shoot at 50 or 60 images per second, thus capture with 50 or 60Hz motion (the two fields in a 25/30 fps video frame are captured 1/50th or 1/60th of a second apart - which is the clever thing about interlace. It gives you the equivalent motion resolution of film shot at 50 or 60fps (which is why 60s-90s video looks 'smoother' than film usually) but only gives you full vertical resolution on static and very slow moving content, with the vertical resolution dropping on moving stuff.
    Indeed there is a difference in motion between anything shot on tape at 25fps and anything shot on film at the same framerate. Even when something shot on film at that rate is transferred to videotape (a direct transfer fitting the 50Hz refresh rate), it still differs in motion from anything shot on tape at the same speed.

    Since Star Trek has been mentioned: I understand Star Trek, being sci-fi, was filmed pretty much entirely in a studio, on film, for television, and therefore under the very best lighting conditions for shooting on film. That raises the question of why it was filmed at 24 frames per second (or maybe 23.976) with 3:2 pulldown used to fit the EIA refresh rate (EIA is 480i60), rather than just being filmed at 29.97 fps and transferred directly.
    My understanding is that shooting at a lower frame rate allows a longer exposure time, and longer exposures aren't needed when filming in studios.

    Regarding film, Agfacolor was also available alongside Eastmancolor.

    Using film for television continued even after the advent of non-linear video editing, up until the early 2000s.
  17. Originally Posted by Xanthipos View Post
    Since Star Trek has been mentioned: I understand Star Trek, being sci-fi, was filmed pretty much entirely in a studio, on film, for television, and therefore under the very best lighting conditions for shooting on film. That raises the question of why it was filmed at 24 frames per second (or maybe 23.976) with 3:2 pulldown used to fit the EIA refresh rate (EIA is 480i60), rather than just being filmed at 29.97 fps and transferred directly.
    Lots of reasons.

    1. Cost. Shooting at 24fps is cheaper than shooting at 30fps. You use 25% more film stock if you shoot at 30fps - which means 25% increase in your film stock and film processing budget. That's a major budget hit - and if 3:2 was OK for movies, it was OK for TV.

    2. Standards. Movies are shot at 24fps. It's what all the studio equipment and crews knew and worked with. Changing to 30fps would have made things more complicated.

    3. International sales. 24fps was a universally compatible frame rate for all broadcast frame rates. For 60Hz you used 3:2 - just as you did with movies, for 50Hz you used 2:2 with 4% speed-up, which is tolerable. 30fps would have been nice for 60Hz territories, but a nightmare for sale to 50Hz regions - as prior to high quality frame rate conversion - there was no real way of broadcasting 30fps content at 50Hz in quality. (Max Headroom shot that way because it had so many in-vision CRTs that would have been painful to have implemented in 24fps 'Slow PAL' video that is used for in-vision CRTs at 24fps if they aren't matted - so they shot at 30fps film which allowed regular 60Hz CRT content to be filmed without flicker. Frame rate conversion to 50Hz was by then roughly acceptable - and 30fps film frame rate converted better than 24fps with 3:2 IF DEFT or similar conversion wasn't used)

    The 24fps being compatible with both 50 and 60Hz regions argument is why most episodic TV shot in 60Hz regions is still shot at 24fps (albeit electronically) today.

This year's Oscars were shot 1080p24 on Sony F55s, I believe - each camera's output was then converted to 720p60 with 3:2 before hitting the main production switcher... The 'film look' is now 3:2 in the US...
    Quote Quote  
  18. Member Cornucopia's Avatar
    Join Date
    Oct 2001
    Location
    Deep in the Heart of Texas
    Search PM
Agfacolor was European-based, and was AFAIK never used in the US commercially (it appears that a few well-known films did use Agfa, 7brides being the biggest, but they were the exception, not the rule). My memory is that it was even quite hard to find for consumers. Plus the whole ecosystem of labs and development was based around the other formulations. And I would venture to say there was a certain amount of avoidance of things foreign, so not surprising, given the time period.

Excepting Todd-AO and manually cranked cams, ALL 16mm and 35mm camera equipment ran at 24fps (possibly 25 in Europe). You couldn't get (inexpensive) gear that worked at any other framerate - the entire ecosystem was built around that framerate. If you were to shoot at a non-standard speed, transfer and playback would just be motion-adjusted (slomo or fastmo), because the rest of the film chain expects the standard rates. Thus nonstandard rates were only used for special effects, or for specialty closed-system purposes. Both of which were expensive, so quite unlikely in cost-conscious TV production.

Until the late 90s/early 2000s, there were no video cams that matched the framerate of film. They were always 50/60i - the 'i' being interlaced - so even with the framerate the "same", the look would be different because of the motion discrepancy. This has been discussed to death on this site: framerate, color latitude, dynamic range, grain/resolution - there were many differences between the look of the mediums. But wasn't this topic about the resolution superiority of film over video in those early days?

Why are you arguing now about film editing? No need to move the goalposts. Yes, it continued, but it became NOT the quickest, and now not the cheapest.

    Scott
    Last edited by Cornucopia; 27th Nov 2021 at 11:44.
    Quote Quote  
  19. Originally Posted by Cornucopia View Post
    Excepting Todd-AO, and manually cranked cams, ALL 16, 35mm camera equipment ran at 24fps (possibly 25 in Europe). You couldn't get (inexpensive) gear that worked at any other framerate - the entire ecosystem was built around that framerate.
    Europe ran with two different framerates - 24fps for motion pictures, 25fps for television film production. However the history of European TV film production is very different to US TV film production.

    European broadcasters in the 60s-90s made a LOT of their own content in-house, and so would just buy 25fps film cameras to shoot with (no need for them to have 24fps cameras as they were 100% shooting for TV).

    In the US there were motion picture studios who also made content for TV - and I suspect the same cameras were used for both?

    If you were to shoot at a non-standard speed, transfer and playback would just be motion-adjusted (slomo or fastmo), because the rest of the film chain expects the standard rates. Thus nonstandard rates were only used for special effects, or for specialty closed system purposes. Both of which were expensive, so quite unlikely in cost conscious TV production.
    Precisely this - following standards is always cheaper than being non-standard in this regard.

    The reason Max Headroom shot at 30fps was purely to save money elsewhere - and they had a specific workflow that worked for them AIUI.

    Until the late 90s/early 2000s, there were no video cams that matched framerate of film. They were always 50/60i. I being interlace, so even with framerate the "same", the look would be different because of the motion discrepancy. This has been discussed to death on this site: framerate, color latitude, dynamic range, grain/resolution - there were many differences between the look of the mediums. But wasn't this topic about the resolution superiority of film over video in those early days?
Yes - the first cameras I was aware of professionally, where I worked, that would shoot p24, p25 and p30 with a 'film look' in camera were the DSR DVCAM models of the early-to-mid 00s, and there were HD cams as well (I wasn't shooting HD at that point)

    Why are you arguing now about film editing? Don't need to change the goalposts. Yes, it continued, but it became NOT the quickest, and now not the cheapest.
    Yep - once non-linear PC-based editing arrived - cutting on film suddenly seemed a LOT less desirable. (At the BBC a lot of 16mm film editors moved to Lightworks or Avid - most preferred Lightworks ISTR)
    Quote Quote  
  20. Member
    Join Date
    Nov 2021
    Location
    Melbourne, Victoria, Australia
    Search Comp PM
    Originally Posted by Cornucopia View Post
    Agfacolor was european based, and was AFAIK never used in the US commercially (it appears that a few well know films did use agfa, 7brides being the biggest, but they were the exception not the rule). My memory is that it was even quite hard to find for consumers. Plus the whole ecosystem of labs and development was based around the other formulations. And I would venture to say there was a certain amount of avoidance of things foreign, so not surprising, given the time period.
    But film was used for European television too.
    Originally Posted by Cornucopia View Post
    Excepting Todd-AO, and manually cranked cams, ALL 16, 35mm camera equipment ran at 24fps (possibly 25 in Europe). You couldn't get (inexpensive) gear that worked at any other framerate - the entire ecosystem was built around that framerate. If you were to shoot at a non-standard speed, transfer and playback would just be motion-adjusted (slomo or fastmo), because the rest of the film chain expects the standard rates. Thus nonstandard rates were only used for special effects, or for specialty closed system purposes. Both of which were expensive, so quite unlikely in cost conscious TV production.
Plenty of 16mm cameras did have multiple framerate options. For example, the Bolex could run at anywhere between 8 and 64 fps.
    Originally Posted by Cornucopia View Post
    Until the late 90s/early 2000s, there were no video cams that matched framerate of film. They were always 50/60i. I being interlace, so even with framerate the "same", the look would be different because of the motion discrepancy. This has been discussed to death on this site: framerate, color latitude, dynamic range, grain/resolution - there were many differences between the look of the mediums. But wasn't this topic about the resolution superiority of film over video in those early days?
    What have 24p video cameras got to do with the frame rate of film used for television?
    Originally Posted by Cornucopia View Post
    Why are you arguing now about film editing? Don't need to change the goalposts. Yes, it continued, but it became NOT the quickest, and now not the cheapest.
    I'm not arguing about film editing.
    Originally Posted by nogginvid View Post
    Europe ran with two different framerates - 24fps for motion pictures, 25fps for television film production. However the history of European TV film production is very different to US TV film production.

    European broadcasters in the 60s-90s made a LOT of their own content in-house, and so would just buy 25fps film cameras to shoot with (no need for them to have 24fps cameras as they were 100% shooting for TV).
Some cameras do have multiple framerate options.
    Originally Posted by nogginvid View Post
    If you were to shoot at a non-standard speed, transfer and playback would just be motion-adjusted (slomo or fastmo), because the rest of the film chain expects the standard rates. Thus nonstandard rates were only used for special effects, or for specialty closed system purposes. Both of which were expensive, so quite unlikely in cost conscious TV production.
    If shot at 29.97 fps for North American or Japanese television, then the playback is not motion adjusted.
    Originally Posted by nogginvid View Post
    Precisely this - following standards is always cheaper than being non-standard in this regard.
But if nearly all of those using film for North American, Japanese and possibly Brazilian television had decided to use 30 fps, then wouldn't it have become a standard due to the size of the market?
    Quote Quote  
  21. Originally Posted by Xanthipos View Post
    Originally Posted by nogginvid View Post
    Precisely this - following standards is always cheaper than being non-standard in this regard.
But if nearly all of those using film for North American, Japanese and possibly Brazilian television had decided to use 30 fps, then wouldn't it have become a standard due to the size of the market?
    It's all hypothetical - but it's unlikely that would have been the case. Hollywood and movie production wasn't going to switch to 30fps, so you were adding a new, second standard to an industry already thoroughly wedded to an existing one.

You'd have ended up with a confusing requirement for all film chains/telecines at TV broadcasters to be dual-standard - 24fps (for movies) and 30fps (for TV shows shot on film) - adding to their cost, and the production costs of TV shows shot on film would have increased significantly due to the increased film stock and processing costs, for no obvious financial return.

At the same time you'd be making your shows much more difficult to show in 50Hz territories, which would have made no real commercial sense either.
    Quote Quote  
  22. Member DB83's Avatar
    Join Date
    Jul 2007
    Location
    United Kingdom
    Search Comp PM
    Jeez. It's ALL hypothetical.


    What was a topic about the 'perceived' resolution of film against video tape has taken one or maybe more turns for the worse.


If you are into this, watch the 2nd episode of 'Get Back' - the Beatles recording-sessions docu directed (edited) by Peter Jackson - which discusses both the film medium - a conscious decision of 16mm over 35 - and the apparent 'waste' of 8-track recording tape at '2 shillings per foot' (1969 price)


    I will concede that there was an 'advantage' in shooting (UK) film at 25 fps AS LONG AS THE UNDERLYING HARDWARE SUPPORTED IT and there was a mix of film and video.


It matters not if a particular camera could shoot at a higher speed. It matters that the film crew could make the best of the equipment they had, and of the equipment beyond the filming stage that would 'gel' all the parts together.
    Quote Quote  
  23. Member
    Join Date
    Nov 2021
    Location
    Melbourne, Victoria, Australia
    Search Comp PM
    Originally Posted by nogginvid View Post
    It's all hypothetical - but it's unlikely that would have been the case. Hollywood and movie production wasn't going to switch to 30fps, so you were adding a new, second standard to an industry already thoroughly wedded to an existing one.
Was the market for film for television in 60Hz countries big enough to sustain its own standard for TV shows shot on film for those markets? And wouldn't how much this additional standard added to the production costs of North American and Japanese TV depend on how large that market is?

    Originally Posted by nogginvid View Post
    You'd have ended up with a confusing requirement for all film chains/telecines at TV broadcasters to be dual standard 24fps (for movies) and 30fps (for TV shows shot on film), adding to their cost, and the production costs of TV shows shot on film would have increased significantly due to the increased film stock and processing costs, and for no obvious financial return?
50Hz countries did have dual standards. Movie studios such as Pinewood used 24 fps for cinematic content, and broadcasters like the BBC used film at 25 fps for shows shot on film.
    Quote Quote  
  24. Worth noting, once we progressed from film chains to scanners, commercials and some high end music videos were often shot on film at 30fps (29.97) in the US. For some reason, car commercials seemed particularly prone to this treatment. (Pretty sure the late '80s TV show "Sledge Hammer" was as well -- so there are certainly more.)
    Quote Quote  
  25. 50Hz countries did have dual standards. Movie studios such as Pinewood used 24 fps for cinematic content and broadcasting corporations like the B.B.C used film at 25 fps for shows shot on film.
    What are you talking about?
    Quote Quote  
  26. Originally Posted by Xanthipos View Post
    50Hz countries did have dual standards. Movie studios such as Pinewood used 24 fps for cinematic content and broadcasting corporations like the B.B.C used film at 25 fps for shows shot on film.
    Yep - my earlier point about the UK was that in the 60s-90s there wasn't a huge amount of crossover between movie studios and TV productions in the same way that there was in the US where studios made shows for TV broadcast.

    A lot (if not most?) 25fps film shot for UK TV in that era was shot by UK broadcasters using their own equipment - rather than third parties and independents.
    Quote Quote  
  27. Originally Posted by Xanthipos View Post
Was the market for film for television in 60Hz countries big enough to sustain its own standard for TV shows shot on film for those markets? And wouldn't how much this additional standard added to the production costs of North American and Japanese TV depend on how large that market is?
    Well the stock and processing costs were going to be around 25% more than shooting at 24fps (as 30fps would use 6 more frames each second).

Also, the cost of not being able to sell 30fps shows to 50Hz countries easily would have been significant. There was no nice way of converting 30fps to 50Hz before semi-decent frame rate converters arrived in the mid-70s, and good quality ones in the late 80s/early 90s.

24fps had the triple whammy of being:

1. Dominant in an already massively important industry - movie production.
2. The cheapest option, as 24fps was the lowest frame rate deemed acceptable for motion pictures with sound. (Any higher frame rate would be additional cost for no major return.)
3. Acceptable for both 50 and 60Hz broadcast - so a globally compatible standard. 25fps was also, to a degree - but 30fps wasn't.
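The arithmetic behind the 50/60Hz compatibility point can be made explicit with a throwaway sketch (the helper name is invented for illustration):

```python
def fields_per_frame(film_fps, field_rate):
    """How many broadcast fields each film frame must cover."""
    return field_rate / film_fps

# 24fps: 60/24 = 2.5 fields per frame -> the repeating 3:2 cadence;
# for 50Hz, run the film 4% fast at 25fps for an exact 2:2.
print(fields_per_frame(24, 60))  # 2.5
print(fields_per_frame(25, 50))  # 2.0

# 30fps: an exact 2:2 at 60Hz, but 50/30 = 1.67 fields per frame at
# 50Hz - no clean cadence, hence the need for a frame rate converter.
print(fields_per_frame(30, 60))  # 2.0
print(round(fields_per_frame(30, 50), 2))  # 1.67
```

The non-integer, non-half result for 30fps at 50Hz is exactly why there was "no real way" of broadcasting 30fps content in 50Hz territories before standards converters improved.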
    Quote Quote