I wasn't sure which sub-forum to put this in.
When a studio shoots on film, they obviously expect the stock to respond in a predictable way given the lighting and camera settings. There's complex chemistry involved in producing both the emulsion and the celluloid base it sits on. On a production that can be massively expensive, it would be disastrous to discover partway through that something was wrong with the film.
Was the technology so well-refined that bad batches just didn't happen, or *were* there incidents of film failure? And if so, was the manufacturer on the hook, or did the studios carry insurance to cover that eventuality?
Studios only use(d) bonded developing labs, which maintain very tight control over their chemistry and run prep tests on throwaway sections of film. Plus, they are (usually heavily) insured.
Studios also get better stock from the manufacturers than consumers usually do. It goes through very strict quality control, including sampled tests on all major batch runs. Also insured.
When shooting on film, directors send each day's footage to be developed right away, so they can evaluate it before moving on to other scenes in a busy schedule. This is part of what's commonly known as the "dailies" (also "rushes," because the lab rushed to get them done). The evaluation covers the physical condition of the film as well as the captured composition and performances.
Obviously, digital cinema and video formats don't need developing, but they do need backing up (for safety) and ingesting/uploading (for review), so the overall workflow is a very similar habit. Only the turnaround time drops from hours to minutes, or in some cases seconds.
Because the captured signal is electronic, there's the added benefit of live preview by multiple people on confidence monitors. Film didn't have that capability, as only the DP could see the image (through the optical viewfinder).