VideoHelp Forum
  1. Member
    Join Date
    Jul 2007
    Location
    United States
    These two recent threads have gotten me to thinking (yes, it’s scary!)…

    https://forum.videohelp.com/threads/381975-Why-do-people-say-TV-shows-shot-on-film-were...849&viewfull=1

    https://forum.videohelp.com/threads/382054-Back-in-time-what-is-lost-in-VHS-copy-of-SD-...cast?p=2473427

    While the best analog videotape formats (2” quad, 1”, etc.) have greater range (gamma, s/n ratio, etc.) than SD TV, why is it that a “live” broadcast (e.g. breaking news, sports, Presidential speeches) straight from the video cameras could look so much better than taped material, given the limitations of analog systems?

    On the other side of the same coin, why is it that with current digital broadcasts there are still variations when the same source (e.g. CNN news feeds) is shown on different channels? I can see bandwidth being one factor, but what else could differ if the originating signal is the same?

    I’m reminded of the ’60s and ’70s, when a Presidential speech was broadcast on ALL the channels (all three of them) and sometimes the President was from Mars (too green) and other times from Venus (too red or purple). Okay, each station likely used its own cameras and NTSC lived up to Never Twice the Same Color, but why so different?

    Which brings me to pre-broadcast production. Obviously (hopefully), film sources must be adjusted to “fit” within the limits of SD TV. But what about video? Was it also adjusted to (hopefully) maximize the viewing experience, or was it just broadcast as-is, with the losses being whatever couldn’t be retained?

    Taking a time machine back to pre-digital TV days: if I had a 2” or 1” video recording machine at home, in close proximity to the studio’s broadcast tower with a direct line of sight, how good would my recording be compared to the in-studio tapes? I know that analog video always incurs a quality loss when copying, but this would be a "1st generation" recording limited "only?" by the video system.

    Okay, got that out of my head. Now I can think about something else. "Ummm...donuts."
  2. Member Cornucopia
    Join Date
    Oct 2001
    Location
    Deep in the Heart of Texas
    A lot of questions require a lot of answers... this will take a while.
  3. If you're talking about live analog NTSC broadcasts in the pre-digital era, the complexity of that composite signal, and its reliance on super-precise timing, is at the heart of almost every bit of degradation. It also explains the mismatching you used to see between different cameras on a live football broadcast, or between different networks broadcasting the same content at the same time (e.g., a presidential inauguration).
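
    To put a number on how precise "super-precise" had to be: NTSC carries hue in the phase of the 3.579545 MHz color subcarrier, so a fixed timing error anywhere in the chain rotates every color by a fixed angle. A back-of-the-envelope sketch in Python (the subcarrier frequency is the NTSC standard; the "noticeable" threshold is just a commonly quoted ballpark, not a hard spec):
    Code:
    # NTSC encodes hue as the phase of the color subcarrier, so a
    # constant timing error shifts every hue by a constant angle.
    F_SC = 3.579545e6                   # NTSC color subcarrier, Hz
    period_ns = 1e9 / F_SC              # one subcarrier cycle, ~279.4 ns
    ns_per_degree = period_ns / 360.0   # ~0.78 ns per degree of hue

    # Hue errors of roughly 5-10 degrees start to look visibly wrong:
    for degrees in (1, 5, 10):
        print(f"{degrees:2d} deg hue error = {degrees * ns_per_degree:4.1f} ns timing error")
    In other words, a few nanoseconds of uncorrected delay is enough to push skin tones visibly off.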

    The difficulty in getting the timing signals to each camera was compounded, during big live events, by the miles of cables that separated the camera from the studio switching console.
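
    And cable length translates directly into that kind of delay: video in coax travels at roughly two-thirds the speed of light (a velocity factor of ~0.66 is typical for older coax, though it varies by cable type), so every extra foot is another couple of degrees of subcarrier phase. A rough sketch:
    Code:
    # Subcarrier phase rotation caused by extra cable length.
    # VF ~0.66 is an assumed, typical velocity factor for older coax.
    C = 299_792_458                # speed of light, m/s
    VF = 0.66                      # assumed cable velocity factor
    F_SC = 3.579545e6              # NTSC color subcarrier, Hz

    def phase_shift_deg(cable_m: float) -> float:
        """Degrees of subcarrier phase delay added by a cable run."""
        delay_s = cable_m / (C * VF)
        return delay_s * F_SC * 360.0   # phase wraps every 360 deg

    for feet in (1, 10, 100):
        print(f"{feet:4d} ft -> {phase_shift_deg(feet * 0.3048):6.1f} deg of hue rotation")
    That is why every camera chain had its own subcarrier phase adjustment: mismatched cable runs between cameras meant hue jumped on every cut until someone dialed it out.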

    Videotape had its own issues, even the 2" Quadruplex technology that was used for so many years. These issues were compounded if the end result was a second- or third-generation copy, which happened any time editing was involved, because editing required copying between two or more decks.
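
    The noise part of generation loss is easy to model if you assume (simplistically) that every record/playback pass adds the same amount of uncorrelated noise, so the noise powers just add:
    Code:
    import math

    def snr_after_generations(single_pass_snr_db: float, n: int) -> float:
        """SNR after n generations, assuming equal, uncorrelated noise
        per pass: noise powers add, so SNR drops by 10*log10(n)."""
        return single_pass_snr_db - 10 * math.log10(n)

    # ~46 dB is a commonly cited ballpark for 2" quad luma S/N (assumption):
    for gen in (1, 2, 3, 4):
        print(f"generation {gen}: ~{snr_after_generations(46.0, gen):4.1f} dB")
    And that only counts noise; each pass also shaved off high-frequency detail and stacked up dropouts and timebase error, which the model above ignores.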

    As for VHS video, those threads you linked to really didn't get into any great detail about the severity of the degradation introduced by that format, but ALL consumer formats (Beta, VHS, 8mm, Hi8, S-VHS) created video that was absolutely atrocious when compared to the original broadcast signal. Comparing broadcast SD analog video to VHS is like comparing 8mm movie film to 35mm: you can get used to watching either one, but when viewed side-by-side, the lesser format looks awful.
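
    You can put rough numbers on that gap with the usual rule of thumb that horizontal resolution tracks luma bandwidth (the bandwidth figures below are the commonly quoted approximations, not measurements):
    Code:
    # Horizontal resolution in TV lines per picture height (TVL):
    # 2 samples per cycle over the active line, scaled by the 4:3 aspect.
    ACTIVE_LINE_US = 52.6          # NTSC active line time, microseconds
    ASPECT = 4 / 3

    def tvl(bandwidth_mhz: float) -> float:
        return 2 * bandwidth_mhz * ACTIVE_LINE_US / ASPECT

    # Commonly quoted luma bandwidths (approximate):
    for name, bw_mhz in [("broadcast NTSC", 4.2), ("VHS / Beta II", 3.0)]:
        print(f"{name:15s} ~{bw_mhz:.1f} MHz -> ~{tvl(bw_mhz):3.0f} TVL")
    That is the familiar ~330 lines for broadcast versus ~240 for VHS, and VHS color-under chroma (under 0.5 MHz) fares far worse than either.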

    There were other elements that contributed to degradation. For instance, when a live event, like the Academy Awards, was broadcast nationally at the same moment in time, that signal had to get from one side of the country to the other. Someone who actually worked in the broadcast industry will have to describe exactly how that was done, but I believe that it involved either a huge number of microwave tower links (each limited by the curvature of the earth), or coaxial cable. Once again, this degraded the signal.
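
    The curvature limit is easy to estimate with the standard radio-horizon approximation (d ≈ 4.12·√h km for antenna height h in meters, including typical refraction); the tower height and route length here are just illustrative assumptions:
    Code:
    import math

    def radio_horizon_km(antenna_m: float) -> float:
        """Radio horizon with the standard 4/3-earth refraction model."""
        return 4.12 * math.sqrt(antenna_m)

    TOWER_M = 60                   # assumed average tower height, meters
    ROUTE_KM = 4500                # rough New York - San Francisco distance

    hop_km = 2 * radio_horizon_km(TOWER_M)    # both towers see the horizon
    print(f"max hop ~{hop_km:.0f} km -> ~{math.ceil(ROUTE_KM / hop_km)} towers coast to coast")
    That lands in the right neighborhood: AT&T's 1951 transcontinental microwave route reportedly used 107 relay stations, and every analog hop re-amplified the signal, noise and all.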

    I had a few brief opportunities to be in some pretty big TV studios, and I can tell you that the live picture, in the studio, on a broadcast monitor, was absolutely stunning, way beyond anything you ever saw in your own home. I also saw similar things at some of the 1980s CES shows I attended.
  4. Member
    Join Date
    Aug 2010
    Location
    San Francisco, California
    For one thing, studio video was modulated and transmitted TWICE on the way to your television set: first over the microwave studio-to-transmitter link, and then from the broadcast antenna. Noise is inevitably introduced at each step.
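
    And with independent noise sources, those two hops compound: noise powers add, so the combined S/N is always a bit worse than the worst link. A quick sketch with made-up but plausible figures:
    Code:
    import math

    def cascade_snr_db(*snrs_db: float) -> float:
        """Combined SNR of independent noisy stages (noise powers add)."""
        total_noise = sum(10 ** (-s / 10) for s in snrs_db)
        return -10 * math.log10(total_noise)

    # Illustrative numbers only: a clean studio-to-transmitter microwave
    # link, then a noisier over-the-air path to a home antenna.
    print(f"combined: ~{cascade_snr_db(60.0, 45.0):.1f} dB (vs 45 dB for the weak link alone)")
    With numbers like these the over-the-air leg dominates, which is why reception conditions at your antenna mattered far more than the studio link.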