VideoHelp Forum
  1. Member brassplyer
    Is it correct that the "native" format for HD is progressive video, and that there is circuitry in HDTVs to accommodate interlaced SD DVDs, but HD itself is really progressive?
  2. No. Some Hi-Def TV broadcasts are interlaced, some Blu-Rays are interlaced.
  3. Brad (formerly 'vaporeon800')
    Not sure what you mean. All fixed-pixel displays (LCD, plasma) have a native progressive resolution and everything is scaled (and deinterlaced if necessary) to fit those pixels.

    But in terms of production standards, there's 1080i.
  4. sanlyn
    Last edited by sanlyn; 19th Mar 2014 at 11:12.
  5. Member brassplyer
    How does an HDTV present interlaced material? As fields, the way an SD TV does, or converted to progressive images?
  6. Originally Posted by brassplyer View Post
    How does an HDTV present interlaced material? As fields the way an SD tv does or converted to progressive images?
    Progressive TVs IVTC or deinterlace. The quality of those operations varies from model to model.
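    A rough illustration of the two basic deinterlacing approaches, weave and bob (hypothetical Python, not any TV's actual firmware; real sets use motion-adaptive blends of these):

    ```python
    # Toy deinterlacing sketch. A field is a list of scanlines (each a list of pixels).

    def weave(top_field, bottom_field):
        """Interleave the two fields into one full frame.
        Sharp for static scenes, but combs on motion."""
        frame = []
        for t, b in zip(top_field, bottom_field):
            frame.append(t)
            frame.append(b)
        return frame

    def bob(field):
        """Line-double a single field into a full frame.
        No combing, but half the vertical detail; real TVs interpolate
        rather than duplicate lines."""
        frame = []
        for row in field:
            frame.append(row)
            frame.append(row)
        return frame

    top = [[10, 10], [30, 30]]     # scanlines 0 and 2 of a 4-line frame
    bottom = [[20, 20], [40, 40]]  # scanlines 1 and 3

    print(weave(top, bottom))  # [[10, 10], [20, 20], [30, 30], [40, 40]]
    print(bob(top))            # [[10, 10], [10, 10], [30, 30], [30, 30]]
    ```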
  7. Member brassplyer
    Originally Posted by jagabo View Post
    Originally Posted by brassplyer View Post
    How does an HDTV present interlaced material? As fields the way an SD tv does or converted to progressive images?
    Progressive TVs IVTC or deinterlace. The quality of those operations varies from model to model.
    So what's the point of broadcasting 1080i?
    The industry wanted a higher resolution than 1280x720p60 but didn't want the bandwidth of 1920x1080p60. Since they were used to interlaced video they decided to go with 1080i30 -- half the bandwidth of 1080p60 (as uncompressed video) and about the same bandwidth as 1280x720p60. In my opinion it was a huge mistake. They should have gone with one progressive resolution (near 1920x1080, but mod16!) and allowed various frame rates: 24, 30, and 60. Since HD broadcast is MPEG-2, a computer is required at the receiving end to decompress it (i.e., no pure analog TVs). That means a frame buffer and a graphics card. So frame rate conversion to the TV's native frame rate is trivial. And 60p doesn't take much more bandwidth than 1080i30 with MPEG-2 compression. They missed out on a chance to eliminate interlaced video.
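    The bandwidth comparison is easy to check with back-of-envelope arithmetic, assuming 8-bit 4:2:0 (12 bits per pixel); the exact bit depth doesn't change the ratios:

    ```python
    # Uncompressed bandwidth for the formats under discussion, assuming 8-bit
    # 4:2:0 sampling (12 bits per pixel). Only the relative sizes matter here.

    def mbits_per_sec(width, height, frames_per_sec, bits_per_pixel=12):
        return width * height * frames_per_sec * bits_per_pixel / 1e6

    formats = {
        "720p60":  (1280, 720, 60),
        "1080i30": (1920, 1080, 30),  # 60 fields/s carries 30 full frames of pixels
        "1080p60": (1920, 1080, 60),
    }
    for name, (w, h, fps) in formats.items():
        print(f"{name}: {mbits_per_sec(w, h, fps):.0f} Mbit/s")
    # 720p60: 664 Mbit/s, 1080i30: 746 Mbit/s, 1080p60: 1493 Mbit/s
    ```

    So 1080i30 is roughly the same raw data rate as 720p60 and exactly half that of 1080p60, as stated above.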
  9. sanlyn
    Originally Posted by jagabo View Post
    They missed out on a chance to eliminate interlaced video.
    They also missed out on a chance to bypass the majority of TV and player owners around the world who would not be able to watch 20 minutes of non-interlaced commercial advertising for every hour of non-interlaced broadcasting. I can only imagine the vast cost of conversion for public and private concerns.
    Last edited by sanlyn; 19th Mar 2014 at 11:12.
  10. Originally Posted by sanlyn View Post
    Originally Posted by jagabo View Post
    They missed out on a chance to eliminate interlaced video.
    They also missed out on a chance to bypass the majority of TV and player owners around the world who would not be able to watch 20 minutes of non-interlaced commercial advertising for every hour of non-interlaced broadcasting. I can only imagine the vast cost of conversion for public and private concerns.
    Nonsense. First, you needed a new TV to view HD broadcast as HD. For downscaling to SD it's easy to interlace a progressive source.
  11. sanlyn
    Nonsense. Not everyone is viewing HD as HD, and many aren't looking at HD at all. Most people don't. I know plenty of people who still use a VCR, have no HDTV, and don't want one or can't afford it. I service PC's at three rest homes in this county and the next county: no "HD" in sight, anywhere, and there must be 40 or 50 TV's in those joints. They're still using VCR's. Went through a hospital lately, and some of the rooms and lounges still have CRT's (I still have one, too). True, eventually all TV's will be HD of some form and interlace will be a thing of the past. Everyone will be wealthy and will upgrade all of their a/v stuff every 6 months. You and I will be long gone by then.
    Last edited by sanlyn; 19th Mar 2014 at 11:13.
    Nobody with an old interlaced analog TV can watch digital broadcast without a converter box that converts 1080i or 720p to 480i. That box could just as well have converted 1080p to 480i. So once the decision to go with MPEG-2 digital broadcast was made, there was no need for interlaced HD.
    Last edited by jagabo; 21st Dec 2013 at 22:58.
  13. Brad (formerly 'vaporeon800')
    Unfortunately there was no studio interface standard for pushing around 1080p60 uncompressed until 2002 (SMPTE 372M). It's still a massive amount of data and bandwidth today, never mind with the technology of 15 years ago.
    And 15 years ago computer power was doubling every 12 to 18 months. If they had the ability to handle 720p60 or 1080i30, all they had to do was wait a year or two and they could handle 1080p60.

    Remember, they were picking the broadcast standard that we will likely be stuck with for the next 40 or 50 years. Not a 1-year consumer electronics cycle.
    Last edited by jagabo; 22nd Dec 2013 at 10:09.
  15. Member usually_quiet
    Originally Posted by jagabo View Post
    And 15 years ago computer power was doubling every 12 to 18 months. If they had the ability to handle 720p60 or 1080i30, all they had to do was wait a year or two and they could handle 1080p60.

    Remember, they were picking the broadcast standard that we will likely be stuck with for the next 40 or 50 years. Not a 1-year consumer electronics cycle.
    The original date to switch to DTV-only was December 31, 2006, but it was delayed a few times before it actually took place on June 12, 2009. So even after choosing 1080i, with its lower technical requirements, it took a while to get broadcasters and the consumer electronics industry ready, never mind the public.

    The Federal government wasn't happy about needing to wait. They wanted the lower half of the VHF TV spectrum freed up so it could be auctioned off to other parties.
    Last edited by usually_quiet; 22nd Dec 2013 at 11:51.
  16. They could have simply defined the standard as 720p and 1080p at various frame rates. Then let the hardware catch up to 1080p60 with time.
  17. Member usually_quiet
    Originally Posted by jagabo View Post
    They could have simply defined the standard as 720p and 1080p at various frame rates. Then let the hardware catch up to 1080p60 with time.
    There is a reason for 1080i. Wikipedia says this: "The ATSC specification also allows 1080p30 and 1080p24 MPEG-2 sequences, however they are not used in practice, because broadcasters want to be able to switch between 60 Hz interlaced (news), 30 Hz progressive or PsF (soap operas), and 24 Hz progressive (prime-time) content without ending the 1080i60 MPEG-2 sequence."

    Wikipedia's ATSC page also goes on to say "The new standards supports 1080p at 50, 59.94 and 60 frames per second; such frame rates require H.264/AVC High Profile Level 4.2, while standard HDTV frame rates only require Levels 3.2 and 4, and SDTV frame rates require Levels 3 and 3.1."

    So broadcasters have the option to use 1080p60 if they implement subchannels using H.264. I think the primary has to be MPEG-2.
    Last edited by usually_quiet; 22nd Dec 2013 at 12:13.
  18. Originally Posted by usually_quiet View Post
    Originally Posted by jagabo View Post
    They could have simply defined the standard as 720p and 1080p at various frame rates. Then let the hardware catch up to 1080p60 with time.
    There is a reason for 1080i. Wikipedia says this: "The ATSC specification also allows 1080p30 and 1080p24 MPEG-2 sequences, however they are not used in practice, because broadcasters want to be able to switch between 60 Hz interlaced (news), 30 Hz progressive or PsF (soap operas), and 24 Hz progressive (prime-time) content without ending the 1080i60 MPEG-2 sequence."
    And they could have done the same with 1080p60 (like they have done with 720p60). Stations and consumers that couldn't afford 1080p60 could have used 720p60 until costs came down.
  19. Member vhelp
    Because many TV series do not finish in exactly 30/60/90/etc. minutes, broadcasters employ time expansion or compression of the video's telecine cadence pattern at the field level. With interlaced video it's all "i"s, so it's not necessary, but with film (PPPii) it is. Many TV series and movies are broadcast this way, which is why they are not always properly restorable back to 24p.
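    For reference, the standard 3:2 pulldown cadence being alluded to maps 4 film frames onto 10 fields; a minimal sketch (hypothetical Python):

    ```python
    # 3:2 pulldown: 24 film frames/s become 60 fields/s by alternately holding
    # each film frame for 3 fields, then 2.

    def pulldown_32(frames):
        fields = []
        per_frame = [3, 2]  # fields emitted per film frame, alternating
        for i, frame in enumerate(frames):
            fields.extend([frame] * per_frame[i % 2])
        return fields

    film = ["A", "B", "C", "D"]  # 4 film frames
    print(pulldown_32(film))     # ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
    ```

    Broadcast time compression perturbs this regular cadence, which is why automated inverse telecine can fail to recover the original 24p.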
  20. How do you think broadcasters that currently use 720p60 do their time compression? With 720p60 and 1080p60 you can throw out p's the same way you throw out fields with 1080i. There's no difference in that.
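    The point generalizes: periodic dropping works on any unit, whether fields or progressive frames. A sketch (hypothetical Python):

    ```python
    # Time compression by dropping every nth unit, where a "unit" can be a
    # 1080i field or a 720p/1080p frame; the mechanism is identical either way.

    def drop_every_nth(units, n):
        return [u for i, u in enumerate(units, start=1) if i % n != 0]

    units = list(range(60))           # one second of 60p frames (or 1080i fields)
    sped = drop_every_nth(units, 30)  # drop 2 of every 60: ~3.3% shorter runtime
    print(len(sped))                  # 58
    ```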
  21. Member vhelp
    that is true, didn't think about it that way.


