VideoHelp Forum
Results 1 to 29 of 29
  1. Member
    Join Date
    Feb 2006
    Location
    United States
    Search Comp PM
    I feel stupid asking this. But everything I read about HD is about 720p or 1080p or 1080i - and since I won't be getting my TV until around November this year, I assume it will be more 1080p than 720p. But my question is this... There are a lot of LCDs out there that say they are 768p, but when I read about what is true HD everything says 720p or 1080p. So then what is 768p? Is it basically the same as 720p?

    If an HD signal is sent in either 720p or 1080i, for it to show on a 768p set does it have to go through some additional processing? If so, are you better off with a 720 rather than a 768?
    Quote Quote  
  2. Member edDV's Avatar
    Join Date
    Mar 2004
    Location
    Northern California, USA
    Search Comp PM
    Originally Posted by winterwolfes
    I feel stupid asking this. But everything I read about HD is about 720p or 1080p or 1080i - and since I won't be getting my TV until around November this year, I assume it will be more 1080p than 720p. But my question is this... There are a lot of LCDs out there that say they are 768p, but when I read about what is true HD everything says 720p or 1080p. So then what is 768p? Is it basically the same as 720p?

    If an HD signal is sent in either 720p or 1080i, for it to show on a 768p set does it have to go through some additional processing? If so, are you better off with a 720 rather than a 768?
    Every input (analog, 480i, 480p, 720p, 1080i, VGA) gets processed to the native display resolution, whether that is 1366x768p, 1920x1080p or something else.

    "Processing" envolves a lot of activity depending on the source and user settings. For example:

    - tuning an SD or HD channel from over the air or cable QAM
    - comb filtering, decoding and level/noise/aspect correction for analog input
    - deinterlacing or inverse telecining converted analog or digital input
    - frame repeating 24p to native display refresh
    - aspect ratio and user mode processing
    - finally, scaling the resulting processed signal to the native display resolution (usually with overscan).

    The pixels displayed are very different from the pixels received.

    There is an unconfirmed rumor that some HDTV exists that "pixel for pixel" maps a VGA or DVI/HDMI 1920x1080p/24 input from an HD DVD/BD player to a 1920x1080p screen. I doubt this is true because a 1080p/24 direct display would flicker like crazy. Maybe they mean the 1080p/24 input is received and then frame repeated 3:2 to 59.94 fps, or 3:3 to 72 fps, without overscan processing. This is still a rumor.
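    To put rough numbers on that last scaling step, here's a quick Python sketch. The 5% overscan figure is just an assumption for illustration (real sets vary), but the point stands: none of the broadcast formats land 1:1 on a 1366x768 panel.

    Code:
# Rough illustration of the final scaling step: every input gets resampled to the
# panel's native grid, usually scaled a little larger still so the overscan area
# falls off the edges.  The 5% overscan figure below is only an assumption.

NATIVE_W, NATIVE_H = 1366, 768
OVERSCAN = 0.05  # assumed: ~5% of the picture pushed past the panel edges

sources = {
    "480i/480p (DVD)": (720, 480),
    "720p":            (1280, 720),
    "1080i/1080p":     (1920, 1080),
}

for name, (w, h) in sources.items():
    # scale so the image overfills the panel by the overscan amount
    sx = NATIVE_W / (1.0 - OVERSCAN) / w
    sy = NATIVE_H / (1.0 - OVERSCAN) / h
    print(f"{name:16s} -> x{sx:.3f} horizontal, x{sy:.3f} vertical "
          f"({'up' if sx > 1 else 'down'}scaled, never 1:1)")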
    Quote Quote  
  3. Member Marvingj's Avatar
    Join Date
    Apr 2004
    Location
    Death Valley, Bomb-Bay
    Search Comp PM
    You don't need 1920x1080 pixels to have an HD TV.

    There is a difference between 1080i and 720p. Most CRT HDTVs have 1080i resolution (a true 1920x1080 pixels), while most plasmas or LCDs have 720p (1366x768 or 1280x720).

    If it says HDTV on it, it has to have at least 720p resolution. If it has less than that, it will be either a normal TV or an EDTV (enhanced definition TV).

    Now, there is a lot more to the difference between 720p and 1080i, specifically the p/i.

    Yes, 1080i has more pixels, but it refreshes them at a slower rate, interlaced, which is what the i stands for, while 720p refreshes complete frames twice as often, using progressive scanning instead.

    Now as for which is better, it's all up to preference. Supposedly 720p is better for fast-paced things, while 1080i is better for slower things where you want more detail in each frame, but it's really up for debate, and please don't start asking about it here, it will only lead to a huge debate...
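    For what it's worth, here's the raw arithmetic behind the "more pixels vs. faster refresh" trade-off, as a back-of-the-envelope Python sketch (it ignores compression, Kell factor and everything else that matters in practice):

    Code:
# Back-of-the-envelope pixel arithmetic for 720p60 vs 1080i (30 frames / 60 fields per second).
formats = {
    "720p60": (1280, 720, 60, False),    # width, height, frame rate, interlaced?
    "1080i30": (1920, 1080, 30, True),
}

for name, (w, h, fps, interlaced) in formats.items():
    refreshes = fps * 2 if interlaced else fps        # fields or frames per second
    lines_per_refresh = h // 2 if interlaced else h   # a field carries half the lines
    pixels_per_second = w * lines_per_refresh * refreshes
    print(f"{name}: {w * h:,} pixels per full frame, "
          f"{refreshes} refreshes/s of {lines_per_refresh} lines, "
          f"{pixels_per_second:,} pixels/s")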
    http://www.absolutevisionvideo.com

    BLUE SKY, BLACK DEATH!!
    Quote Quote  
  4. Member edDV's Avatar
    Join Date
    Mar 2004
    Location
    Northern California, USA
    Search Comp PM
    Originally Posted by Marvingj
    You don't need 1920x1080 pixels to have an HD TV.

    There is a difference between 1080i and 720p. Most CRT HDTVs have 1080i resolution (a true 1920x1080 pixels), while most plasmas or LCDs have 720p (1366x768 or 1280x720).

    If it says HDTV on it, it has to have at least 720p resolution. If it has less than that, it will be either a normal TV or an EDTV (enhanced definition TV).
    Most HD CRTs see 1920x1080i internally, but the actual perceived display resolution is less than 1000x1080i (limited by phosphor pitch and shadow mask). The Kell factor means perceived vertical resolution is much lower than that for an interlaced source, so most HD-ready CRTs fall into the 800x600 to 960x768 range for actual resolution. The good news is they display analog and 480i sources much better than most progressive plasmas and LCDs. Also, 800x600 is more than enough for progressive 720x480 DVD. 720p and 1080i will be softer than source, but for display sizes below 34" you won't see a significant difference.
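    As a rough illustration of the Kell-factor point, here's a toy calculation. The ~0.7 Kell factor and the ~0.7 interlace penalty are just the commonly quoted ballpark figures, not measurements of any particular set:

    Code:
# Toy perceived-resolution estimate.  The Kell factor (~0.7) and the extra
# interlace penalty (~0.7) are commonly quoted ballpark numbers, nothing more.
KELL = 0.7
INTERLACE_FACTOR = 0.7

for name, lines, interlaced in [("1080i", 1080, True), ("720p", 720, False)]:
    perceived = lines * KELL * (INTERLACE_FACTOR if interlaced else 1.0)
    print(f"{name}: roughly {perceived:.0f} lines perceived vertically "
          "(before phosphor pitch / shadow mask limits are even considered)")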

    Sony made a few consumer high end CRT's that would reach 1440x1080p but these were very bulky and heavy. You also had to sit very close to see a difference.
    Quote Quote  
  5. Member Marvingj's Avatar
    Join Date
    Apr 2004
    Location
    Death Valley, Bomb-Bay
    Search Comp PM
    HD is also defined as twice the pixel count of SD(640x480). So seeing how 640x480 has only 307,200 pixels...anything with 614,400 pixels or more is technically HD. Now what the various standards are is another story.

    Technically, 1080i/p is HD and 720p is not. However...most people consider 720p to be HD.

    imo, it doesn't matter. People will get the best their money can afford and use the display to its fullest whether that's 720p, 1080i, or 1080p. A few years ago, 480p was called HD. It looked better than 480i...but technically we all know it's wrong to call it HD...same can be said about 720p...but in the end it looks better and that's what matters.
    http://www.absolutevisionvideo.com

    BLUE SKY, BLACK DEATH!!
    Quote Quote  
  6. Member edDV's Avatar
    Join Date
    Mar 2004
    Location
    Northern California, USA
    Search Comp PM
    Originally Posted by Marvingj
    HD is also defined as twice the pixel count of SD(640x480). So seeing how 640x480 has only 307,200 pixels...anything with 614,400 pixels or more is technically HD. Now what the various standards are is another story.

    Technically, 1080i/p is HD and 720p is not. However...most people consider 720p to be HD.
    Where did you get that?

    The industry has been hard pressed to define a lower limit for HD. They would be happy to say anything that resolves better than 720x480/576 DVD is an HDTV so long as it accepts a 720p or 1080i input.

    1280x720p has always been considered an HDTV standard. Some governments in PAL regions have seen fit to call 720x576p HDTV. In NTSC markets, the similar 720x480p is called EDTV.
    Quote Quote  
  7. Originally Posted by edDV
    Where did you get that?

    He made it up.
    Quote Quote  
  8. Member Marvingj's Avatar
    Join Date
    Apr 2004
    Location
    Death Valley, Bomb-Bay
    Search Comp PM
    I watch most standard definition content scaled to 1440x720p72, and there's really no comparison at all to the same content viewed at 480i or 480p. Any experience watching quality SD content (like DVD) on any of the newer 720p or 1080p FP digitals will quickly render moot any argument that upscaling provides no benefit or even degrades the image (at least if you're sitting reasonably close, i.e. not 2xSW away or something like that). I've merely attempted to explain here, theoretically, why upscaling content toward a theoretically infinite resolution for display is generally preferred for continuous-tone images. Of course, it would be much easier to explain this visually by just showing you images in person, and really anyone would just go "oh, that looks so much better!" but unfortunately I cannot teleport people to where I am. I am only capable of teleporting myself around, and that's privileged information.
    http://www.absolutevisionvideo.com

    BLUE SKY, BLACK DEATH!!
    Quote Quote  
  9. Member edDV's Avatar
    Join Date
    Mar 2004
    Location
    Northern California, USA
    Search Comp PM
    OK, was anyone saying different? My point is every input is scaled/processed to 768p even 720p and 1080i.

    Upscale from 480p/576p DVD or PAL broadcast is a more direct process than upscale from analog NTSC or 480i DVD. The latter suffer from telecine unless inverse telecined before upscale.

    Non-film 480i and 576i need a quality deinterlace before upscale. The big screen magnifies the deinterlace errors.
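    For anyone wondering what inverse telecine is actually undoing, here's a toy Python sketch of the 2:3 pulldown cadence and its removal. Real IVTC also has to track field parity and detect the cadence from the video itself, so treat this as the idea only:

    Code:
# Toy 2:3 pulldown and its removal.  Film frames are just letters here; in real
# video the "fields" are the odd/even line sets, and a real IVTC filter also has
# to track field order and detect the cadence.

film = ["A", "B", "C", "D"]                 # 4 film frames = 1/6 second at 24 fps

cadence = [2, 3, 2, 3]                      # each film frame becomes 2 or 3 fields
fields = []
for frame, repeats in zip(film, cadence):
    fields += [frame] * repeats             # 10 fields = 5 interlaced video frames

video_frames = [tuple(fields[i:i + 2]) for i in range(0, len(fields), 2)]
print("60i stream:", video_frames)
# -> [('A', 'A'), ('B', 'B'), ('B', 'C'), ('C', 'D'), ('D', 'D')]
# the ('B', 'C') and ('C', 'D') frames mix two film frames: that's the combing you see

# Inverse telecine: drop the duplicate fields and re-pair the originals
recovered = sorted(set(fields), key=fields.index)
print("Recovered 24p frames:", recovered)   # -> ['A', 'B', 'C', 'D']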
    Quote Quote  
  10. Member Marvingj's Avatar
    Join Date
    Apr 2004
    Location
    Death Valley, Bomb-Bay
    Search Comp PM
    Why are there 18 standards for HDTV?

    First, to be technical, there are eighteen different "ATSC DTV (digital television) formats" of which only six are actually High-Definition. Of the six HDTV formats only two are used frequently, 720p and 1080i. That doesn't really answer the question, but it narrows the field down a bit and gives us something to work with.

    The answer comes down to what type of content the broadcaster is looking to optimize. We all know that 1080i has the higher resolution, so why bother offering another format like 720p? While it's true that 1080i has a greater number of pixels (1920 x 1080 vs. 1280 x 720), 720p has two things working to its advantage. First, 720p is a progressive signal. Second, 720p is 60 fps (frames per second). 1080i, on the other hand, is interlaced and 30 fps (60 fields per second).


    Where does this matter?

    It matters for fast movement (e.g. sports). Let's look at an example using both 720p and 1080i:

    Suppose that we have a tennis ball moving across the screen for 1 second. A broadcast in 720p is going to show 60 complete images of the tennis ball. Think of it like an old-fashioned flipbook that has 60 pages. Each page will have a complete image and when you quickly flip through the entire book it will give you movement. This is much like how traditional film works (albeit with 24 fps).

    If we were to do the same experiment with 1080i, it would be quite different. Unlike progressive formats, which show the whole picture, interlaced material relies on the fact that two half-pictures will generally combine to make one whole picture. As such, 1080i will display the even lines for 1/60th of a second followed by the odd lines for 1/60th of a second. If we return to the flipbook example, we can see that the book will still have 60 pages but each page will look a little like we're looking through mini-blinds. Of course, when it's sped up it doesn't look like this. A combination of the display (afterglow) and the mind combine to complete the picture.

    "But can't you de-interlace a 1080i signal and have the best of both worlds?" (De-interlacing is the process of converting an interlaced signal into a progressive signal by combining the even lines and the odd lines to form one solid picture.) Yes, but even in the best case you are only getting 30 fps (half the frames of 720p). In the worst case, the even lines and the odd lines don't quite match up. For instance, assume that the camera is capturing half the picture every 1/60th of a second. In that case, it's possible that the ball has moved enough in that short amount of time that the odd lines don't align with the even lines.
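    Here's a tiny toy sketch of that last point, the two fields not lining up when something moves (the "ball" is just an O on a grid):

    Code:
# Toy illustration of the "mini-blinds" problem: the two fields of an interlaced
# frame are captured 1/60 s apart, so a moving object sits in a different place
# in each field.  Weaving them back together shows the mismatch.
WIDTH, HEIGHT = 24, 8

def picture(ball_x):
    """One full progressive picture with a 'ball' (an O) at column ball_x."""
    return [["O" if x == ball_x else "." for x in range(WIDTH)] for _ in range(HEIGHT)]

at_t0 = picture(5)     # where the ball was when the even lines were captured
at_t1 = picture(11)    # 1/60 s later, when the odd lines were captured

woven = [at_t0[y] if y % 2 == 0 else at_t1[y] for y in range(HEIGHT)]
for row in woven:
    print("".join(row))   # the O alternates between two columns: combing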
    http://www.absolutevisionvideo.com

    BLUE SKY, BLACK DEATH!!
    Quote Quote  
  11. Originally Posted by Marvingj
    Why are there 18 standards for HDTV?
    Because 17 of them should have never been created.

    That being said, you must also consider that if you have a 24p or 30p source, regardless of whether the display is 1080i or 1080p, it will always have more resolution than 720p.

    1 progressive frame will have the exact same resolution when refreshed on either 1080i or 1080p.

    The difference only applies to 100% interlaced-recorded sources.
    Quote Quote  
  12. Member edDV's Avatar
    Join Date
    Mar 2004
    Location
    Northern California, USA
    Search Comp PM
    Originally Posted by Guiboche
    Originally Posted by Marvingj
    Why are there 18 standards for HDTV?
    Because 17 of them should have never been created.

    That being said, you must also consider that if you have a 24p or 30p source, regardless of whether the display is 1080i or 1080p, it will always have more resolution than 720p.
    Where do you get these 24p or 30p sources?
    Are you talking about inverse telecined 1080i?

    As for the other ATSC standards, each has potential for development.
    http://www.hdtvprimer.com/ISSUES/what_is_ATSC.html


    Somebody might want to buy up UHF stations and start broadcasting 1080p/24. This is possible, but it consumes all of a single channel's ATSC bandwidth. Major commercial stations will continue to do multicast.

    When analog is shut off, I'd like to see 480p gradually replace 480i. All ATSC tuners are required to receive 480p/24, 480p/30 and 480p/60. Cheap SD Digital TV sets can receive 480p yet still display interlace. One ATSC station could transmit four DVD quality 16:9 720x480p/24 movies at the same time.

    Another unused format is 720p/24. You could transmit two 1280x720p/24 movies plus one 480i/p from the same station.

    If and when H.264/VC-1 gets added to the standard, the number of subchannels increases 2x-3x.
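    Rough arithmetic on the subchannel claims above: the 19.39 Mbit/s figure is the ATSC 8-VSB payload, but the per-program bitrates in this sketch are just assumptions I picked for illustration:

    Code:
# Rough subchannel arithmetic.  19.39 Mbit/s is the ATSC 8-VSB payload; the
# per-program bitrates below are assumptions picked only for illustration.
MUX_MBPS = 19.39

programs = {
    "480p/24 movie, DVD-like MPEG-2": 4.5,    # assumed
    "720p/24 movie, MPEG-2":          8.0,    # assumed
    "1080i sports, MPEG-2":          17.0,    # assumed, roughly a full channel
}

for name, mbps in programs.items():
    fit = int(MUX_MBPS // mbps)
    print(f"{name}: about {fit} program(s) per 6 MHz channel at {mbps} Mbit/s each")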
    Quote Quote  
  13. Member turk690's Avatar
    Join Date
    Jul 2003
    Location
    ON, Canada
    Search Comp PM
    ...and so on. To winterwolfes: it's true that a lot of LCD TVs for now have a native resolution of 1366x768. The 768 qualifies it as an HD display, but since that number is not exactly any of the known resolutions you are likely to feed it (1080, 720, 576, & 480), ALL of these signals will necessarily have to be scaled to 768. IMO, the circuitry in the TV performing this scaling plays the most important role in determining, and ultimately showing, the quality of the picture. This is why some scaler designers are much sought after, with their name on the product promising high-quality pictures (with equally high prices), like Faroudja. But some LCD TV manufacturers have their own proprietary designs they claim are as good as if not better, and so ultimately, you will have to feed your choice of LCD TV (or any fixed-pixel display) with known 1080i and 720p (in addition to 480i/p & 576i/p) signals to see how they stand up to your visual test. According to some tests in Consumer Reports & elsewhere, some such TVs can, for example, display 1080 better than 720; others the other way around; still others display standard definition better than HD.
    To edDV: it's true that 24 fps flickers like crazy, but film projectors open the shutter twice per frame, so the perceived rate is 48 fps. It doesn't matter that the same information is shown twice in a row; our brain treats them as two distinct pictures one after the other, which greatly reduces the flicker. Maybe they'll do a similar thing with 1080p/24?? HD (at least in the US) is exactly 30fps and NOT 29.97 anymore. 29.97 fps was born merely out of the necessity of keeping all relevant frequencies tied to the 3.579545 MHz color subcarrier, which ATSC is not beholden to anymore.
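    For reference, the 29.97 really does fall straight out of that subcarrier relationship; a quick check:

    Code:
# Where 29.97 comes from: the NTSC color subcarrier is exactly 315/88 MHz,
# the line rate is subcarrier * 2/455, and a frame is 525 lines.
subcarrier_hz = 315e6 / 88                 # 3,579,545.45... Hz
line_rate_hz = subcarrier_hz * 2 / 455     # 15,734.26... Hz
frame_rate_hz = line_rate_hz / 525         # 29.970... Hz

print(f"subcarrier : {subcarrier_hz:,.2f} Hz")
print(f"line rate  : {line_rate_hz:,.3f} Hz")
print(f"frame rate : {frame_rate_hz:.5f} Hz (= 30/1.001 = {30 / 1.001:.5f})")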
    For the nth time, with the possible exception of certain Intel processors, I don't have/ever owned anything whose name starts with "i".
    Quote Quote  
  14. Member edDV's Avatar
    Join Date
    Mar 2004
    Location
    Northern California, USA
    Search Comp PM
    Originally Posted by turk690

    to edDV, it's true that 24fps flickers like crazy, but film projectors open the shutters twice for a frame, so the perceived rate is then 48fps. It doesn't matter that there is the same information twice in a row, but our brain thinks we see two distinct pictures one after the other and helps reduce the flicker greatly. Maybe they'll do a similar thing with 1080p/24??
    HDTV does it the same way as progressive DVD. In the case of ATSC (the American standard) the 24 fps (23.976 actually) film frames are repeated 3 times, then 2 (a 3:2 cadence). This results in a 59.94 frame per second display refresh.

    PAL areas run the "film" frames fast at 25 fps and frame repeat 2:2 for a 50 frame per second display refresh. Special "100fps" displays double that repeat rate for a 100 frame per second display refresh.


    Originally Posted by turk690
    HD (at least in the US) is exactly 30fps and NOT 29.97 anymore. 29.97fps was borne merely out of necessity in keeping all relevant frequencies a multiple of the color subcarrier 3.579545MHz, which ATSC is not beholden to anymore.
    ATSC can be run at 30fps but most broadcasting still uses 29.97 frames (59.94 fields) per second to maintain backwards compatibility for programming and to allow analog NTSC distribution through cable or dbs systems. 720p is broadcast at 59.94 frames per second.
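    The cadence is easy to see in a few lines of Python. This is just the frame-repeat pattern; it ignores the 1000/1001 rate adjustment that turns 60 into 59.94:

    Code:
# 24p to a 60 Hz (nominally 59.94) display by 3:2 frame repetition, and 25 fps
# "sped-up" film to a 50 Hz display by 2:2 repetition.

def repeat_frames(frames, cadence):
    out = []
    for i, frame in enumerate(frames):
        out += [frame] * cadence[i % len(cadence)]
    return out

film_24 = [f"F{i}" for i in range(24)]     # one second of film at 24 fps
film_25 = [f"F{i}" for i in range(25)]     # PAL runs the film 4% fast: 25 frames/s

ntsc_display = repeat_frames(film_24, [3, 2])   # 3,2,3,2,...
pal_display = repeat_frames(film_25, [2])       # 2,2,2,...

print(len(ntsc_display), "display refreshes from 24 film frames (3:2)")   # 60
print(len(pal_display), "display refreshes from 25 film frames (2:2)")    # 50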
    Quote Quote  
  15. Originally Posted by edDV
    Where do you get these 24p or 30p sources?
    Are you talking about inverse telecined 1080i?
    Oh, no. I was just referring to progressive images being played back on 1080i or p.

    People will argue that progressive content has higher resolution on 1080p, but both have the same resolution. The only difference is how the image is drawn to the screen.
    Quote Quote  
  16. Member edDV's Avatar
    Join Date
    Mar 2004
    Location
    Northern California, USA
    Search Comp PM
    Originally Posted by Guiboche
    Originally Posted by edDV
    Where do you get these 24p or 30p sources?
    Are you talking about inverse telecined 1080i?
    Oh, no. I was just referring to progressive images being played back on 1080i or p.

    People will argue that progressive content has higher resolution on 1080p, but both have the same resolution. The only difference is how the image is drawn to the screen.
    For a still image this would be true. In motion, perceived vertical resolution drops for an interlace displayed image. Deinterlacing results in both resolution loss and artifacts.

    Telecine mixed frames for 1080i/29.97 further reduce perceived resolution for film sources in scenes with motion but the human eye tends to discount detail when the picture is in motion.

    All this difference disappears for a film source 1080i/29.97 movie that is properly inverse telecined to 1080p/23.976, frame repeated to 59.94 and displayed as 1080p/59.94. In theory, it should look identical* to a 1080p/23.976 source that is frame repeated to 1080p/59.94 for display.


    * I'm ignoring bitrate here. 1080i/29.97 has a 20% bit rate penalty to deliver the same quality as 1080p/23.976.
    Quote Quote  
  17. Member Marvingj's Avatar
    Join Date
    Apr 2004
    Location
    Death Valley, Bomb-Bay
    Search Comp PM
    1080i, the former king of the HDTV hill, actually boasts an identical 1,920x1,080 resolution but conveys the images in an interlaced format (the i in 1080i). In a tube-based television, otherwise known as a CRT, 1080i sources get "painted" on the screen sequentially: the odd-numbered lines of resolution appear on your screen first, followed by the even-numbered lines--all within 1/30 of a second. Progressive-scan formats such as 480p, 720p, and 1080p convey all of the lines of resolution sequentially in a single pass, which makes for a smoother, cleaner image, especially with sports and other motion-intensive content. As opposed to tubes, microdisplays (DLP, LCoS, and LCD rear-projection) and other fixed-pixel TVs, including plasma and LCD flat-panel, are inherently progressive in nature, so when the incoming source is interlaced, as 1080i is, they convert it to progressive scan for display.
    http://www.absolutevisionvideo.com

    BLUE SKY, BLACK DEATH!!
    Quote Quote  
  18. Originally Posted by Marvingj
    HD is also defined as twice the pixel count of SD(640x480). So seeing how 640x480 has only 307,200 pixels...anything with 614,400 pixels or more is technically HD. Now what the various standards are is another story.

    Technically, 1080i/p is HD and 720p is not. However...most people consider 720p to be HD.

    imo, it doesn't matter. People will get the best their money can afford and use the display to its fullest whether that's 720p, 1080i, or 1080p. A few years ago, 480p was called HD. It looked better than 480i...but technically we all know it's wrong to call it HD...same can be said about 720p...but in the end it looks better and that's what matters.
    Your argument is invalid.

    720p is considered HD. (1280x720 = 921600 pixels)
    1080i/p is known as FULL HD. (1920x1080 = 2073600)


    I will just add... you can repeat the frames all you like, but it will never get rid of the stuttering motion. The original 24fps cannot capture fluid motion. Even film transferred to Blu-ray and HD DVD still doesn't look fluid. I still see stuttering in fast action scenes and when panning. The only way around this is to film at higher frame rates. Of course, it looks like we will never see higher frame rates in the theaters. The industry just doesn't want to spend the money to replace projectors in theaters.

    720p60 gives you full 60fps, as long as the source was filmed at that speed. True 60 individual frames per second. No frame repeating.
    1080p in its current form doesn't support 60fps. The bandwidth requirements are much greater. But I'm hoping that in the future they will be able to tack this option onto newer TV sets and retain backwards compatibility with older sets.
    Quote Quote  
  19. Member edDV's Avatar
    Join Date
    Mar 2004
    Location
    Northern California, USA
    Search Comp PM
    Yes, there are multiple ways to store and transmit the video (e.g. 1080i/29.97, 1080i/25, 1080p/59.94, 1080p/50, 1080p/29.97, 1080p/25, 1080p/23.976) and multiple ways to display it (e.g. interlace, fixed-pixel progressive, etc.)

    To complicate further, film source is handled differently than live source.
    Quote Quote  
  20. Member Marvingj's Avatar
    Join Date
    Apr 2004
    Location
    Death Valley, Bomb-Bay
    Search Comp PM
    To be considered HD, it must be able to, in theory, fully resolve at least 720p (which is easier than 1080i).

    In digital displays, this means it must have at least 1280x720 pixels, and must be able to understand a 720p or 1080i feed (either).

    This being said, some manufacturers "cheat" by assuming customers don't know that 720p also requires 1280 pixels in width. Thus, when you hear of a 42-43" plasma being HD, they're only talking about the 720 portion. The Panasonic, NEC, and Pioneer panels are all 1024x768 with non-square pixels. They can fit the 720 (since it's 768 pixels tall), but the 1280 requirement is completely ignored.

    Sony, Hitachi, and other ALiS-based panels are even worse: they're an INTERLACED 1024x1024, which means that the progressive equivalent is basically 1024x512. They still claim to be HD.

    Where it gets really muddy is CRT devices. Often, especially with tube TVs, they simply cannot resolve the full 720p signal. The tubes simply cannot do it, the dot pitch is insufficient, or the video amplifier does not have the bandwidth (it takes a bare minimum of about 37.5 MHz). Even fully calibrated, the vast majority of tube TVs cannot do 720p, and 7" gun-based RPTVs struggle to. It's a dirty little secret in the industry.
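    That ~37.5 MHz number isn't arbitrary, by the way. It drops out of the standard 720p/60 timing if you treat the analog bandwidth as roughly half the pixel rate (a common rule of thumb, so take it as an estimate):

    Code:
# Where the ~37.5 MHz figure comes from: 720p/60 timing is 1650 total samples
# per line and 750 total lines, and resolving adjacent pixels needs roughly one
# cycle of analog bandwidth per pair of pixels (an estimate, not a spec).
pixel_clock_hz = 1650 * 750 * 60           # 74.25 MHz
analog_bandwidth_hz = pixel_clock_hz / 2   # ~37.1 MHz

print(f"pixel clock     : {pixel_clock_hz / 1e6:.2f} MHz")
print(f"video bandwidth : about {analog_bandwidth_hz / 1e6:.1f} MHz to resolve 1280 pixels per line")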
    http://www.absolutevisionvideo.com

    BLUE SKY, BLACK DEATH!!
    Quote Quote  
  21. Member Marvingj's Avatar
    Join Date
    Apr 2004
    Location
    Death Valley, Bomb-Bay
    Search Comp PM
    Yeah, things are kinda murky that way.

    For instance, let's take a look at one of the best direct view sets currently available for a reasonable price, the Princeton Digital Arcadia AR3.2FTX.
    It's a 34" widescreen model, with 32" viewable.

    It has 65 MHz of video bandwidth, easily enough for 720p. So we're set there.

    CRT quality we cannot measure with specs, so we'll skip over that for a second.

    It has a .90mm diagonal dot pitch. If we assume the horizontal to vertical ratio is the same for those as their Ultra series of CRT's (ie. .27mm diagonal to .23mm horizontal and .1414mm vertical), we get:

    .90mm diagonal
    .7667mm horizontal (.03")
    .4713mm vertical (.018555")

    With a 32" 16:9 screen, we have a 27.9" screen width and 15.69375" screen height.

    Calculating this out, we have an absolute maximum theoretical resolution, screen pitch limited, of:

    27.9" / .03" = 930 pixels wide
    15.69375" / .018555" = 846 pixels tall

    930 x 846 max resolution
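    Here is the same dot-pitch arithmetic as a quick script, for anyone who wants to plug in their own set's numbers (it assumes the pitch figures above and a 16:9 viewable area):

    Code:
# The dot-pitch arithmetic above as a script (32" viewable 16:9, pitch figures
# taken from the post; swap in your own set's numbers).
import math

diagonal_in = 32.0
aspect = 16 / 9
h_pitch_in = 0.030       # from the .7667 mm horizontal pitch
v_pitch_in = 0.018555    # from the .4713 mm vertical pitch

width_in = diagonal_in * aspect / math.hypot(aspect, 1)   # ~27.9"
height_in = diagonal_in / math.hypot(aspect, 1)           # ~15.7"

print(f'screen: {width_in:.1f}" x {height_in:.1f}"')
print(f"pitch-limited maximum: {width_in / h_pitch_in:.0f} x {height_in / v_pitch_in:.0f} dots")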

    Remember, this is in theory. In practice, the center of the screen tends to have a better dot pitch and the edges get a lot worse. Also in practice, the tubes often are the limiting factor.
    All this aside, resolution is not the be-all and end-all of video quality. Things like color, shadow detail, black level, grayscale accuracy, and others tend to make a greater impact. It's sort of like how a 3 megapixel digicam with a great lens seems to take better pictures than a 5 megapixel one with a poor lens.

    The Matterhorn DMD is a very good piece, and the InFocus ScreenPlay 5700 is a great projector. You'll get an excellent picture out of it, and at typical viewing distances, I'm willing to bet most people won't see the difference between that and the 7200 (with 1280x720 native resolution).

    One of my displays is a 42" ED plasma. 852x480 native res. I'm using Voom HDTV service, so DiscoveryHD on that is even less compressed than through Dish, DirecTV, or HD cable. At 9' or so away and further, I cannot tell the difference between a 1024x768 plasma and this one with HD feed.
    http://www.absolutevisionvideo.com

    BLUE SKY, BLACK DEATH!!
    Quote Quote  
  22. Member edDV's Avatar
    Join Date
    Mar 2004
    Location
    Northern California, USA
    Search Comp PM
    Originally Posted by Wile_E

    I will just add... you can repeat the frames all you like, but it will never get rid of the stuttering motion. The original 24fps cannot capture fluid motion. Even film transferred to Blu-ray and HD DVD still doesn't look fluid. I still see stuttering in fast action scenes and when panning. The only way around this is to film at higher frame rates. Of course, it looks like we will never see higher frame rates in the theaters. The industry just doesn't want to spend the money to replace projectors in theaters.
    We are stuck with the legacy of 24 fps film and the film industry has no intention of changing. They have mastered the techniques needed to make 24 fps work for production and have countless film schools and craft unions committed to staying at 24 fps. The "artists" praise the "film look" abstraction of stuttered 24Hz motion and deride the crass realism of 50/60 Hz video.

    24fps post production and distribution issues are now solved as well. A single 24fps edit can be automatically converted to all the current and legacy distribution formats.

    The other thing we are stuck with is a video world split between 59.94 and 50 fps transmission rates. At least the film and video worlds have standardized on the 1280x720, 1920x1080, 2K and 4K frame sizes, if not aspect ratios and frame rates.
    Quote Quote  
  23. Member edDV's Avatar
    Join Date
    Mar 2004
    Location
    Northern California, USA
    Search Comp PM
    Originally Posted by Marvingj
    To be considered HD, it must be able to, in theory, fully resolve at least 720p (which is easier than 1080i).

    In digital displays, this means it must have at least 1280x720 pixels, and must be able to understand a 720p or 1080i feed (either).
    The HDTV specs all concern transmission and storage. I don't think there is a spec on displays. The industry can do what they want. It is the public that is confusing HD transmission standards with display quality when the two are unrelated. For instance a basic ATSC digital TV set must receive 720p/1080i but it displays letterbox from a 640x480 frame buffer. http://www.bestbuy.com/site/olspage.jsp?skuId=7601838&type=product&productCategoryId=p...=1130982274848
    http://www.bestbuy.com/site/olspage.jsp?skuId=7631271&type=product&productCategoryId=p...=1130986501521

    The TV set industry has done nothing to correct the misconceptions of the public and has carefully avoided publishing resolution specs for CRTs and projectors. It does publish specs for fixed-pixel displays but is vague about processing.
    Quote Quote  
  24. Member Marvingj's Avatar
    Join Date
    Apr 2004
    Location
    Death Valley, Bomb-Bay
    Search Comp PM
    I agree...The question for display quality then becomes: “How good is good enough?” Broadcast professionals are expected to be the experts on this question. And the image quality viewers expect to see on their own TV sets will affect the equipment, programming and even creative decisions made in stations and production houses.

    As price and technical capability merge, what should consumers expect from the new display technology, and how should performance be evaluated?
    http://www.absolutevisionvideo.com

    BLUE SKY, BLACK DEATH!!
    Quote Quote  
  25. Member edDV's Avatar
    Join Date
    Mar 2004
    Location
    Northern California, USA
    Search Comp PM
    Originally Posted by Marvingj
    I agree...The question for display quality then becomes: “How good is good enough?” Broadcast professionals are expected to be the experts on this question. And the image quality viewers expect to see on their own TV sets will affect the equipment, programming and even creative decisions made in stations and production houses.

    As price and technical capability merge, what should consumers expect from the new display technology, and how should performance be evaluated?
    Curiously, most broadcast production pros monitor on CRTs or lower-res "broadcast quality" LCDs, because they don't worry about resolution; they worry about lighting, color balance, levels and compression/processing artifacts. When they are done they may look at it on a big screen. Resolution has to be designed into the process itself.
    Quote Quote  
  26. Member Marvingj's Avatar
    Join Date
    Apr 2004
    Location
    Death Valley, Bomb-Bay
    Search Comp PM
    So True.....We are crossing a dividing line in the technical business of television and post production. The line is a sharp separation between what we've known as television and what will become of it. Producers ask about the future value of production material as they realize the images they create are mortal in the face of technological change. The question of how to prepare production material for future generations is an excellent one which strikes at the very heart of what the Digital Television future is all about.
    http://www.absolutevisionvideo.com

    BLUE SKY, BLACK DEATH!!
    Quote Quote  
  27. Member edDV's Avatar
    Join Date
    Mar 2004
    Location
    Northern California, USA
    Search Comp PM
    Originally Posted by Marvingj
    So True.....We are crossing a dividing line in the technical business of television and post production. The line is a sharp separation between what we've known as television and what will become of it. Producers ask about the future value of production material as they realize the images they create are mortal in the face of technological change. The question of how to prepare production material for future generations is an excellent one which strikes at the very heart of what the Digital Television future is all about.
    This is why those expecting to market into the future have almost always shot on 24 fps film. Video standards have been too transitory. This is beginning to change with HDTV production. Many TV series are now shot and mastered in HDTV but usually at 24p if world distribution is the goal.

    The "industry" right or wrong refused to budge on 24p camera masters. HDTV production cameras have been around since the 80's. It has been a long sell to the film community.

    Bottom line: They will only accept an HDTV camera that can match or better a 24fps film camera at it's own game. It needs to show cost, workflow or performance advantage.

    Lucas used HDTV for Star Wars II and III mainly for workflow advantages. Michael Mann used HDTV acquisition for "Collateral" because it worked for night shooting in LA under street lights. Robert Rodriguez used it in Sin City for workflow and cost.

    HDTV cameras are becoming an alternative to film. Someday they will dominate.

    A partial list of HDTV "films"

    Title | Director | Director of Photography | Recording Format(s)
    "Miami Vice" | Michael Mann | Dion Beebe ASC, ACS | 35mm, s.two, HDCAM SR (4:2:2)
    "Superman Returns" | Bryan Singer | Newton Thomas Sigel, ASC | HDCAM SR (4:4:4-RGB)
    "Click" | Frank Coraci | Dean Semler ASC, ACS | HDCAM SR (4:4:4-RGB)
    "Apocalypto" | Mel Gibson | Sean Semler, ACS, ASC | HDCAM SR (4:4:4-RGB)
    "Flyboys" | Tony Bill | Henry Braham, BSC | HDCAM SR (4:4:4-RGB)
    "Prairie Home Companion" | Robert Altman | Ed Lachman, ASC | HDCAM SR (4:2:2)
    "Ultraviolet" | Kurt Wimmer | Arthur & Jimmy Wong | HDCAM SR
    "The Cave" | Bruce Hunt | Ross Emery, ASC; Wes Skiles (underwater) | 35mm, HDCAM SR
    "Domino" | Tony Scott | Daniel Mindel | 35mm, HDCAM SR
    "Sin City" | Robert Rodriguez | Robert Rodriguez | HDCAM SR (4:4:4-RGB)
    "Revenge of the Sith" | George Lucas | David Tattersall, BSC | HDCAM SR (4:4:4-RGB)
    "The Adventures of Shark Boy & Lava Girl in 3-D" | Robert Rodriguez | Robert Rodriguez | HDCAM SR (4:4:4-RGB)
    "Bubble" | Steven Soderbergh | Steven Soderbergh | HDCAM SR
    "Mission Impossible 3" | J.J. Abrams | Dan Mindel | 35mm, HDCAM SR
    "Chronicles" | David Fincher | Harris Savides, ASC | s.two (4:4:4-RGB)
    "Silence Becomes You" | Stephanie Sinclaire | Arturo Smith | s.two (4:4:4-RGB)
    "La Maison Du Bonheur" | Dany Boon | Jean-Marie Dreujou, AFC | HDCAM SR (4:4:4-RGB)
    "Scary Movie 4" | David Zucker | Thomas E. Ackerman, ASC | HDCAM SR (4:4:4-RGB)
    Quote Quote  
  28. Member
    Join Date
    Jan 2008
    Location
    United States
    Search Comp PM
    Originally Posted by Marvingj
    So True.....We are crossing a dividing line in the technical business of television and post production. The line is a sharp separation between what we've known as television and what will become of it. Producers ask about the future value of production material as they realize the images they create are mortal in the face of technological change. The question of how to prepare production material for future generations is an excellent one which strikes at the very heart of what the Digital Television future is all about.
    Hey! I wrote that and it's really old! Make up your own text!
    http://www.henninger.com/library/hdtvfilm/

    ...and while you're at it, read the updated version, also really old:
    http://www.henninger.com/library/hdtvfilm24/
    Quote Quote  
  29. Always Watching guns1inger's Avatar
    Join Date
    Apr 2004
    Location
    Miskatonic U
    Search Comp PM
    Regardless of the quality of the source, if the screen isn't up to the task, who cares? There are a lot of inferior 1080p screens around, and a lot of very good 720p screens going for around the same price (or more). A Pioneer Kuro running at 1366 x 768 is visually as rich as 90% of 1080p screens on the market at the present time, simply due to the superior technology and processing it uses. Unless you sit 2 inches from the screen with a magnifying glass, most won't see the difference.
    Read my blog here.
    Quote Quote  


