Because bit rate matters more to picture quality than resolution does. 1920x1080 needs 33% more bit rate than 1440x1080 for equal quality; likewise, 1280x720 needs 33% more than 960x720.
Originally Posted by tmw
Horizontal scaling can be done with less quality loss than vertical scaling because of interlace issues.
There are no commonly used broadcast formats above 1440x1080i, so anything you see at 1920x1080 has been scaled horizontally (or both horizontally and vertically). Blu-ray digital film transfers are processed at 1920x1080p/24, but older Blu-ray titles were upscaled from 144 Mb/s 1440x1080 HDCAM masters.
The ATSC chose square-pixel 1920x1080i and 1280x720p expecting that future codec improvements would allow better picture quality at lower bit rates. That left room for improvement without needing to replace the end user's display; the tuners/decoders are expected to be upgraded more frequently.
There's also the argument that the optics and sensor are so "soft" anyway that "1920 native" vs "1440 converted back to 1920" look the same - there's no more detail in the 1920 native version.
HDV captures 1440x1080 at 25 Mbps. All things being equal, you'd need 33 Mbps to maintain the same encoding quality at 1920x1080.
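The arithmetic behind those numbers is just the ratio of pixel counts. A minimal sketch, assuming (as the posts above do) that required bit rate scales roughly linearly with pixel count at constant quality, which is an approximation — real codecs don't scale perfectly linearly:

```python
def scaled_bitrate(base_mbps, base_w, base_h, new_w, new_h):
    """Scale a bit rate by the ratio of total pixel counts.

    Rough rule of thumb only: assumes bit rate needed for equal
    quality is proportional to pixels per frame.
    """
    return base_mbps * (new_w * new_h) / (base_w * base_h)

# HDV's 25 Mbps at 1440x1080, scaled up to full-width 1920x1080:
needed = scaled_bitrate(25, 1440, 1080, 1920, 1080)
print(f"{needed:.1f} Mbps")  # 1920/1440 = 4/3, so 25 * 4/3 ≈ 33.3 Mbps
```

The same ratio gives the 33% figure for 1280x720 vs 960x720, since 1280/960 is also 4/3.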
I will never trust any camera based on HD
Originally Posted by 2Bdecided
There is nothing new about this. SD DV and DVD have used 720x480 instead of 852x480 for the same reasons.
To be fair, legacy interlaced analogue TV has a lower effective vertical resolution than the line count would suggest, due to interlacing. The horizontal resolution was chosen to match, giving roughly equal real resolution in both directions even though the pixel numbers are unequal.
720 is the first number that satisfies Nyquist plus "a bit" for all legacy TV systems while giving an integer number of pixels per line for both "PAL" and "NTSC".
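Where the 720 comes from can be sketched numerically. Under ITU-R BT.601, both 525-line ("NTSC") and 625-line ("PAL") systems share a 13.5 MHz luma sampling clock, and the digital active line is taken as 53⅓ µs — slightly wider than the nominal analogue active picture, hence "Nyquist plus a bit":

```python
# Sketch: derive the 720 samples/line common to PAL and NTSC (BT.601).
SAMPLING_MHZ = 13.5            # luma sampling rate shared by both systems
ACTIVE_LINE_US = 53 + 1 / 3    # digital active line duration in microseconds

# samples per line = sampling rate (MHz) * active line time (us)
samples = SAMPLING_MHZ * ACTIVE_LINE_US
print(round(samples))  # 720 for both 525/59.94 and 625/50 systems
```

Because 13.5 MHz is an integer multiple of both systems' line rates, the same 720-sample line works for each, which is exactly the "integer number of pixels per line" property mentioned above.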
The move from 1440 to 1920 is well underway here (has already happened in many instances).