720x480 is not a square-pixel 4:3 ratio. Well, you're right, but true TV resolution is smaller than 640x480, and 640x480 is properly known as VGA. A lot of this also boils down to frame rate:
NTSC - 525 lines (~480 visible), 29.97 fps, interlaced.
PAL - 625 lines (~576 visible), 25 fps, interlaced.
VGA - 640x480, 60 fps, non-interlaced.
Originally Posted by FOO
D1 is 720x486 in reality -- it was with the Perception video cards (and others, when we moved from analog to digital editing) that the 6 lines got dropped to make it 480, as DV and other formats also did. Now 480 and 486 are both used, but true D1 is 486. As I said, it doesn't matter much now. There is also cropped D1 (704 x xxx).
So why did the 6 lines get dropped? It was because of computers and editing on them. Remember, for MANY years all video editing was done on tape, in analog. Then along came the Video Toaster (Amiga) and big workstations costing tonnes of cash, as well as some specialty digital hardware. Later came PCs (mostly Macs at first for video -- Avid).
With this came aspect ratio HELL.
**D1 was one of the first digital standards for NTSC video, and it was defined as 720x486 pixels, with the same frame rate and field-based scheme as analog video. In other words, when digitizing an analog signal into D1 format, the signal is broken into 720 pixels across, 486 pixels north and south. However, this is per FRAME. Since each frame is divided into two fields, one containing the odd lines (1, 3, 5, 7...), the other containing the even lines (2, 4, 6, 8...), each FIELD is really 720x243. As you probably know, your TV set actually displays 60 fields per second, first the odd lines in the first 60th of a second, then the even lines in the next 60th, and so on. This produces the appearance of smooth motion at roughly 30 complete frames per second.
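To make the field split concrete, here's a minimal sketch in Python (the constants match the numbers above; the function name is mine, not any real capture API):

[code]
# Minimal sketch of how an interlaced D1 frame splits into two fields.
FRAME_W, FRAME_H = 720, 486   # one full D1 NTSC frame
FIELD_H = FRAME_H // 2        # each field carries half the lines: 243

def split_fields(frame_lines):
    # Lines are numbered from 1, so frame_lines[0::2] holds the odd
    # lines (1, 3, 5, ...) and frame_lines[1::2] the even lines.
    return frame_lines[0::2], frame_lines[1::2]

frame = ["line %d" % n for n in range(1, FRAME_H + 1)]
odd, even = split_fields(frame)
assert len(odd) == len(even) == FIELD_H   # each field is 720x243
[/code]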
What's interesting about the 720x486 format is that if you do the math, the frame has an aspect ratio of 4:2.7. (To calculate this, divide 720 by 4 and get 180, then divide 180 into 486 to get 2.7.) However, NTSC television has a display aspect ratio of 4:3. That's why if you measure the active screen on your television set, you'll get results like 8"x6" for the 10" diagonal TV set in the kitchen, and 16"x12" for the 20" set in the bedroom. Do the math on both of these sizes, and you get 4:3 (e.g. 16/4=4, 12/4=3).
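The same arithmetic in code, for anyone who wants to check it (plain Python, nothing assumed beyond the numbers above):

[code]
# Reduce a resolution to a "4:x" style ratio, as in the text.
def ratio_against_4(width, height):
    return 4, height / (width / 4)

print(ratio_against_4(720, 486))   # (4, 2.7)  -> the D1 frame is 4:2.7
print(ratio_against_4(16, 12))     # (4, 3.0)  -> the 16"x12" screen is 4:3
[/code]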
So how does a 720x486 format with a frame aspect ratio of 4:2.7 display on a TV set with a 4:3 display aspect ratio? Well, it squeezes each line horizontally by about 11%. That's why NTSC pixels are said to be rectangular. Similarly, if you display a 720x486 D1 image (or even a 720x480 DV image) on an NTSC television set, it will shrink horizontally by about 10%, and look like it's about 640x480 in resolution.
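Put in code, the pixel aspect ratio falls out of dividing the display ratio by the storage ratio; for D1 NTSC it works out to exactly 0.9, which is the "about 10%" squeeze described above:

[code]
# Pixel aspect ratio = display aspect / storage aspect.
display_ar = 4 / 3
storage_ar = 720 / 486
par = display_ar / storage_ar
print(round(par, 3))      # 0.9 -> each pixel shows 90% as wide as it is tall
print(round(720 * par))   # 648 -> why 720 wide "looks like" roughly 640
[/code]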
However, in most instances, computer monitor pixels are of equal height and width, which is why they are called square pixels. Display that same 720x480 image within a computer screen set to 1024x768 resolution, and the 720x480 image looks like …. well, 720 by 480.
OK, back to MPEG, and D1, its 720x486 starting point. Remember that in the early 1990s, when MPEG-1 was formulated, the fastest CD-ROM drives read at about 150 KB/second, so MPEG-1 had to meet these rates. Even today, encoding full screen video to 150 KB/second produces poor quality results, at least for the television audience towards which MPEG-1 was targeted.
For this reason, the MPEG-1 committee decided to use what's called SIF (Standard Interchange Format), which split the horizontal and vertical resolutions in half, and then rounded off to the nearest multiple of 16, which best matched MPEG-1's encoding scheme. The committee also opted for frame-based operation as opposed to fields, which yielded a 352x240, 30 fps standard.
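That derivation can be written out directly; a sketch (the snap-to-16 step reflects MPEG-1's 16x16 macroblocks, and note that 360 sits exactly between 352 and 368 -- SIF took the lower multiple):

[code]
# SIF from D1: halve each dimension, then snap down to a multiple of 16.
def to_sif(width, height):
    snap16 = lambda v: (v // 2) // 16 * 16
    return snap16(width), snap16(height)

print(to_sif(720, 486))   # (352, 240)
[/code]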
As we mentioned initially, MPEG-1 was designed for display on television sets. MPEG-1 files have spaces in the file header to define the frame aspect ratio, directing the CD-I and VideoCD devices (that MPEG-1 targeted) to automatically display in the proper 4:3 frame aspect ratio. The early MPEG-1 hardware decoders that shipped in 1996 and 1997 did the same thing, in essence shrinking the 352x240 image down to 320x240 for display on a computer screen.
However, hardware players never really took off, leaving most computer-based playback to software players, most notably Microsoft's Media Player, which didn't shrink the video. Instead, Media Player displayed all 352x240 pixels, producing an image that looked about 10% too wide, adding about 10-20 pounds (TV makes you look fat).
Why didn't Media Player shrink the video? Originally, it was because the underpowered computers of the day had a hard enough time simply decoding the video, much less scaling it down to 320x240. Remember, this was before all computers had graphics cards that could independently scale the video without involving the CPU.
So Media Player merely decoded and displayed the video in its native resolution. The image was distorted by about 10%, but at least it played.
Some say it's because Microsoft never understood the issue, or was too cheap to fix it -- but a lot had to do with designing for the lowest common system.
Interestingly, even today, though computers and their high-powered graphics subsystems are more than capable of decoding and scaling MPEG-1 files to the proper resolution, Media Player, now up to version 9, still ignores the aspect ratio instructions in the file header and displays the file at its native resolution, as does RealNetworks' RealPlayer. In fact, the only players that implement the aspect ratio instructions are Apple's QuickTime Player 5 and 6, as well as WinDVD and PowerDVD, and now some other media players from other companies.
This finding is more of an observation than a criticism, because the ability to scale MPEG-1 video to the proper aspect ratio quickly became a non-issue. That's because MPEG-1 encoders began including an option for "square pixel encoding," which produced an MPEG-1 file with a resolution of 320x240. In essence, rather than creating a 352x240 file and telling the decoder to shrink it to 320x240 before display on a computer screen, they shrank the source to 320x240 beforehand and encoded at the smaller resolution.
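As a sketch of what "square pixel encoding" amounts to (just the 4:3 arithmetic from earlier, pre-applied to the stored frame):

[code]
# "Square pixel encoding": store a 4:3 image with square pixels,
# so width = height * 4/3 and no display-time squeeze is needed.
def square_pixel_size(height, display_ar=4 / 3):
    return round(height * display_ar), height

print(square_pixel_size(240))   # (320, 240) -- the 352x240 frame, pre-shrunk
[/code]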
Note that this is exactly the same process that developers had been using for years when capturing NTSC video for compression into AVI or QuickTime MOV format. You never, ever saw an AVI or MOV file at 352x240; it was always at 320x240, a 4:3 frame aspect ratio that looks correct on square-pixel computer monitors. MPEG-1 encoder developers were simply adjusting to the reality that most users weren't creating VideoCD titles with their encoded files. Rather, they were creating files for CD-ROM titles, to play back from within PowerPoint, or to post to a web site, in all cases for display on a computer monitor with square pixels.
What are the risks of creating MPEG-1 files in formats other than 352x240? Really, none. MPEG-1 is a flexible specification, so 320x240 files remained technically compliant with the standard and, of course, played on all relevant software players.
There was a small risk of incompatibility with some MPEG-1 decoding chips designed to handle only Constrained Parameter MPEG-1 files, a much less flexible class of MPEG-1 files created to enable cheap, single-purpose MPEG-1 hardware decoders. However, the number of such hardware decoders in the general computer population was so small that this risk was insignificant even back in 1998.
Accordingly, a clear development dichotomy emerged. When developing for VideoCD, encode at 352x240, or better yet, use a VideoCD preset that automatically creates a file with the proper parameters. When encoding for computer display, opt for 320x240, square-pixel output.
Of course, most developers were sufficiently knowledgeable to understand the difference, since few people other than professionals created MPEG-1 files back in the late 1990s. In addition, since most MPEG-1 encoding tools were professionally oriented, the square pixel option became nearly ubiquitous.
Users knowledgeable of the various aspect ratios could get good results, but unwary users could produce video that was technically compliant, but stretched in appearance when played back on computers. To help resolve this, and other issues discussed below, we contacted many developers of encoding tools with suggestions on how to change their presets and other user interface elements.
**NOTE: I reprinted portions of the explanation above (and edited it to reflect some more current facts) from an article written by Jan Ozer: http://www.extremetech.com/article2/0,3973,1150665,00.asp

"Each problem that I solved became a rule which served afterwards to solve other problems." - Rene Descartes (1596-1650)
OK, so why are we here?
I don't have a bad attitude...
Life has a bad attitude!
Originally Posted by leebo
The universe is shaped like a big Cosmic Soccer Ball ..
Why we are here is because someone made an Illegal KICK ..