I have an LCD flat-panel 1080p display and a DirecTV receiver that outputs HD in 1080i only. So when watching ABC, FOX or ESPN I'm getting 30fps.
Are there receivers (cable, satellite, ATSC) that switch their output between 720p and 1080i automatically, depending on the broadcast?
Do HDTV sets with built-in tuners display the proper frame rate?
Hi-
So when watching ABC, FOX or ESPN I'm getting 30fps.
Technically, you're getting 60 fields per second (59.94 to be exact). You're saying the TV set is just deinterlacing rather than bobbing, so that each 30fps frame is shown twice rather than each field becoming its own frame at 60fps? I have a 720p HDTV but no Hi-Def sources, so I'm just curious. I have plenty of interlaced 30fps sources on DVD, but they play so smoothly on the HDTV that I thought they were being bobbed by the Faroudja chipset. Maybe it's dependent on how your TV set handles interlaced sources. I bet edDV would know.
Of course, if your 1080i source is film (as opposed to, say, some ESPN sports broadcast), I would expect it to be returned to 24fps progressive and shown as 3 2 3 2 3 2 (60fps), where the numbers represent how many times each consecutive frame is shown.
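If it helps to see that cadence spelled out, here's a minimal Python sketch of the 3:2 repeat pattern (the function name is just for illustration):

```python
# Minimal sketch of the 3:2 cadence: 24fps film on a ~60Hz progressive
# display. Frames repeat 3, 2, 3, 2... times, so four film frames fill
# ten refreshes (24 fps x 10/4 = 60 fps).

def pulldown_32(film_frames):
    """Expand a list of film frames into the 60Hz display sequence."""
    display = []
    for i, frame in enumerate(film_frames):
        display.extend([frame] * (3 if i % 2 == 0 else 2))
    return display

print(pulldown_32(["A", "B", "C", "D"]))
# -> ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
```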
Originally Posted by Megahurts
1080i is 60 fields per second (59.94 to be exact); two fields make a frame, so 1080i is said to be 29.97 frames per second. This is a bit misleading because motion and screen refresh are at 59.94 Hz.
To get from 1080i to 1080p, your TV has to process the image, relying mainly on what is called a bob. A bob interpolates a full frame from a single field using various estimation techniques. These deinterlacers usually weave (combine both fields) in low-motion picture areas and bob or apply some other interpolation where there is motion. The result is somewhat blurry motion when viewed progressive.
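To make the weave/bob tradeoff concrete, here's a toy Python sketch of a motion-adaptive deinterlacer. The names and the simple per-pixel threshold test are illustrative only; real TV chipsets (Faroudja and the like) use far more elaborate motion estimation:

```python
# Toy motion-adaptive deinterlacer illustrating the weave/bob tradeoff.
# Fields are lists of rows; rows are lists of pixel values.

def deinterlace(top, bottom, prev_bottom, threshold=10):
    """Build a full frame from a top field plus the matching bottom field.

    Missing lines are woven from `bottom` where the picture is static
    (judged by comparing against `prev_bottom`) and bobbed, i.e.
    interpolated from the lines above and below, where motion is seen.
    """
    frame = []
    for y, line in enumerate(top):
        frame.append(line)  # keep the line the field already has
        below = top[y + 1] if y + 1 < len(top) else line
        missing = []
        for x in range(len(line)):
            if abs(bottom[y][x] - prev_bottom[y][x]) > threshold:
                # motion: bob (average neighbours) -- soft but no combing
                missing.append((line[x] + below[x]) // 2)
            else:
                # static: weave the real line -- full vertical detail
                missing.append(bottom[y][x])
        frame.append(missing)
    return frame
```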
720p is broadcast by ABC, FOX and ESPN at 59.94 full frames per second, hence the wonderful motion detail for sports. If you were getting 720p over the air, your TV would have a fairly simple task to get to 1080p: all it needs to do is upscale 1280x720 to 1920x1080 for each frame.
DBS and cable boxes can muck up the advantages of 720p. Many systems don't carry native 720p* but instead convert everything to 1080i. They would upscale to 1080p as described above and then send only one field per frame to make 1080i (half the picture information). Your 1080p HDTV would then have to process the image back to 1080p by guessing what is on the missing field.
If the box is doing as you suggest, it would be tossing every other 720p progressive frame, upscaling to 1080p/30, and then creating two fields from the same frame. That would result in jerky motion.
* cable boxes often allow local station direct 720p over the IEEE-1394 port even if the cable box is converting it to 1080i for the YPbPr and HDMI out.
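Assuming the box works the way described above, the 720p-to-1080i path looks roughly like this in Python (counting lines only; a real scaler uses filtered resampling, not this naive line-averaging, and the function names are made up):

```python
# Sketch of the 720p -> 1080i box conversion described above.

def upscale_720_to_1080(frame):
    """Naive 1.5x vertical upscale: every 2 source lines become 3."""
    out = []
    for y in range(0, len(frame), 2):
        a, b = frame[y], frame[y + 1]
        mid = [(p + q) // 2 for p, q in zip(a, b)]  # interpolated line
        out.extend([a, mid, b])
    return out

def to_1080i(frames_720p):
    """Turn 720p/59.94 frames into 1080i/59.94 fields (alternating parity)."""
    fields = []
    for n, frame in enumerate(frames_720p):
        full = upscale_720_to_1080(frame)  # 720 -> 1080 lines per frame
        fields.append(full[n % 2::2])      # keep one 540-line field
    return fields
```

The field rate stays at 59.94, but each field carries only 540 of the 1080 upscaled lines, so the TV has to estimate the other half back, which is where the motion detail suffers.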
All of the above assumes a live TV camera/VTR source. Film issues differ.
edDV,
When a 1080i source is input to the display, the horizontal frequency reads 33.8 kHz, which corresponds to 60 fields per second. If it were 60 frames per second, the horizontal frequency would be 67.6 kHz. A 720p source reports 44 kHz.
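Those readings line up with the standard raster math. A quick Python check, assuming the usual totals of 1125 and 750 scan lines per frame (blanking included):

```python
# Horizontal (line) frequency = total scan lines per frame x frame rate.

for name, total_lines, frame_rate in [
    ("1080i (29.97 frames/s)", 1125, 30000 / 1001),
    ("1080p (59.94 frames/s)", 1125, 60000 / 1001),
    ("720p  (59.94 frames/s)",  750, 60000 / 1001),
]:
    print(f"{name}: {total_lines * frame_rate / 1000:.1f} kHz")

# 1080i (29.97 frames/s): 33.7 kHz
# 1080p (59.94 frames/s): 67.4 kHz
# 720p  (59.94 frames/s): 45.0 kHz
```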
Since the HD receiver has to upscale and interlace 720p to 1080i, the display then has to deinterlace and scale (on non-1080p displays) to its native resolution. If a receiver output a 720p signal, the display would only have to scale to its native resolution. I do get ABC and FOX over the air, but with this receiver I'm limited to 30fps.
I believe most people who have the option set their receivers to output 1080i, thus not seeing the full benefit of 720p for crisp motion.
Am I missing something besides frames?
Originally Posted by Megahurts
All I know is I can get true 720p over the air, or local stations over IEEE-1394 with Comcast. That means Comcast is passing local ABC and FOX to the home as 720p/59.94. I have no way to verify whether the Motorola 6200 or 6412 passes true 720p/59.94 or not.
TV sets vary in the way they process 720p. High-end sets will preserve 59.94 motion; low-end sets will take any shortcut to say they are 1080p.
Two years ago, many so-called "1080i" projection or CRT sets accepted 1920x1080i/29.97 input but dropped a field to 540p/29.97 and processed the horizontal resolution at a fraction of 1920 (960x540 to 1440x540). Industry rules seemed to say anything more than SD/DVD 720x480 was "HD-Ready".
Current TV manufacturers are taking processing shortcuts to say they are 1080p. One should ask how 480i, 480p, 720p and 1080i get converted to 1080p. Beyond that, one should ask how 1080p/24 output from HD DVD/Blu-ray gets converted for 1080p display. In many cases the answer would seem like the reality of sausage production.
They will tell you it makes no difference to most folks in their focus group and you really don't want to know the details.