If your TV is 1080p, then no matter the input (1080i in this case), what's output by the television has to be 1080p.
Originally Posted by stanjschr
Here is what your Bravia Image processor must do...
Straight 1080i/29.97 fps interlace video (e.g. news, sports, reality or TV variety shows) will be deinterlaced to 1080p/59.94 fps for display. If your HDTV displays at "120 Hz" then each frame is repeated twice.
Movies and TV drama shows are shot at 1080p/23.976 fps and telecined to 1080i/29.97 fps. The Bravia processor first detects that the scene is a telecined (aka "Cinema") source, then applies inverse telecine to remove the redundant fields, recovering the raw 1080p/23.976 fps film rate. The processor then repeats frames in a 3:2:3:2 pattern to display at 59.94 fps. If the HDTV is "120 Hz", the 1080p/23.976 frames are instead repeated 5x (or motion-interpolated) to reach 119.88 Hz for display.
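The cadence arithmetic above is easy to sanity-check yourself. Here's a small Python sketch (function names are my own, not anything from Sony) showing how a repeating cadence expands 23.976 fps film frames to the 59.94 Hz and 119.88 Hz display rates described:

```python
from itertools import cycle

def expand_cadence(frames, cadence):
    """Repeat each source frame per a repeating cadence pattern,
    e.g. [3, 2] for 3:2 pulldown."""
    out = []
    for frame, n in zip(frames, cycle(cadence)):
        out.extend([frame] * n)
    return out

film = [0, 1, 2, 3]  # four film frames at 23.976 fps

# 3:2:3:2 to 59.94 Hz: 4 film frames -> 3+2+3+2 = 10 display frames
print(expand_cadence(film, [3, 2]))  # [0, 0, 0, 1, 1, 2, 2, 2, 3, 3]

# "120 Hz" panel: every film frame repeated 5x -> 119.88 Hz
print(len(expand_cadence(film, [5])))  # 20

# Rate check: 23.976 * (3+2)/2 = 59.94 and 23.976 * 5 = 119.88
# (Likewise 1080i/29.97 carries 59.94 fields/s, which is why straight
# deinterlacing yields 59.94p, repeated 2x on a "120 Hz" set.)
print(round(23.976 * 2.5, 3), round(23.976 * 5, 3))  # 59.94 119.88
```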
The Bravia processor can switch between deinterlace and inverse telecine on the fly, and it also has modes to handle broken telecine cadence. Deinterlacing is likewise adaptive to the types of objects and motion in the image: weave, bob, and single-field deinterlacing are often applied to different pixel blocks of the same frame.
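To make the weave/bob distinction concrete, here's a toy Python sketch (my own illustration, not Sony's actual algorithm) operating on a tiny frame stored as rows of pixel values:

```python
def weave(top_field, bottom_field):
    """Weave: interleave the two fields into one full-height frame.
    Perfect for static content; produces combing on motion."""
    frame = []
    for t, b in zip(top_field, bottom_field):
        frame.append(t)
        frame.append(b)
    return frame

def bob(field):
    """Bob (line doubling): build a full frame from a single field.
    No combing, but halves the vertical resolution."""
    frame = []
    for row in field:
        frame.append(row)
        frame.append(row)  # duplicate each line; real hardware interpolates
    return frame

top    = [[10, 10], [30, 30]]  # even scan lines
bottom = [[20, 20], [40, 40]]  # odd scan lines

print(weave(top, bottom))  # [[10, 10], [20, 20], [30, 30], [40, 40]]
print(bob(top))            # [[10, 10], [10, 10], [30, 30], [30, 30]]
```

A motion-adaptive deinterlacer effectively picks between these per block: weave where pixels are static, bob (or single-field) where there is motion.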
This may help you understand the image processor's role in ultimate picture quality. Getting a top-end processor (e.g. Pioneer Kuro, Samsung 750, or Sony XBR7) matters more than getting a 1080p panel. The Kuro at 768p is fine for normal viewing distances. If you pair a bargain generic processor with a 1080p display panel, you will simply be seeing deinterlace artifacts and cadence-detection errors in sharp resolution.