I have just got myself a Toshiba HDMI DVD player that upscales to 720p and 1080i.
I have to say the quality is excellent, but it looks exactly the same on 720p and 1080i... why is this?
I understand that 1080i is two lots of 540 lines interlaced, so in theory it's displaying 540 lines compared to 720p's 720 lines, so I would have thought 720 progressive would be the better quality...
I'm still unclear on how interlacing works...
I don't think my TV is downscaling it to 720p because the built-in scaler in the TV isn't very good, and I reckon it would be a poorer image than the 720p feed if it DID downscale 1080i to 720p.
Can anyone clarify which format is best to watch and why?
If my TV is 1366x768 resolution, how does it display 1920x1080i?
-
Originally Posted by snadge
Also, a TV's scaler suffers most when upscaling, and in this case the DVD player is upscaling and the TV is downscaling.
Long story short, if your TV is 720p, set your DVD player to 720p. And neither format is "better"; they both have their purposes. -
If your TV has a resolution of 1366x768 then I would output 1080i; the reason is your TV only has to scale down instead of scale up.
You might want to get a calibration DVD such as DVE to make sure. -
http://www.youtube.com/watch?v=Z-JXfyvlPh0
I agree with that - it's what I've always thought -
So my TV doesn't display incoming feeds on the HDMI connection at the resolution it's supposed to?
If I feed 1080i, my TV displays 1920x1080i (on-screen info)
If I feed 720p, my TV displays 1280x720p (on-screen info)
If I feed 576p, my TV displays 576p (on-screen info)
I've heard this before about TVs rescaling 1080i to 720p, but I dunno, because if that's the case why say the TV displays at 1080i (which is 540p) if it's just going to scale it to 720 (which isn't 540p or 1080i)?
1080i is supposed to display 540 progressive lines... now my TV supports 1080i, as does the DVD player, so on that resolution I should be getting a display of 540 lines... BUT I have to say it doesn't look like it does - the picture quality is the same as 720p - AND if 1080i (540p) is upscaled to 720p the quality would probably be less...??
I'm totally confused -
Geez, ignore that YouTube nonsense because you don't have an interlaced set. Everything you see on your TV set is shown at progressive 1366x768. If you're having the DVD player output 1080i, then it gets deinterlaced and downscaled to 768p. If you have your player output 720p, then it gets upscaled to 768p. One way won't necessarily look better than the other. You'll have to judge which you prefer. It could very well be, as you stated, that they both look equally good. You might also throw 576p (and 576i if you turn off the progressive scan) into the mix to see how they look.
I'm just going by the resolution you stated (1366x768). I assume you know the resolution of your TV set.
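To make the two signal paths concrete, here is a minimal sketch (purely illustrative Python - it is not any TV's actual firmware, the function name is made up, and the 1366x768 panel size is just the figure quoted above) of what a fixed-pixel panel does with each feed:

Code:
# Illustrative only: the processing order described above for a fixed-pixel panel.
NATIVE_W, NATIVE_H = 1366, 768

def processing_steps(w, h, interlaced):
    """List the steps the panel applies to one input format."""
    steps = []
    if interlaced:
        # interlaced fields are combined into full progressive frames first
        steps.append(f"deinterlace {w}x{h}i -> {w}x{h}p")
    direction = "downscale" if h > NATIVE_H else "upscale"
    steps.append(f"{direction} to {NATIVE_W}x{NATIVE_H}")
    return steps

for w, h, i in [(1920, 1080, True), (1280, 720, False), (720, 576, False)]:
    label = f"{h}{'i' if i else 'p'}"
    print(label + ":", ", then ".join(processing_steps(w, h, i)))

The point is simply that both feeds end up on the same 1366x768 raster; only the intermediate steps (and whose scaler does them) differ.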
Originally Posted by snadge
If I feed 1080i, my TV displays 1920x1080i (on-screen info)
If I feed 720p, my TV displays 1280x720p (on-screen info)
If I feed 576p, my TV displays 576p (on-screen info)
now my TV supports 1080i -
There's no simple answer as to which is better.
Any decent HDTV will inverse telecine any 1080i film source back to 1920x1080 progressive frames, then display it at the TV's native resolution. A good 1080i film source will look sharper on a 1080p HDTV than on a 720p HDTV (assuming you are sitting close enough to the TV to see the difference).
Fully interlaced 1080i (live video sources like sports and the news) has to be deinterlaced in some fashion. Smart deinterlacers will maintain full resolution in non-moving areas but will fall back to half vertical resolution in moving areas. This is one reason why a lot of sports programming (ESPN, for example) is broadcast at 720p.
All the processing going on makes it impossible to predict what will look better on an HDTV. You have to try both options with different sources and you have to know what defects to look for.
Oh, and upscaling standard definition DVD doesn't make it high definition. About the best you can hope for is for it not to introduce too many scaling artifacts. (There are "super resolution" scaling techniques but I don't think any upscaling DVD player does this.) -
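As a rough illustration of the "full resolution where static, half resolution where moving" behaviour described above, here is a toy motion-adaptive deinterlacer in Python/NumPy. It is only a sketch of the idea, not any TV's or player's actual algorithm, and the motion threshold of 10 is an arbitrary choice:

Code:
import numpy as np

def deinterlace(prev_frame, top_field, bottom_field, threshold=10):
    """Build one progressive frame from a pair of fields (each h/2 x w)."""
    h, w = top_field.shape[0] * 2, top_field.shape[1]
    frame = np.empty((h, w), dtype=top_field.dtype)
    frame[0::2] = top_field        # "weave": slot both fields into place
    frame[1::2] = bottom_field
    if prev_frame is not None:
        # Where the picture has changed since the previous output frame,
        # the woven lines come from different instants and would "comb",
        # so fall back to line-doubling the top field ("bob"), which is
        # only half the vertical resolution.
        moving = np.abs(frame.astype(int) - prev_frame.astype(int)) > threshold
        bob = np.repeat(top_field, 2, axis=0)
        frame = np.where(moving, bob, frame)
    return frame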
fair enough BUT
it's my TV that SAYS (on-screen) it's displaying 1920x1080i etc etc...
now if it means it just ACCEPTS an interlaced signal, and interlacing is for old CRT sets with NO HDMI ports, then WHY are manufacturers creating DVD players that will display/output 1080i on HDMI ONLY????
it's as if 1080i/720p etc is aimed at LCD/plasma screens....
If interlacing is for old sets, surely these new DVD players and LCD/plasma sets wouldn't even entertain the signal/numbers, AND you can only get them on HDMI??? (old sets work on standard aerial cables or SCART) -
I'm a n00b at this HDTV business, having finally bitten the bullet in an effort to make my system compatible with my AppleTV. I bought a relatively inexpensive 32" that is 1366x768 "native". I get my TV signal over a roof-top antenna and am quite pleased with the results. Frankly I can see the zits under the actresses' makeup, so any higher resolution would be a waste of $$. Of course, none of the stations in my area is broadcasting higher than 720p, so there's another reason. Additionally, the AppleTV's 720p movies are exquisite. Finally, anything I have (or get) that's less than 720p looks pretty good on the TV.
Based upon all the discussions, however, it seems I must check out an upconverting DVD player as most owners do sing their praises.
By the way: BluRay? Nahhh; they may have won the battle but they will lose the war against digital downloads. -
Originally Posted by snadge
If Interlacing is for old sets
I suspect the limitation with upscaled SD sources is simply a byproduct of the chips that are used. My guess is the chain looks like this:
HD source -> decoded to HD frame buffer -> HD frame buffer output to HDMI only
SD source -> SD frame buffer -> SD frame buffer output to HDMI or analog
SD source -> upscaled to HD frame buffer -> HD frame buffer output to HDMI only
The chip that outputs the frame buffer only outputs to HDMI when there is an HD frame. In the case of upscaled SD it doesn't know that the source is really SD. -
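To restate that guess in another form, here is a trivial Python sketch of the rule being suggested: the output stage looks only at the frame buffer format, not at where the frame originally came from, so upscaled SD is treated exactly like native HD. The function and the rule are hypothetical, purely to illustrate the guess above:

Code:
def allowed_outputs(frame_buffer_format):
    # Hypothetical rule: HD frame buffers only go out over HDMI;
    # SD frame buffers may also go out over the analogue outputs.
    if frame_buffer_format == "HD":
        return ["HDMI"]
    return ["HDMI", "analogue"]

print(allowed_outputs("HD"))  # native HD, or SD upscaled to HD: HDMI only
print(allowed_outputs("SD"))  # SD left as SD: HDMI or analogue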
so interlacing on modern TVs is just there for analogue video - is that what you're saying?
I'm just confused as to why they bother making HDMI DVD players with a 1080i output IF it doesn't actually get used????? -
1080i can be used to transmit the picture from the player to the TV. A progressive HDTV will inverse telecine or deinterlace to display progressive frames. 1080p transmission is new. The first few generations of HDTV didn't support it.
Why do HD and upscaling players output both 720p and 1080i (and some 1080p)? Because some HDTVs do better with one than with the other(s). -
Check out this overview on interlacing. It might be a helpful introduction.
https://forum.videohelp.com/topic346546.html
The fellow on YouTube ignores the processing that goes on in a progressive plasma/LCD-TV and is totally wrong about his 540p argument, even for CRT and RPTV sets. I suggest he look up "Kell factor" to understand the perceived vertical resolution of an interlaced CRT, which is ~0.7 x 1080 lines, or around 760, assuming the CRT dot pitch resolves that fine.
http://en.wikipedia.org/wiki/Kell_factor
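For what it's worth, the arithmetic behind that figure is just the Kell factor times the line count (a rough sketch; the 0.7 value is the commonly quoted approximation, not a precise constant):

Code:
kell_factor = 0.7              # typical approximation for interlaced CRT displays
perceived_lines = kell_factor * 1080
print(round(perceived_lines))  # ~756, i.e. "around 760" lines - not 540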
A progressive television has a "native resolution" defined by the fixed horizontal and vertical pixels on the display. Today, plasmas are most often 1024x768 or 1366x768 with some expensive models at 1920x1080. LCD displays are usually 1366x768 or 1920x1080. Everything coming into a plasma or LCD-TV is converted to native resolution by the TV's processing engine. Possible inputs to a progressive HDTV include:
1. Analog NTSC or PAL
Fields get decoded to YPbPr components, then are digitized and deinterlaced to 480p or 576p, then are scaled to native resolution for display.
2a. SD 480i/576i YPbPr
These fields get digitized and deinterlaced to 480p or 576p, then are scaled to native resolution for display.
2b. SD 480i/576i DVI/HDMI
Already digital, these fields get deinterlaced to 480p or 576p, then are scaled to native resolution for display.
3a. HD 1080i YPbPr
These fields get digitized and deinterlaced to 1080p, then are scaled to native resolution for display. Upscaling is even done for native 1080p HDTV sets due to "overscan" issues.
http://en.wikipedia.org/wiki/Overscan
3b. HD 1080i DVI/HDMI
Already digital, these fields get deinterlaced to 1080p and then are scaled to native resolution for display (see overscan).
Note: All of the above inputs are interlaced (either 50 or 60 fields per second) so must be deinterlaced to 50 or 60 frames per second for progressive display. The quality of the deinterlacer is what separates low end from high end processing engines. Without going into the details of deinterlacing, a general rule is that processor quality will be better for newer and higher end progressive HDTV sets (e.g. the Sony XBR engine is better than the Bravia engine, which will be better than the more generic Olevia).
4a. 480p/576p/720p/1080p over VGA or YPbPr
These analog progressive formats are usually input at 50 or 60 frames per second. They are digitized and then scaled to native resolution.
4b. 480p/576p/720p/1080p over DVI or HDMI
These inputs are already digital, so get scaled to native resolution.
So, the main jobs for the processing engine are deinterlace and scaling. Newer sets add a third feature which is frame rate upconversion to 100Hz or 120Hz to improve motion smoothness and reduce flicker.
A good processing engine will eliminate most differences between 1080i and 1080p. For film source (e.g. TV series and DVD movies) 1080p can be extracted from 1080i with no loss. For sports action, 720p gives sharper action clarity over 1080i but a good processor will extract reasonably good progressive frames from 1080i fields.
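To illustrate the "no loss for film source" point, here is a small NumPy sketch showing that when both fields come from the same progressive picture (as they do for film material), simply weaving them back together reproduces that picture exactly. Real inverse telecine also has to detect the field pairing/cadence; this sketch assumes that part is already solved:

Code:
import numpy as np

# Fake a 1080-line progressive "film frame" and split it into two fields,
# as a broadcaster would for 1080i transmission of film material.
rng = np.random.default_rng(0)
film_frame = rng.integers(0, 256, size=(1080, 1920), dtype=np.uint8)
top_field = film_frame[0::2]
bottom_field = film_frame[1::2]

# Weave the fields back into a progressive frame.
rebuilt = np.empty_like(film_frame)
rebuilt[0::2] = top_field
rebuilt[1::2] = bottom_field

print(np.array_equal(rebuilt, film_frame))   # True: nothing was lost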
Did this help? -
Originally Posted by snadge
However, many people own older or cheaper HDTV sets with marginal or crappy processing engines. For those HDTV sets, you can often get a better picture feeding directly at close to native resolution.
For many CRT and RPTV sets that are natively 1080i, a DVD player feeding upscaled 1080i may give a better picture than feeding 480i or 480p. One needs to experiment.
For older or cheaper plasma or LCD-TV sets, a 720p DVI/HDMI feed bypasses most of the TV's existing processing except for the final scale to native resolution. The upscaling DVD player may have a superior engine to that in the TV and may get better results.
For a newer 1080p TV with an average processing engine, a premium upscaling DVD player may have a superior deinterlacer/scaler. In that case a 1080p HDMI feed to the TV bypasses the TV engine entirely. -
thanks edDV - that just about sums it up
I now believe the deinterlacing-to-native-format explanation, and that the YouTube video is wrong....
My HDTV is a cheapo BUSH 32" 1366x768, 600:1 contrast ratio
and its upscaling isn't as good as the Toshiba HDMI DivX DVD player that I've just bought...
1080i and 720p look about the same, BUT I'm probably better off letting the TV reformat 1080i to 768p than 720p to 768p, because my TV would have to upscale 720p to 768p and that brings the TV's upscaling into play, whereas it probably does a better job of downscaling a 1920x1080i video to 768p
they both look the same, but at times I'm sure 1080i looks slightly better...
thanks again edDV -
Originally Posted by jamespoo
-
this is a perfect example of how the thread isn't really read...just a post in reaction to the title...regardless of whether it's fact, fiction, or shades of grey
it's little 12 year old Bobby again...who snuck out of bed, turned on his mother's computer, and started typing nasty stuff...chuckling all the time...until his mommy heard something...and he was chased back to bed...12 years old is the mentality on the web -
searching through all the posts and quickly leaving few-word replies, hehe
anyway the answer is wrong - (assuming he's on about broadcasts and not DVD player outputs) you can't assume that 1080i is upscaled all the time, AND you don't know what SKY are broadcasting at any time as it's always 1080i - whether the broadcast is true 1080i, or 720p upscaled to 1080i (by SKY), or 720p upscaled to 1080i (by the SKY+HD box), you will NEVER know - all we know is that the Sky box always outputs at what you set it at - it doesn't mean they broadcast 720p to you....
and even then, 720p is also upscaled to 806p (768p + 5% overscan) on most sets, so there's still some upscaling by the TV - unless you have a 1280x720 set that has a 'just scan' 1:1 mapping feature
so no, 720p isn't the best to have it on - especially if you have a 1080p set -
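The 806p figure quoted above is just the panel's 768 lines plus roughly 5% overscan (a back-of-the-envelope sketch; actual overscan percentages vary by set and picture mode):

Code:
native_lines = 768
overscan = 0.05                        # typical default overscan, varies by set
scaled_lines = round(native_lines * (1 + overscan))
print(scaled_lines)                    # 806 - so a 720p feed still gets upscaled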
So many variables