I notice STBs like Freesat-HD and SKY+HD have the option to output 720p or 1080i...
What is the point of having this? Because:
if the broadcasts are all in 1080i (which all mine seem to be on SKY+HD), then why set the STB to 720p..?
This just means the STB is de-interlacing and downscaling the 1080i to 720p for your TV to either:
A) re-scale it to 768p (1366x768 - most smaller LCDs), or
B) re-scale it to 1080p (1920x1080 - most larger LCDs).
Won't it be better in BOTH cases just to have the TV receive the 1080i HD signal and rescale as necessary?
1080i (broadcast) to 720p (STB) to 768p/1080p (TV) is just too much work and therefore probably lower quality,
but
1080i (broadcast + STB) to 768p/1080p (TV) sounds a bit better - less up/down processing, therefore probably BETTER quality.
I can understand that in RARE circumstances a broadcast MAY be in 720p and you MAY have a plasma TV with an actual 1280x720 resolution (though these plasmas aren't really common), in which case setting the STB to 720p for a 720p broadcast would be beneficial (no scaling, straight through to the panel at 1280x720). But like I say, the more common TVs are 768p and 1080p... not 720p - and MOST broadcasts are in 1080i (to accommodate a larger resolution, so the image is as sharp on a 50" set as it is on a 32" set).
I know the difference between progressive and interlaced, but the point is that if something is broadcast INTERLACED, then even when it's de-interlaced into PROGRESSIVE, artifacts still come through, and therefore quality is lost.
I think when it comes to broadcasting, 720p should be got rid of altogether, because it's ONLY any good if your set actually has 720 lines - otherwise it's being upscaled. Dunno about anyone else, but when I watch HD... I wanna watch HD, not UPSCALED HD (to 768p or 1080p).
I reckon 1080i is the better all-rounder. For example, 1080 lines downscaled to 768 lines (broadcast to 768p sets) looks just as sharp/clear, as no noise is introduced by any 'upscaling' process - BUT 720 lines upscaled to 1080 lines (broadcast to 1080p sets) DOESN'T look as good, as it's 'upscaled' and therefore has artifacts from upscaling etc...
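To put that argument in numbers, here's a rough sketch (my own illustration, not anything the box computes) of the vertical scale factors involved:

```python
# Rough sketch of the scaling chains discussed above. A vertical scale
# factor > 1 means upscaling (interpolated pixels, possible artifacts);
# < 1 means downscaling (detail discarded, but nothing invented).

def scale_factor(src_lines: int, dst_lines: int) -> float:
    """Vertical scale factor from source to display line count."""
    return dst_lines / src_lines

# 1080i broadcast shown on a 768-line panel: downscale
print(scale_factor(1080, 768))   # ~0.711 (downscale)

# 720p broadcast (or STB output) shown on a 1080-line panel: upscale
print(scale_factor(720, 1080))   # 1.5 (upscale)
```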
You have made a case for why the 1080i selection is good for you, but 720p is the better choice for others. Maybe Freesat-HD and SKY+HD are mostly broadcasting 1080i now, but 720p origination is part of the standard. In those cases 720p gives 50 full frames per second, which is great for sports, dance or any high-motion programming. A 1080i conversion from a 720p broadcast sees a resolution reduction, especially in stop-frame and user-stored slow motion.
720p out of the set top box would be better for:
1. Anyone who wants the motion advantages of a 720p/50 broadcast.
2. Those without a deinterlacer (e.g. watching on a computer monitor or some projectors).
3. Those with cheap progressive TV sets that have poor deinterlacers (e.g. blend or blur vs. motion adaptive).
4. In the NTSC market, those TVs that lack or have poor inverse telecine. In that case, the set top box does the inverse telecine to progressive.
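The motion advantage in point 1 can be put in numbers with a quick sketch (my illustration): 1080i/25 delivers 50 fields per second of 540 lines each, while 720p/50 delivers 50 complete 720-line frames per second.

```python
# Temporal sampling of 1080i/25 vs 720p/50: both give 50 pictures per
# second, but each interlaced "picture" (a field) carries only half the
# frame's lines - which is what hurts stop-frame and slow motion.

def pictures_per_second(frame_rate: int, interlaced: bool) -> int:
    """Distinct temporal samples per second (fields count individually)."""
    return frame_rate * 2 if interlaced else frame_rate

def lines_per_picture(frame_lines: int, interlaced: bool) -> int:
    """Vertical lines captured in each temporal sample."""
    return frame_lines // 2 if interlaced else frame_lines

print(pictures_per_second(25, True), lines_per_picture(1080, True))   # 50 540
print(pictures_per_second(50, False), lines_per_picture(720, False))  # 50 720
```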
There should be a third setting called "native", where 1080i broadcasts are sent as 1080i and 720p broadcasts as 720p. The HDTV would adapt for the best picture.
IMO the switch from 1080i to 720p should be program by program. New progressive TV sets could handle this; older ones would glitch at the transition. So having three settings gives an optimal setting for all TV sets in the field.
Over here we have sports targeted networks (ESPN, ABC and FOX) in 720p/59.94 all the time. Also National Geographic on cable is broadcast 720p/59.94.
It should also be noted that for cable application, two 720p/24 movies could be sent at the same bit rate as one 1080p/24 movie. These can still be output as 1080i/25, 720p/50 or "Native" from the set top box.
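A back-of-envelope check of that two-720p-movies-per-1080p-movie claim, using raw pixel rate as a crude proxy for the bit rate a codec needs at similar quality (a simplification - real codec efficiency isn't strictly linear in pixel count):

```python
# Raw pixel rate (pixels per second) as a rough stand-in for needed bit rate.

def pixel_rate(width: int, height: int, fps: int) -> int:
    return width * height * fps

one_1080p = pixel_rate(1920, 1080, 24)      # one 1080p/24 movie
two_720p = 2 * pixel_rate(1280, 720, 24)    # two 720p/24 movies

print(two_720p / one_1080p)  # ~0.89: two 720p streams use slightly FEWER pixels
```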
Yeah, I agree with program-to-program. The SKY+HD box has an automatic setting and switches between 576i (SD) and 1080i (HD) channels - I've been on all the HD channels (apart from sport, as I don't have it) and they all went to 1080i on automatic.
My NatGeo went to 1080i... but that could just be the SKY+HD box using 1080i for ALL HD broadcasts...
My TV is a cheapo BUSH 32" 1366x768 with a CRAP upscaler, which is why I prefer to use 1080i rather than 720p - I would rather have a 'shrunk' 1080 image than an 'upscaled' 720 image. But since none of my SKY channels 'seem' to be in 720p anyway, there's no point in having it on 720p at all.
My other argument (confusion lol) is: WHY DO THEY MAKE TVs AT 1366x768 WHEN THE HD RES IS 1280x720? If TVs were 1280x720 instead of 1366x768 there would be no upscaling process, and you would get pixel-for-pixel replication on screen. Upscaling adds noise/artifacts - especially if the upscaler is poor... like mine...
It just seems an ODD resolution to have on an HD Ready TV...?
HD = 1280x720 & 1920x1080 - where does 1366x768 come from?
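For what it's worth, the mismatch shows up in a little arithmetic (my sketch - the factors are just the ratios between the panel and broadcast resolutions):

```python
# Neither HD broadcast format maps onto a 1366x768 panel by a clean
# integer factor, so the set always has to rescale (i.e. interpolate).

PANEL_W, PANEL_H = 1366, 768

for name, (w, h) in {"720p": (1280, 720), "1080i": (1920, 1080)}.items():
    print(f"{name}: horizontal x{PANEL_W / w:.4f}, vertical x{PANEL_H / h:.4f}")

# 720p : x1.0672 / x1.0667 -> slight non-integer UPSCALE
# 1080i: x0.7115 / x0.7111 -> non-integer DOWNSCALE
# A true 1280x720 panel would show 720p pixel-for-pixel (x1.0000 exactly).
```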
Originally Posted by snadge
The problem is worse than you think, because even 1080i gets upscaled on 1080p native screens to accommodate overscan. The solution is to improve scaler performance, since almost all sources get upscaled.
http://en.wikipedia.org/wiki/Overscan
Originally Posted by snadge
As for the odd 1366x768 resolution, it probably has to do with the size of the glass plates the displays are made on and the sizes they can be cut into. And probably some "spec inflation", i.e. many people confronted with a choice of 1366x768 vs 1280x720 would choose the larger one.
Originally Posted by jagabo
It has to do with the darn OVERSCAN.....
Give the industry a kick to force 1080p for sat!!!
Originally Posted by [_chef_
Overscan requires the raster to be oversized so that the center 1366x768 can be displayed.
Originally Posted by [_chef_
As for resolution, you are up against compression artifacts if you increase transmitted resolution too far. 1440x1080 and 1280x720 allow better compression efficiency and a better overall picture quality. The set top box can convert those to 1920x1080i or 1920x1080p for output.
Isn't it for backward compatibility? Early HD Ready TVs could only manage 720p; later ones progressed to 1080i; current decent ones can do 1080p. If an STB only output 1080i, anyone with a TV only capable of 720p wouldn't be able to display it. Also, I've noticed that certain programmes, usually those with lots of motion, look marginally better at 720p than 1080i due to the effective doubling of the frame rate.
Originally Posted by Richard_G
Early HD CRT TV sets and projectors could only do 1080i. 1080i is sequential 540-line fields, so it is much easier to display than 720-line progressive frames at the same rate. 1080-line progressive is supported in the ATSC standard, but only at the 23.976 frame rate. 1080p @ 25fps is a local option for DVB.
1080p @ 24fps is also a common Blu-ray standard, but few HDTV sets can connect at 24fps, so 1080i/29.97 or 1080i/25 connections are more commonly used.
Note that we are talking about the connection between the tuner/STB and the TV display above. Display resolutions and frame rates can differ from 1080i or 720p: HDTV displays can be any native resolution, and display frame rates can be processed to 50/60, 72 or 100/120 Hz refresh.
ATSC and DVB usually broadcast at full square-pixel 1920x1080 or 1280x720 resolutions. The industry "secret" is that the transmission path to the transmitter is often not at full resolution as broadcast. Also, cable or satellite transmission can use arbitrary resolutions (e.g. 1440x1080, 1280x1080, 960x720). The signal gets converted to 1920x1080i or 1280x720p in the STB. Picture quality is a trade-off of both resolution and bit rate. You can pack more channels into a transponder by lowering the bit rate per channel. Given a lower bit rate, you can improve picture quality (fewer artifacts) by also lowering resolution. The other way to improve picture quality at a given low bit rate is to use a proprietary codec (e.g. DirecTV and Dish).
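That resolution-vs-bit-rate trade-off can be sketched as bits-per-pixel arithmetic. The 8 Mbit/s figure below is an assumed example channel rate, not a real transponder allocation:

```python
# At a fixed channel bit rate, lowering the coded resolution raises the
# bits available per pixel, which generally means fewer compression artifacts.

BIT_RATE = 8_000_000   # assumed example channel bit rate, bits/s
FPS = 25               # DVB/PAL frame rate

def bits_per_pixel(width: int, height: int) -> float:
    return BIT_RATE / (width * height * FPS)

for w, h in [(1920, 1080), (1440, 1080), (960, 720)]:
    print(f"{w}x{h}: {bits_per_pixel(w, h):.3f} bits/pixel")
```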