Why is it that people say 720p is better than 1080i, and choose the former over the latter when given the option?
Since the interlaced format splits a frame into alternating odd and even lines, does that mean that each frame is half the resolution?
i.e. (1920x1080)/2?
I'm just trying to figure out what the logic is here.
-
With interlaced sources, only alternate horizontal lines are refreshed each cycle, using the persistence of vision of the human eye to fill in the blanks. Non-interlaced sources should give a more stable image, but at the end of the day, the preference is in the eye of the beholder; if you can set it up, try flicking between different sources on your TV and see which looks better.
Slainte
midders
Volunteer for https://www.computersforkids.org.uk/ -
This should be a no brainer. 1080 lines of resolution versus 720, which do you think is better? That's right, 1080. Now if you wanted to argue 1080i over 1080p then yes 1080p would be the winner.
Here's some reading material: http://www.videotapestock.com/hd10vs10whdi.html
Also, just look at the frickin' picture. -
It should be a "no brainer", but in reality it's not that simplistic.
Disregarding the content quality and actual resolved detail for now - the 1080i format doesn't resolve 1080 lines of resolution. Each field is actually 1920x540. Human perception is actually more sensitive to horizontal than vertical resolution, and broadcast engineers have come up with equations and calculations using the "Kell factor" (google it).
Resolution is only part of the equation for "quality". Interlacing has other issues - interlace flicker and deinterlace artifacts such as jaggies. There is no such thing as perfect deinterlacing. Your display's post processing will greatly influence quality. Higher quality sets will have better algorithms. Progressive sources avoid these issues. -
It should be a "no brainer", but in reality it's not that simplistic.
When I watch TV, 1080i looks better than 720p sources.
FYI:
"For a given bandwidth and refresh rate, interlaced video can be used to provide a higher spatial resolution than progressive scan. For instance, 1920x1080 pixel resolution interlaced HDTV with a 60 Hz field rate (known as 1080i60) has a similar bandwidth to 1280x720 pixel progressive scan HDTV with a 60 Hz frame rate (720p60), but approximately twice the spatial resolution."
http://en.wikipedia.org/wiki/Interlace -
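To put rough numbers on the bandwidth claim quoted above, here's a quick back-of-the-envelope sketch in Python. It only counts raw pixel throughput and ignores compression efficiency, chroma subsampling and blanking, so treat it as illustrative only:

```python
# Rough pixel-throughput comparison of 1080i60 vs 720p60.
# Raw (uncompressed) pixel rates only.

def pixels_per_second(width, height, rate, interlaced=False):
    """Return pixels transmitted per second for a given format."""
    lines_per_pass = height // 2 if interlaced else height
    return width * lines_per_pass * rate

p1080i60 = pixels_per_second(1920, 1080, 60, interlaced=True)  # 60 fields/s
p720p60  = pixels_per_second(1280, 720, 60)                    # 60 frames/s

print(f"1080i60: {p1080i60:,} pixels/s")     # 62,208,000
print(f"720p60 : {p720p60:,} pixels/s")      # 55,296,000
print(f"ratio  : {p1080i60 / p720p60:.2f}")  # ~1.13, i.e. similar bandwidth

# Spatial resolution of a full (woven) 1080 frame vs a 720 frame:
print(f"{1920*1080:,} px vs {1280*720:,} px "
      f"(~{1920*1080/(1280*720):.2f}x)")     # ~2.25x
```

Roughly the same pixel rate, but a woven 1080 frame carries about 2.25x the pixels of a 720 frame, which is what the quoted passage means by "approximately twice the spatial resolution".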
Also depends what you're watching it on. Upscale 720p to 768, or downscale 1080i to 768....
-
Yes it is that simple. Look at the picture. How does the picture look? It's about perception. What do your eyes see?
When I watch TV, 1080i looks better than 720p sources.
You're neglecting other factors. For example, interlaced encoding is far less efficient than progressive encoding for any format.
What about source quality? Are all the sources from the same master? You're neglecting processing decisions made by the content providers and along the process chain.
I agree it's about perception. For some people, on some types of content, in some circumstances, on some types of sets, 1080i60 may look better; for others, 720p60. -
Use YOUR EYES!!!! That's all that should matter. The reason people saw clothes on the emperor was that they used other people's opinions to overrule their common sense. I honestly can't tell the difference except on scrolling text on ESPN (which is 720p). But then again, that's just my two cents.
I fly and YOU SUCK! -
Not mentioned so far is the way motion is handled in 1080i vs. 720p.
1280x720p has lower stationary resolution vs. 1920x1080i. For a still scene 1080i and 1080p show essentially the same picture at full 1920x1080 resolution. When an object is in motion, 720p delivers a full frame picture every 1/60th second (1/50th for "PAL"). Thus for fast action (e.g. think hockey with fast camera pan/zoom and fast object motion), there is no TV processing required to deliver a sequence of full resolution frames.
1080i delivers alternating odd and even lines (1920x540 resolution) every 1/60 sec for a full frame every ~1/30th second. It does this in approximately the same bandwidth or bit rate of 720p. When an object is in motion there is spatial displacement between odd and even lines of the same frame. For an interlace TV, this poses no problem since the TV is capable of sequential display of each field. The human eye processes the image into a 60 frame rate equivalent image through persistence of vision. The eye is tuned to resolve motion as vectors (direction) ignoring object detail until the motion slows. So for an interlace TV, 1080i is perceived by the human eye as near equivalent of 1080p at half the bandwidth. The Kell factor and other issues cause ~20% vertical resolution loss for interlace scan but since most TV aspect ratios are wider than high this isn't a serious tradeoff.
So if all HDTV sets were interlace (like a 1080i CRT or projector), 1080i would be close to a no brainer choice except for high action sports*. Remember, with high motion, the eye discounts resolution.
The spoiler for 1080i is the trend to progressive scan display technology. Progressive scan can't display 1080i as a sequence of 1/60th second fields. Unprocessed 1080i is displayed as 1/30th sec frames with horizontal combed lines during motion. The human eye can't process this image. Line split during motion is not perceived as directional motion but as un-natural artifacting (horizontally biased). For this reason, 1080i must be processed by the TV (deinterlace or inverse telecine) to make the image acceptable and predictable to the human eye.
Early HDTV sets had poor image processing. They mainly relied on blend deinterlace to average or blur out 1080i line comb during motion. This made the image look motion laggy/blurred. Over time inverse telecine and motion adaptive bob deinterlace techniques were added to better track motion vectors. Deinterlacers are getting better each model year with advancements showing first in the high end models. Budget models use 2-3 yr old "hand me down" deinterlace technology.
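To make the weave / blend / bob distinction described above concrete, here is a minimal, hypothetical sketch (NumPy arrays standing in for 1080i fields; real TV deinterlacers are motion adaptive or motion compensated and far more elaborate than this):

```python
import numpy as np

def weave(top_field, bottom_field):
    """Interleave two fields into one full frame: fine for static scenes,
    but produces 'combing' lines on anything in motion."""
    h, w = top_field.shape
    frame = np.empty((h * 2, w), dtype=top_field.dtype)
    frame[0::2] = top_field      # line parity assignment is a convention here
    frame[1::2] = bottom_field
    return frame

def blend(top_field, bottom_field):
    """Average the two fields, then line-double: hides combing but
    blurs/ghosts motion (the 'laggy' look of early HDTV sets)."""
    mixed = (top_field.astype(np.float32) + bottom_field) / 2.0
    return np.repeat(mixed, 2, axis=0).astype(top_field.dtype)

def bob(field):
    """Line-double a single field: preserves 50/60 Hz motion, but each
    output frame has only half the vertical resolution."""
    return np.repeat(field, 2, axis=0)

# Toy 4-line fields (a real 1080i field is 1920x540)
top = np.arange(8).reshape(4, 2)
bot = np.arange(8).reshape(4, 2) + 100
print(weave(top, bot).shape, blend(top, bot).shape, bob(top).shape)  # all (8, 2)
```

A motion adaptive deinterlacer essentially switches between weave for static areas and bob/interpolation for moving areas, which is why better algorithms make such a visible difference.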
So, if you buy a cheaper flat screen TV, 720p may deliver a more motion natural image even if resolution is upscaled. A higher end HDTV will better handle 1080i source with fewer artifacts.
* Progressive also has an advantage for "white paper" text and graphics display. A white background tends to flicker more with 60Hz interlace scan.
Recommends: Kiva.org - Loans that change lives.
http://www.kiva.org/about -
I should expand on the special case of 24p film source. This includes movies and most TV series intended for international distribution*. In the "NTSC" areas 24p source is sent telecined to 1080i/29.97 or 3:2 frame repeated to 720p/59.94. A modern HDTV processor can "smart" inverse telecine 1080i to 1080p/23.976 and then frame repeat or frame interpolate to display frame rate. For 720p, 60 Hz HDTV sets display directly. 120/240Hz HDTV sets remove repeat frames, then frame repeat or frame interpolate to 120/240 fps.
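Purely as an illustration of the 3:2 pulldown cadence described above (a toy sketch with made-up frame labels; a real "smart" IVTC has to detect the cadence and re-weave matching fields into full progressive frames):

```python
# Illustrative 3:2 pulldown (telecine) of 24p film frames A,B,C,D into
# 60i fields, and its reversal (inverse telecine / IVTC).
# 't'/'b' mark top/bottom fields; the cadence phase varies in practice.

def telecine_32(frames):
    """24p -> 60i: emit fields in a 2,3,2,3 pattern (10 fields per 4 frames)."""
    fields = []
    for i, frame in enumerate(frames):
        repeats = 2 if i % 2 == 0 else 3
        for _ in range(repeats):
            parity = 't' if len(fields) % 2 == 0 else 'b'
            fields.append((frame, parity))
    return fields

def inverse_telecine_32(fields):
    """60i -> 24p: skip the duplicated fields and keep one copy of each
    film frame. Here the 2,3,2,3 cadence is assumed to be known."""
    frames, i, k = [], 0, 0
    pattern = [2, 3, 2, 3]
    while i < len(fields):
        frames.append(fields[i][0])   # keep one copy of each film frame
        i += pattern[k % 4]
        k += 1
    return frames

film = ['A', 'B', 'C', 'D']
fields = telecine_32(film)
print(fields)                       # 10 fields: A,A,B,B,B,C,C,D,D,D
print(inverse_telecine_32(fields))  # ['A', 'B', 'C', 'D']
```

Running it reproduces the familiar 2,3,2,3 pattern: 10 fields (five interlaced frames) for every four film frames, which the IVTC step collapses back to the original 24p sequence.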
So, if the HDTV has a good "smart" inverse telecine processor, 1080i will produce "full" 1920x1080 resolution for film source.
720p has the advantage of ~half the bit rate requirement vs 1080i for 24p source. A 720p broadcaster can use this advantage to add more secondary subchannels (e.g. PBS) or they can reduce compression for a higher quality 1280x720 image (e.g. many ABC/FOX local stations).
*24p is still used for TV series production mainly because it allows simple, direct conversion to 23.976, 25, 29.97, 48, 50, 59.94, 72, 96, 100, 119.88, etc. frame rates for international and multi-format distribution. -
On a certain level, yes, it certainly is.
Put two TVs side by side... one with an image in 720p vs. another one with 1080i. The 1080i looks much better. Period.
If you desire to see something badly enough then you will be able to find the trees for the forest (or however the analogy goes).
But to the typical person... 1080i > 720p > 480. Some things really are just that simple.
TC
My Dell PC system info: 3.4 GHz quad-core i7 processor, 12 GB DDR3 RAM, Windows 7 Ultimate 64-bit, Nvidia GTX 650 video card. -
I have.
Of course, it also comes down to personal preference. There are pros/cons to both formats. Motion is where the biggest difference can be seen.
I'm trying to find a study; there was a recent one done by the BBC and the EU in which something like 60% preferred 720p60 over 1080i60. I'll try to find the link.
Sometimes it's better, sometimes it's worse. It depends on many factors, like source, processing, and display quality.
Just because the number is bigger doesn't mean it's better.
Cheers -
The common reasoning I've seen in the past is that 1080i is better for still shots while 720p is better for motion. I think this is a fair enough assessment of the difference. It also depends on whether the camera's sensor captured enough fine detail to justify the 1080 lines of resolution.
-
Use your eyes -- pretty good advice if it weren't for the fact that most consumers will watch anything that moves, regardless of how awful it is.
My first HDTV was a 1080 50" plasma. It was so big that no one could sit less than 12 feet away. Cable broadcasts and most Blu-rays were of such poor quality on this $5000 set that I returned it and brought home another brand of the same type and size. Forget it. SD DVDs look like crap blown up to 1080, and so did anything else that was in 720 or 480 format (which is 80% of what's out there). The slightest fault in any source, even if the source was 1080 lines, was blown out of proportion and was a constant annoyance.
It all went back and was replaced by a 720p 42". Voila! Everything looked so much better, I kept the thing. OTA looked better, too, as it was a quality 42" that didn't have to work very hard to upsample poorer stuff or downsample 1080 lines. On top of that, I seldom had anyone in my house who could tell the difference on a 42" set between 1080 and 720. I see this happening in the big-box stores: time and again, I'll see a customer staring at a group of huge HDTVs and finally asking the salesperson, "Uh, can you tell me which one of these TVs has 1080 lines?"
I had a sometime visitor who insisted that his 1080 at home had more lines than my 720, so his was clearly superior. After a few weeks of listening to this hogwash, I took up his invitation to visit his home and see his beautiful 1080 plasma. I took one look at it: I have seldom seen such ugly colors in my life, thanks to the juiced-up settings he used, and that alone was responsible for red and blue color bleed that made small and distant objects undecipherable. His sharpness control was set at max, which made the image so grainy it often looked like VHS. But it was evident straight-on that his 42" was a 720-line Panasonic. When he didn't believe me, I took out his user manual and read the specs: 720p. Period. Guess what? Now he's shopping for a new TV, even though he's still one of those customers who can't see the difference between 720 and 1080 in a store. He refuses to accept the fact that a well-engineered 720 will look better than a mediocre 1080 at sensible screen sizes.
Consider one more thought from the graphics and movie world: many factors determine the "quality" of an image, even if the image is black and white. Among those factors are contrast range, color accuracy, color temperature, color density, absence of noise and distortion, the ability to display blacks that are really close to true black, and image resolution. I recall one acquaintance who worked in the commercial image processing industry; he complained, "What's all this crap about 'sharpness'? Sharpness isn't a problem. What you really want is acutance, of which sharpness is only one of many factors." Most graphics pros will tell you that resolution is indeed important. But on the list of factors just mentioned, resolution would not have highest priority.