VideoHelp Forum
1. deadrats (Banned) | Join Date: Nov 2005 | Location: United States
Yeah, it's me again. Here are the questions:

1) Assume you have an HDTV that can display 480i, 480p, 720i, 720p and 1080i. Here's what some will consider a really stupid question: why can't the TV display 1080p?

I know on the surface it looks like a silly question, but I can't find any technical reason why it would be impossible. If the TV can display both interlaced and progressive content, and it can display 1920x1080 interlaced, then why shouldn't it be able to display those same 1920x1080 pixels when they are encoded using progressive scan? It's still the same number of pixels. Is it just a marketing limitation to get people to buy the more expensive 1080p sets?

2) If you have a video, say an HDTV capture that's 1920x1080i, and for whatever reason you wanted it downscaled, what would be the best resolution to convert to? I have seen people claim that 1920x1080i is the equivalent of 960x540p, a claim that I know is wrong, yet I keep seeing it around the net, and I can't stop thinking about it every time I think about downscaling 1080i content. If there is even a kernel of truth in the notion, then it's ridiculous to try to convert from 1080i to 720p, since you would effectively be upscaling the video; the target should be 960x540p or lower.

3) Assuming I am correct and 1920x1080i is not the same as 960x540p, what would be the best deinterlacing method to go from 1080i to 1080p or 720p? (The 1080i source shows horribly exaggerated interlacing effects when I try to play it back on my computer.)

4) Last but not least, what is the unit that pixels are measured in? In other words, when we say we have a pixel aspect ratio of 1:1, 4:3 or 16:9, what we are really saying is that the pixels are, say, 16 units wide and 9 units tall, but what exactly is that unit? If we had a 1440x1080 4:3 video and a 720x480 4:3 video, would the pixels of the first file be the same size as the pixels of the second, or do they just have the same ratio? Fundamentally, what I'm asking is: when we resize a video without cropping, are we actually resizing the pixels along with discarding some of them, or are we just discarding pixels?

Thank you, come again.
  2. Originally Posted by deadrats
1) Assume you have an HDTV that can display 480i, 480p, 720i, 720p and 1080i...
    Assume no such thing. It'll display its native resolution, whatever that might be, and only that resolution. It may accept those other resolutions as input, but they'll be converted one way or another to its native resolution. Maybe the native resolution is 1080p. Check your manual, or give us the TV make and model number.
3. Originally Posted by deadrats
    2) If you have a video, say an HDTV capture that's 1920x1080i, and for whatever reason you wanted it downscaled, what would be the best resolution to convert to? I have seen people claim that 1920x1080i is the equivalent of 960x540p... if there is even a kernel of truth in the notion, then it's ridiculous to try to convert from 1080i to 720p, since you would effectively be upscaling the video...
1920x1080i30 is 30 frames per second, but 60 fields that are 1920w x 540h per second. Each field captures a different moment in time. Note that the horizontal resolution (1920 wide) is the same as 1080p30, while the temporal resolution is double that of 1080p30: you are seeing 60 moments per second, albeit as 1920x540 fields rather than full 1920x1080 frames. When you double-rate deinterlace (or bob), you get 1080p60, i.e. 60 frames per second. Those frames are constructed by interpolating the fields, so bobbed 1080p60 is not the same as footage from a camera that shot native 1080p60.
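If you want to see that field structure for yourself, you can pull the fields apart in AviSynth. A minimal sketch, assuming the capture has been indexed with DGIndex (the file name capture.d2v is just a placeholder) and is decoded with DGDecode's MPEG2Source:

    MPEG2Source("capture.d2v")   # 1920x1080, 29.97 interlaced frames per second
    AssumeTFF()                  # HDTV MPEG-2 is normally top field first
    SeparateFields()             # now 1920x540, 59.94 fields per second
    # Each "frame" is now a single field: half the vertical resolution,
    # but twice the temporal resolution of 1080p29.97.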

The best resolution to convert to is the same resolution as the source, and ideally one that your TV/monitor supports natively. Displays work best at their native resolution; any other scaling up or down will degrade the image (some hardware scalers are worse than others).

Originally Posted by deadrats
    3) Assuming I am correct and 1920x1080i is not the same as 960x540p, what would be the best deinterlacing method to go from 1080i to 1080p or 720p? (The 1080i source shows horribly exaggerated interlacing effects when I try to play it back on my computer.)
Yes, you are correct. But deinterlacing degrades the image, and most hardware (TVs, Blu-ray players) should handle interlaced content fine. You haven't mentioned how you are watching this or what your hardware setup is.

If this is for PC display, you can use a better deinterlacer, but the good ones are too slow for real-time playback (i.e. instead of running it through ffdshow or through an .avs script for real-time playback, you usually have to use an .avs script to re-encode a new file). You can experiment with a few deinterlacing modes in VLC, for example; just toggle through them. But they all pale in comparison to the heavy-duty AviSynth methods. The quality difference is like night and day.
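As an illustration of the heavy-duty AviSynth route, here is a sketch of a double-rate deinterlace script you would re-encode rather than play directly (the file name is a placeholder, and QTGMC and TDeint are just two well-known deinterlacers, each needing its own plugins):

    # deinterlace.avs - feed this to your encoder of choice
    MPEG2Source("capture.d2v")   # 1080i29.97 source
    AssumeTFF()                  # set the correct field order first
    QTGMC(Preset="Slow")         # double-rate deinterlace: 1080p59.94
    # Faster but lower-quality alternatives: TDeint(mode=1) or Yadif(mode=1)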

Originally Posted by deadrats
    4) Last but not least, what is the unit that pixels are measured in? ... If we had a 1440x1080 4:3 video and a 720x480 4:3 video, would the pixels of the first file be the same size as the pixels of the second, or do they just have the same ratio? ... when we resize a video without cropping, are we actually resizing the pixels along with discarding some of them, or are we just discarding pixels?
Pixels are measured in... drumroll... pixels. They are the units. So 1440x1080 is exactly that: 1440 pixels wide by 1080 pixels high. The pixel aspect ratio is the width:height of each pixel. So a 1440x1080 video with 4:3 pixels would be "stretched" to display as 1920x1080; the pixel shape measures 4 wide by 3 high, which is non-square.

    Display Aspect Ratio = Frame Aspect Ratio x Pixel Aspect Ratio

    So for your example
    16/9 = 1440/1080 x 4/3

    And a square pixel 1920x1080 1:1 would be:
    16/9 = 1920/1080 x 1/1

Yes, when you resize downward you are effectively discarding pixels (the resizer computes each new pixel from several of the original ones, so fine detail is lost).
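For example, here is a sketch of resampling the anamorphic 1440x1080 case to square pixels in AviSynth (the source line and file name are placeholders, and Spline36Resize is just one reasonable resizer):

    AviSource("anamorphic_1440x1080.avi")   # 1440x1080 with 4:3 pixels, displays as 16:9
    Spline36Resize(1920, 1080)              # 1440 x 4/3 = 1920: square pixels, same 16:9 DAR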
4. deadrats (Banned) | Join Date: Nov 2005 | Location: United States
    Originally Posted by manono
    Originally Posted by deadrats
1) Assume you have an HDTV that can display 480i, 480p, 720i, 720p and 1080i...
    Assume no such thing. It'll display its native resolution, whatever that might be, and only that resolution. It may accept those other resolutions as input, but they'll be converted one way or another to its native resolution. Maybe the native resolution is 1080p. Check your manual, or give us the TV make and model number.
I find this answer quite vexing. My TV is a 32" RCA (unfortunately, where it's mounted at the moment makes getting the model number a tad difficult) and I could swear its native resolution is 1368x768. Furthermore, my cable box has settings that let you choose between 480i, 480p, 720i, 720p and 1080i, and switching between 720p and 1080i definitely results in a different picture, with 1080i being crisper and more detailed. Likewise, when I had a PS3 and a game that allowed manual adjustment of the output resolution, switching from 720p to 1080i greatly improved the image quality, with the detail increasing as one would expect. If what you say is true, then the settings would have been meaningless, since the output would have been scaled to the native resolution and thus there should not have been any difference between the two outputs; yet there was/is a difference.
  5. Originally Posted by deadrats
I find this answer quite vexing. My TV is a 32" RCA and I could swear its native resolution is 1368x768... switching between 720p and 1080i definitely results in a different picture, with 1080i being crisper and more detailed... if what you say is true, then the settings would have been meaningless... yet there was/is a difference.
Unless you have a CRT HDTV, your TV is converting everything to 1368x768, as manono said. When it's downscaling 1920x1080i, the resulting image has more real detail than when it's upscaling lower resolutions. That's why it looks better. Here's a little thought experiment for you. Say your HDTV accepted a 1x1 pixel video. So you take a 1920x1080 video and downscale it to 1x1 pixels. Send that 1x1 pixel video to your HDTV and let it upscale to 1368x768. How much detail do you think the picture would have? Do you think it would look as good as the original 1920x1080 image scaled down to 1368x768 by the TV?

    Actually your TV is probably scaling everything to about 5 percent bigger than 1368x768 and only showing you the center 1368x768 pixels to simulate overscan.

    And, by the way, many HDTVs can display 1080p input. Mine can display video input at 1920x1080p, 60 fps or 24 fps. 24 fps is displayed at 60 fps by duplicating frames in a 3:2 repeat pattern.

    Originally Posted by deadrats
4) Last but not least, what is the unit that pixels are measured in? In other words, when we say we have a pixel aspect ratio of 1:1, 4:3 or 16:9, what we are really saying is that the pixels are, say, 16 units wide and 9 units tall, but what exactly is that unit?
Until they are displayed, pixels have no size. They only have an aspect ratio, relative dimensions. The size of the pixels is determined by the output device. 1920 pixels spread over a 1 meter wide display means each pixel is 1/1920 m wide, about 0.5 mm. Those same pixels spread over a 2 meter wide display are 2/1920 m wide, about 1 mm.

    The 4:3 and 16:9 ratios that are common in video are not pixel aspect ratios but display aspect ratios (the final shape of the picture that is displayed). In general, any frame size can be displayed with any display aspect ratio. The relationship is:

    DAR = SAR * PAR

    where DAR is the display aspect ratio, SAR is the storage aspect ratio (relative frame dimensions), and PAR is the pixel aspect ratio (the relative width:height of each pixel).

1:1 usually refers to the pixel aspect ratio, i.e. square pixels.
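A quick worked example of that formula, using a standard-definition 16:9 DVD frame (720x480): SAR = 720/480 = 3/2, so PAR = DAR / SAR = (16/9) / (3/2) = 32/27, roughly 1.19. That is the "generic" non-square pixel shape usually quoted for widescreen NTSC DVDs (the ITU-based convention gives a slightly different figure).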
  6. Originally Posted by poisondeathray
    1920x1080i30 is 30 frames per second, but 60 fields that are 1920w x 540h per second. Each field captures a different moment in time.
    Unless it's telecined film. Then that's not true. And that's why no advice can be given about how to deinterlace it, or even the best resolution to use, without an examination of the source - because maybe it just needs an IVTC (which makes it 1080p). Sure, maybe a true interlaced 29.97fps video source is best resized to 540p at either 29.97 or 59.94fps.

    As for what res to resize it for that particular (obsolete) TV, then 1368x768 (if that's what it really is). That way there won't be any further resizing done before it gets displayed. Of course, the resizer used makes a small difference.
7. Gavino (Member) | Join Date: Jul 2009 | Location: Spain
    Originally Posted by deadrats
4) Last but not least, what is the unit that pixels are measured in? ... If we had a 1440x1080 4:3 video and a 720x480 4:3 video, would the pixels of the first file be the same size as the pixels of the second, or do they just have the same ratio? ... when we resize a video without cropping, are we actually resizing the pixels along with discarding some of them, or are we just discarding pixels?
    The notion of Pixel Aspect Ratio (PAR), although sometimes convenient, is actually misleading, and often leads to confusion.

    Real physical pixels on a display device are always 'square'. So when you display a video that has 'non-square pixels', a resizing conversion must be done somewhere along the way, whether it be in your DVD player, TV or software/firmware on your computer. The PAR is a convenient shorthand for saying 'when displaying this video, it must be resized horizontally:vertically in this ratio'.

    At a more technical level:
    Pixels as represented in a digital movie have no shape, they are just numbers, digital samples analogous to CD audio.
    The PAR is really the ratio between horizontal and vertical sampling rates (samples per unit distance).
    Anamorphic video ('non-square pixels') has a horizontal sampling rate that is different from the vertical one. For a given distance, you will have a different number of pixels horizontally than vertically.
    Resizing means changing the sampling rate(s), changing the relative positions of the points on the 'sampling grid', requiring calculation of new sample values by interpolation, etc (which is always approximate, so quality suffers).
  8. Originally Posted by manono
    Unless it's telecined film....
Whoops. Thanks, manono, good point; I shouldn't have assumed anything.
  9. Originally Posted by Gavino
    Real physical pixels on a display device are always 'square'.
    Many devices have non-square pixels. For example, many low end plasma HDTVs have 1024x768 (4:3) native resolution but 16:9 physical dimensions.
10. deadrats (Banned) | Join Date: Nov 2005 | Location: United States
OK, so assume I have a bunch of HDTV captures at 1920x1080i, 20 Mbps, MPEG-2 video with AC3 audio, and some of them display exaggerated interlacing effects when played back on a PC monitor. Further assume that the subject matter is such that I can't just walk into Best Buy and purchase the Blu-ray (basically I have a bunch of hi-def music videos, including live performances of Carrie Underwood with Heart and Taylor Swift with Def Leppard, as well as some Super Bowls, like the Patriots/Giants game), and that I just want to get rid of those annoying interlacing effects. What would be the best course of action?

Note: I'm more than happy to convert to 1080p at 20 Mbps. I couldn't care less about the size of the files; I have 3 terabytes of storage and can add another 3 terabytes for about $200, so size, in this case, doesn't matter.
  11. What is your playback device? Set it to 1080i output and the TV will deinterlace.
12. deadrats (Banned) | Join Date: Nov 2005 | Location: United States
    Originally Posted by jagabo
    What is your playback device? Set it to 1080i output and the TV will deinterlace.
Now I'm really confused. You guys just got through telling me that an HDTV will display its native resolution and only its native resolution; now you're telling me to set the output to 1080i?

OK, now I'm even more confused. I just checked the settings on my TV and there are no settings for output resolution, which means manono is 100% correct. But that leads me to this question: this is an LCD TV, which means it can be used as a computer monitor; shouldn't that also imply that it should be capable of displaying a wide variety of resolutions? Could it be that those settings are only available when it's set to operate as a computer monitor?

Let's ignore the TV for a second before I get a headache thinking about it. Assuming I wish to play the files back only on a computer, how would you recommend I go about deinterlacing the files?
  13. Originally Posted by deadrats
Now I'm really confused. You guys just got through telling me that an HDTV will display its native resolution and only its native resolution; now you're telling me to set the output to 1080i?
    Your TV will deinterlace the incoming 1080i and downscale it to the LCD panel's native resolution.

    Originally Posted by deadrats
this is an LCD TV, which means it can be used as a computer monitor
Not necessarily. If you can get your graphics card to output 1280x720p60 or 1920x1080i30 over HDMI, the HDTV should be able to accept it.

    Originally Posted by deadrats
    shouldn't that also imply that it should be capable of displaying a wide variety of resolutions?
Again, not necessarily, although many do via a VGA or DVI/HDMI input. In most cases the incoming resolution is upscaled or downscaled to the LCD panel's native resolution. In some cases, if the incoming signal is lower resolution than the LCD panel, the image is centered and letterboxed and/or pillarboxed.

    Originally Posted by deadrats
    could it be that those settings are only available when it's set to operate as a computer monitor?
    Does it have a computer monitor setting?

    Originally Posted by deadrats
Let's ignore the TV for a second before I get a headache thinking about it. Assuming I wish to play the files back only on a computer, how would you recommend I go about deinterlacing the files?
    Use a deinterlacing player, deinterlacing video decoder, or deinterlacing video renderer.

If you insist on deinterlacing, you can use the same techniques that are used for standard-definition interlaced video. For telecined film you can inverse telecine back to 24 fps film frames. For interlaced video (e.g. football games) you can deinterlace or bob. You may have trouble playing back 1920x1080p60 (i.e. bobbed 1080i) video, though.
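Here is a sketch of both routes in AviSynth (the file name is a placeholder; TFM and TDecimate come from the TIVTC plugin, and QTGMC is just one example of a good bobber):

    MPEG2Source("capture.d2v")   # 1080i29.97 MPEG-2 capture
    AssumeTFF()

    # Route A - telecined film: field-match and decimate back to the film frames.
    TFM()
    TDecimate()                  # 1080p23.976

    # Route B - true interlaced video (e.g. the football games): comment out
    # Route A above and double-rate deinterlace instead.
    # QTGMC(Preset="Slow")       # 1080p59.94 - heavy to play back, as noted above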
Even your computer software player has deinterlacers available; not very good ones, maybe, but deinterlacers all the same. So I'm not really sure why you're so gung-ho to re-encode these things with the inevitable quality degradation, especially given that, from all I've seen, your encoding skills leave a lot to be desired.

Examine the source. If 1080i and telecined, IVTC and resize. If 720p, decimate to the 'real' frame rate and resize. If pure interlace, deinterlace with a good AviSynth deinterlacer and resize.
Originally Posted by deadrats
    Now I'm really confused. You guys just got through telling me that an HDTV will display its native resolution and only its native resolution; now you're telling me to set the output to 1080i?
Set whatever you use to get the video to the TV set to output 1080i (PS3, Popcorn Hour, the Western Digital thing, whatever). The TV set itself, of course, can't be adjusted; it will only display 768p.
15. edDV (Member) | Join Date: Mar 2004 | Location: Northern California, USA
    Originally Posted by deadrats
OK, so assume I have a bunch of HDTV captures at 1920x1080i, 20 Mbps, MPEG-2 video with AC3 audio, and some of them display exaggerated interlacing effects when played back on a PC monitor...
That is due to not using a deinterlacing PC software player (or a player that uses the display card to deinterlace). But what does this have to do with your 1366x768 TV? You can use the TV as a 1366x768 computer monitor, but in that mode it will not deinterlace.

The problem is that most computer display cards do a poor job of sending interlaced video to the TV so that the TV can do the deinterlacing. This is true even when component analog output is used, because most of the time there is a resize before the D/A conversion. This is why 1080i out of a cable box or DVD player can look much better than 1080i out of a computer.

    You need to think through the full chain.

A 1080i/29.97 file played without deinterlacing on a computer gets weaved to progressive, is sent to the TV as 1920x1080 over DVI-D, and then the TV downscales it to 1366x768 (or larger for overscan). What you see is weaved, then resized, progressive video. Yes, it should look awful. This video should first be deinterlaced in the software player (e.g. CyberLink or VLC) or in the display card (get PureVideo HD or Avivo HD).
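You can reproduce that degenerate chain in AviSynth for comparison (a sketch; the file name is a placeholder): simply resizing the still-weaved interlaced frames smears the comb artifacts across the picture.

    MPEG2Source("capture.d2v")   # 1080i frames with the fields still weaved together
    Spline36Resize(1366, 768)    # resizing across the comb teeth spreads the artifacts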

If the video file is film sourced, IVTC would be used instead of deinterlacing. IVTC results in a 23.976p stream. Most likely your RCA TV won't accept that unless it is converted back to 29.97 fps 1080i or 59.94 fps 720p, so you need to use a hardware player that will inverse telecine, or use a software player and use the TV as a computer monitor.

If your RCA TV has internal IVTC capability, then use a 1080i player like the Western Digital media player and let the TV do the IVTC and the downsizing. In this mode, incoming 1080i gets converted to 23.976p, is downsized to 1366x768, and then frames are repeated 3-then-2 up to 59.94 fps for display.
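For reference, that 3-then-2 frame repeat is the same thing AviSynth's ChangeFPS does when it takes film-rate material up to 59.94 fps. A sketch, purely to illustrate the chain (the file name is a placeholder; TFM/TDecimate are from the TIVTC plugin):

    MPEG2Source("capture.d2v")   # 1080i29.97 capture of film-sourced material
    AssumeTFF()
    TFM()
    TDecimate()                  # inverse telecine: 1080p23.976
    Spline36Resize(1366, 768)    # downsize to the panel resolution
    ChangeFPS(60000, 1001)       # repeat frames in a 3:2 pattern up to 59.94 fps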