VideoHelp Forum
  1. Shibblet (Member, joined Jun 2003, United States)
    I went into a TV retailer today and looked at two LG Televisions...

    50PG20 50" (720p / 1080i) Plasma Display and
    50PG30 50" (1080p) Plasma Display

    Both of which were hooked up to their own Sony BDP-S350 Blu-Ray Player playing Pirates of the Caribbean.

    I stared at these two TVs for a long while, and I can see absolutely no discernible difference between 1080i and 1080p.

    I know that for encoding, progressive material compresses to smaller sizes than interlaced, but on playback I just can't see a difference.

    Anyone notice the same thing?
  2. guns1inger (Always Watching, joined Apr 2004, Miskatonic U)
    How were they hooked up? A lot of stores use component simply because it is cheaper to wire many TVs that way than with HDMI. If that's the case, both sets were actually getting a 1080i signal.
  3. The first is a 768p TV, the second a 1080p. Both display progressive output. You'd be trying to tell the difference between 1366 x 768 and 1920 x 1080. The player outputs 1080p. As guns1inger says, the connection could have something to do with what you're seeing, or the calibration (or lack thereof). And there's no way I'd be happy with a 768p TV with the 1080p sets down in price, even if the 1080p set is maybe $500 more than the other one. Don't fool yourself into thinking you can save some money because they both look the same to you. 768p is obsolete already.
  4. How much difference you can see between a 1366x768 display and a 1920x1080 display also depends on the nature of the video being shown. If the video is very sharp you will see more of a difference. Look especially at things like horizontal window blinds in the background, or a soccer referee with a striped shirt, where the size of each alternating line is close to the resolution of the 1080p display. You will see moire artifacts on the 768p display.
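    The striped-shirt moire above is ordinary aliasing, and it can be sketched in a few lines. This is a toy illustration only, not how any TV's scaler actually works:

```python
# Toy illustration of the moire described above: a 1-pixel stripe pattern
# at 1080 lines cannot survive a naive resize to 768 lines, so stripes
# merge unevenly and a beat pattern appears.

src = [i % 2 for i in range(1080)]   # alternating black/white lines
dst = [src[int(y * 1080 / 768)] for y in range(768)]  # nearest-neighbour

# Measure run lengths in the result. A clean pattern would have uniform
# runs; instead runs of 1 and 2 alternate irregularly -- that is the moire.
runs, n = [], 1
for a, b in zip(dst, dst[1:]):
    if a == b:
        n += 1
    else:
        runs.append(n)
        n = 1
runs.append(n)
print(sorted(set(runs)))   # [1, 2]
```

A real scaler filters rather than picking nearest lines, which trades the hard beat pattern for softness, but detail near the panel's limit is still lost.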

    Also try to view the 1920x1080 display with pixel-for-pixel mapping to get the sharpest image without resizing artifacts.

    Note that the 720p / 1080i label on the first TV denotes the input sources it can display, not what it shows on the screen. It can accept a 1280x720 progressive source and a 1920x1080 interlaced source. Both are resized to the TV's native resolution of 1366x768.

    The 1080p display will accept both those inputs and probably a full 1920x1080 progressive source as well. It will upscale the 1280x720p to 1920x1080p, and deinterlace the 1920x1080i to 1920x1080p.
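    The resizing behaviour described above can be sketched numerically. Illustrative arithmetic only; the function name is made up and real TV scalers are far more elaborate:

```python
# Horizontal and vertical resize ratios each panel applies to its inputs.

def scale_factor(src, dst):
    """Resize ratios from src resolution to dst resolution."""
    return (dst[0] / src[0], dst[1] / src[1])

NATIVE_768 = (1366, 768)     # the "720p/1080i" set's actual panel
NATIVE_1080 = (1920, 1080)   # the true 1080p set

# The 768p panel resizes everything it accepts:
print(scale_factor((1280, 720), NATIVE_768))   # slight upscale, ~1.07x
print(scale_factor((1920, 1080), NATIVE_768))  # 1080 sources downscaled, ~0.71x

# The 1080p panel upscales 720p by exactly 1.5x and shows 1080 sources 1:1:
print(scale_factor((1280, 720), NATIVE_1080))  # (1.5, 1.5)
print(scale_factor((1920, 1080), NATIVE_1080)) # (1.0, 1.0)
```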
  5. Shibblet (Member, joined Jun 2003, United States)
    They were hooked up with HDMI, and the staff went through the menus and showed me the setup. They were set up appropriately.

    However, I don't know a single Blu-Ray player on the market that gives you pixel-per-pixel output.
  6. Originally Posted by Shibblet
    However, I don't know a single Blu-Ray player on the market that gives you pixel-per-pixel output.
    The issue is whether the 1080p TV was displaying each of the 1920x1080 input pixels as one pixel on the screen, or whether it was digitally scaling the image ~5 percent larger and then displaying the center 1920x1080 in order to simulate overscan (this is the default behavior of all HDTVs).
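    The overscan simulation described above works out numerically like this. Hypothetical numbers; the exact overscan percentage varies by set, 5% is used for illustration:

```python
# What a ~5% overscan simulation does to a 1920x1080 frame: scale it up,
# then show only the centre 1920x1080 of the enlarged image.

OVERSCAN = 1.05
w, h = 1920, 1080

scaled_w, scaled_h = round(w * OVERSCAN), round(h * OVERSCAN)
crop_x = (scaled_w - w) // 2    # pixels pushed off each side
crop_y = (scaled_h - h) // 2    # pixels pushed off top and bottom

print(scaled_w, scaled_h)   # 2016 1134
print(crop_x, crop_y)       # 48 27: the edges of the frame are never shown
```

That is why a pixel-for-pixel ("Just Scan") mode matters for judging sharpness: with overscan on, even a 1080p panel is resizing a 1080p source.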
  7. Banned member (joined Oct 2004, Freedonia)
    I'd advise you to consider getting Samsung instead of LG. And get 1080p.
  8. Panasonic is better than either Samsung or LG and far more reliable than those two put together.
  9. Shibblet (Member, joined Jun 2003, United States)
    Originally Posted by jagabo
    Originally Posted by Shibblet
    However, I don't know a single Blu-Ray player on the market that gives you pixel-per-pixel output.
    The issue is whether the 1080p TV was displaying each of the 1920x1080 input pixels as one pixel on the screen, or whether it was digitally scaling the image ~5 percent larger and then displaying the center 1920x1080 in order to simulate overscan (this is the default behavior of all HDTVs).
    Yeah, the Sony TVs have that option, but the LG plasmas do not.

    I went back yesterday and looked at the Sonys instead. Even on the KDL52W4100 I cannot see any discernible difference between 1080i and 1080p. I also checked out the 52LG60 (Scarlet) and can't see a difference on those either.

    As far as 120Hz is concerned, I really like it, especially on 24p output... but whoever created MotionFlow (an in-between frame interpolator, supposedly for more fluid motion) needs their head beat in.

    I think I pissed off the salesman too...

    Edit - The LGs do have a "Just Scan" mode as well as a 16:9 mode. "Just Scan" does give you pixel-for-pixel output. Once again, no discernible difference between 1080i and 1080p. This time, instead of "Pirates" I used "Cars" (the picture on Cars is frickin' amazing).
  10. Originally Posted by Shibblet
    I went back yesterday and looked at the Sonys instead. Even on the KDL52W4100 I cannot see any discernible difference between 1080i and 1080p.
    You shouldn't, really. The i/p issue here is only a matter of how the 24 fps material is transferred from the player to the TV. A decent TV will inverse telecine or smart deinterlace 1080i to restore the 1080p frames.
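    The inverse telecine step mentioned above can be sketched as follows. Heavily simplified: real IVTC operates on actual field images and must detect the 2:3 cadence, which can break at edits; here frames are just integers:

```python
# Sketch of 2:3 pulldown and its inverse, showing why 1080i carrying
# 24 fps film loses nothing a good TV can't get back.

def telecine(frames):
    """2:3 pulldown: each pair of film frames becomes 5 fields."""
    fields = []
    for i, f in enumerate(frames):
        fields += [(f, 'top'), (f, 'bottom')]
        if i % 2 == 1:          # every second frame contributes a 3rd field
            fields.append((f, 'top'))
    return fields

def inverse_telecine(fields):
    """Drop the repeated fields to recover the original film frames."""
    seen, frames = set(), []
    for f, _ in fields:
        if f not in seen:
            seen.add(f)
            frames.append(f)
    return frames

film = list(range(24))                   # one second of film frames
fields = telecine(film)
print(len(fields))                       # 60 fields/s = 30 interlaced frames
print(inverse_telecine(fields) == film)  # True: the progressive frames survive
```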

    Originally Posted by Shibblet
    As far as 120Hz is concerned, I really like it, especially on 24p output... but whoever created MotionFlow (an in-between frame interpolator, supposedly for more fluid motion) needs their head beat in.
    If MotionFlow worked well I would love it. Unfortunately it creates too many artifacts.
  11. edDV (Member, joined Mar 2004, Northern California, USA)
    One issue not mentioned yet for plasmas:

    Plasma power consumption scales with pixel count. You will find that a 1920x1080 panel uses approximately twice the power of a 1366x768 panel. Make sure the picture quality difference is worth the daily expense in power (plus air-conditioning heat load).
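    A quick back-of-envelope check of the roughly-2x claim above:

```python
# Pixel-count ratio behind the "approximately twice the power" estimate
# for plasma panels, where each pixel is its own light emitter.

pixels_1080p = 1920 * 1080    # 2,073,600
pixels_768p = 1366 * 768      # 1,049,088

print(round(pixels_1080p / pixels_768p, 2))  # 1.98, i.e. about 2x
```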

    LCD panel power consumption depends on backlight area and brightness setting. Soon LED backlights will replace fluorescents, allowing a dramatic reduction in power consumption plus improved dynamic contrast.

    Plasmas still rule for true black levels and contrast.
    Recommends: Kiva.org - Loans that change lives.
    http://www.kiva.org/about
  12. tmw (Member, joined Jan 2004, United States)
    Originally Posted by Shibblet
    I stared at these two TVs for a long while, and I can see absolutely no discernible difference between [720] and 1080p.

    Anyone notice the same thing?
    Based on the models you listed, I think the difference comes down to native panel resolution: 1366x768 vs 1920x1080.

    Originally Posted by CNET
    We've done side-by-side tests between two 50-inch HDTVs, one with 1366x768 resolution (a.k.a. 720p) and the other with 1080p resolution, using the same 1080i and 1080p source material, and it was extremely difficult for us to see any difference.
    CNET agrees http://reviews.cnet.com/flat-panel-tvs/lg-50pg30/4505-6482_7-32863629.html

    However, when I went to Best Buy looking at 37"-40" LCD TVs, I could easily see a difference between the 720p and 1080p sets. The Samsung sets were really nice, quite a bit better than the others I saw. Could your observations be driven by plasma vs. LCD?

    I probably wouldn't easily notice 720p without the side-by-side comparison (unlike SD), but I could easily tell the difference watching LCD sets on the wall.

    In the end, you'll be the one watching it, right? I'm still hoping my 25" Magnavox from '93 will break so I can actually justify spending the money on a new set. Please, break already--it's over 15 years old, and I need the HDTV....
  13. edDV (Member, joined Mar 2004, Northern California, USA)
    Originally Posted by tmw

    However, when I went to Best Buy, looking at 37"-40" LCD TV's, I noticed an easy difference between the 720p and 1080p sets. The Samsung sets were really nice, quite better than others I saw. Could your observations be driven by plasma compared to LCD?

    I probably wouldn't easily notice 720p without the side-by-side comparison (unlike SD), but I could easily tell the difference watching LCD sets on the wall.
    There is an additional issue with current LCD models. 720p is currently offered only on entry-level budget models. What you are seeing is the lowest level of image processing, which is tied to 720p (e.g. Level 3/4 for Samsungs).

    Mainstream plasmas have now moved up from 1024x768p to 1366x768p with top end models going to 1920x1080p.

    The image processors offered for larger 1366x768p and 1920x1080p models are usually upper range (e.g. level 6 or level 7 for Samsungs) for both panel resolutions. You are left comparing only panel resolutions. If you are watching from >2 meters back, you probably will see no difference at 42" and little difference at 50".

    You will notice large differences between Samsung Level 3 and Level 6/7 image processors. Most brands offer 3-5 levels of image processor quality in the current market.
  14. Member (joined Jun 2005, Portugal)
    For me it is better to choose an LCD TV rather than a plasma.

    Plasma does have the ability to show deeper blacks, and better viewing angles than LCD.

    Nevertheless, more and more LCDs are entering the market with viewing angles equal to or greater than some plasmas. They also have higher native resolutions than plasmas of similar size (more pixels on the screen), they consume less power, and they have a long life.

    Your first choice should be the size of your TV.

    You should sit at 3 or 4 times the size of your TV. For example:

    1" is approximately 2.54 cm, so a 40" TV = 101.6 cm, and 3 x 101.6 cm = 304.8 cm, about 3 meters.

    Therefore with a 40" TV your sofa/seat should be at least 3 meters away.

    And whether you get a 720p or a 1080p set, believe it or not, you will be disappointed, just because most broadcast signals are not HD.

    The higher the resolution, the more artifacts you will notice from a bad input signal.

    cheers
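    The rule-of-thumb arithmetic above, as a small helper. Purely illustrative; the multiplier is a matter of taste, and other posters in this thread prefer 1.5x-2x the diagonal for 1080p sets:

```python
# "Sit at 3 to 4 times the screen size" as a one-line calculation.

INCH_CM = 2.54

def viewing_distance_m(diagonal_inches, multiplier=3):
    """Recommended seating distance in meters."""
    return diagonal_inches * INCH_CM * multiplier / 100

print(round(viewing_distance_m(40), 2))     # 3.05 m for a 40" set at 3x
print(round(viewing_distance_m(50, 4), 2))  # 5.08 m for a 50" set at 4x
```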
  15. A 40" TV at 9' will start to feel small very fast. I'd go at least 46"-50" at 9'. I had a 42" at about 10'; it only took a couple of months until I wanted a 50". The 50" at 10' is about right.
  16. Member (joined Jun 2005, Portugal)
    1' = 0.3048 meters, right?

    Man, not for me. Sitting just 3 meters from a 46" or 50" TV... ouch, my eyes.

    I just can't face a huge TV very close. In a movie theater (cinema) I would just focus on a small part of the screen.

    So, what is most recommended is a 3 x TV size distance ratio.

    It is simply more comfortable for the eyes, and you can easily follow the scenes in full.
  17. I find about 1.5 times the TV's diagonal size is good for 1080p sets with good material. So a 50" 1080p HDTV should be viewed from 6 feet away.
  18. edDV (Member, joined Mar 2004, Northern California, USA)
    Different taste segments here.

    At one extreme you have SMPTE and THX theater seating specs that recommend a 36 degree (THX) or over 30 degree (SMPTE) viewing angle for the ideal theater seat. A 50" diagonal TV would be best viewed from 5.6 feet (THX) or up to 6.8 feet (SMPTE). This is for the immersive movie experience.
    http://myhometheater.homestead.com/viewingdistancecalculator.html

    Other tests show 1080p resolution detail will be lost if you sit back more than about 6.5 feet from a 50" screen.

    For smaller screen sizes or greater seating distance, 1366x768p screens are fine.
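    The THX and SMPTE figures quoted above can be reproduced with a little trigonometry, assuming a 16:9 screen. `distance_ft` is an illustrative helper, not part of either spec:

```python
import math

# Seating distance for a given horizontal viewing angle on a 16:9 screen.
# Reproduces the 5.6 ft (THX, 36 deg) and 6.8 ft (SMPTE, 30 deg) figures
# quoted for a 50" diagonal.

def distance_ft(diagonal_in, angle_deg):
    width = diagonal_in * 16 / math.hypot(16, 9)             # screen width, inches
    d = (width / 2) / math.tan(math.radians(angle_deg) / 2)  # distance, inches
    return d / 12                                            # inches -> feet

print(round(distance_ft(50, 36), 1))  # 5.6 (THX)
print(round(distance_ft(50, 30), 1))  # 6.8 (SMPTE)
```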
  19. The THX and SMPTE recommendations are too close IMO; I prefer a viewing distance of about 2x the diagonal.
  20. Member (joined Jun 2005, Portugal)

    [viewing-distance chart graphic]
    As far as I know there is a THX recommended distance for audio, and one for movies.

    Whatever the case, for a 40" set the recommended distance is more than 2 meters.

    But this is something personal, because I was once put in the front rows of a movie theater and felt very uncomfortable... I had to move my head all over to follow the movie.

    Those front rows in a THX high-tech movie theater are always empty... people choose the back seats.

    3 x TV size is what in practice is comfortable for the eyes; it is a little more than the 2.2 meters on the graphic.

    For example: a 40" TV and a 3 meter viewing distance is perfect for me (3 x 101 cm, with 1" = 2.54 cm).
  21. Originally Posted by Delta2
    But this is something personal, because I was once put in the front rows of a movie theater and felt very uncomfortable... I had to move my head all over to follow the movie.
    That's why it's called immersive. If you want an immersive experience you need a wide angle of view. If you want a detached 3rd person experience you want a narrower angle.
  22. edDV (Member, joined Mar 2004, Northern California, USA)
    Optimal seating in most theaters is 1/3 to 2/3 back in the center.

    Back in the '70s-'80s, SMPTE urged members to rate individual theaters with a ~25-question list and a postage-free mailer.
  23. Member (joined Jun 2005, Portugal)
    Originally Posted by jagabo
    Originally Posted by Delta2
    But this is something personal, because I was once put in the front rows of a movie theater and felt very uncomfortable... I had to move my head all over to follow the movie.
    That's why it's called immersive. If you want an immersive experience you need a wide angle of view. If you want a detached 3rd person experience you want a narrower angle.

    I know what you mean. If you go to an IMAX theater you get the full first-person experience.

    http://en.wikipedia.org/wiki/IMAX

    But that uses very wide-angle camera shots; the experience is totally different, and the hardware used is different.

    It is not a flat widescreen like a regular movie theater, or at home.

    I enjoy IMAX, but not a huge flat screen... if you understand what I mean, it is very uncomfortable for my eyes, unnatural.
  24. Shibblet (Member, joined Jun 2003, United States)
    Viewing distance x 3 or 4 on a 50"? (12.5 feet or 16 2/3 feet)

    Wouldn't the differences between 720p, 1080i, and 1080p start to look the same at that point? I mean, regardless of how good your eyesight is. LCD, plasma, DLP (POS), it wouldn't matter at that distance.

    I have to agree with jagabo: diagonal x 2 sounds more realistic, or at least close enough to benefit from the higher resolution.
  25. Member (joined Jun 2005, Portugal)
    Originally Posted by Shibblet
    Viewing distance x 3 or 4 on a 50"? (12.5 feet or 16 2/3 feet)

    Wouldn't the differences between 720p, 1080i, and 1080p start to look the same at that point? I mean, regardless of how good your eyesight is. LCD, plasma, DLP (POS), it wouldn't matter at that distance.

    I have to agree with jagabo: diagonal x 2 sounds more realistic, or at least close enough to benefit from the higher resolution.
    I know; I gave the recommendations myself, in a graphic that is easy to understand.

    I tested those distances again in a store, and for an HD movie I believe about 2x is a more immersive ratio. I absolutely agree with you and them.

    But what about watching CNN and Sky News and general broadcast programs?

    Would you want to be immersed in CNN News? What's the point?

    I don't know; in my tests I found that for a 46" I must be at least 3 meters away, both for immersion and for general use.

    But I don't have personal experience of this, because I haven't been lucky enough to own a good TV.
  26. Member (joined Aug 2003, north east england)
    Originally Posted by manono
    768p is obsolete already.
    And how do you get that? Look at the Kuros: they are still 768p and I bet they give a better picture than the 1080p models you're on about, even the Pannys. Yes, Pioneer are bringing out 1080p models, but you would have to have at least a 50" and sit about 5 ft away to notice 1080p over 768p, and it wouldn't be a wow factor either. There is still life in the 768p models yet.
  27. How do I get that? Easy. They were obsolete (and inferior) even when they were producing a lot of them. DVD players upscale to 720p, 1080i, and/or 1080p. None upscale to 768p (except for a couple that let you set the upscaling resolution). Therefore no 1:1 pixel mapping. Either you output 480i analog from the player, go through the analog-to-digital conversion (with the resulting noise) and resize in the TV, or you output digital 720p or 1080i/1080p from the player and have the TV do yet another resize. Yes, there are some other possibilities, but I'm trying to keep it simple. A decent player usually has a better resizer and keeps the signal digital the whole way (over DVI/HDMI), which results in a better picture. HD TV broadcasts are 720p or 1080i. No 768p. And if the TV happens to be 1024x768, as is/was only too common, then you're even worse off. Most Blu-rays are 1080p; none are 768p. Even 720p, which is a much better choice if you don't get a 1080p set, is on the way out.

    Sorry, but I don't know what a Kuro is, but:
    i bet they give a better picture than the 1080p models your on about
    I wouldn't know as I had enough sense to steer clear of the 768p sets. I've owned a 720p Samsung and now have a 1080p Sony Bravia. As a way to future-proof himself, at least for a few years, and considering Shibblet seems interested in watching Blu-Ray movies, 1080p is the way to go, and I don't even see any reason for discussion unless the several hundred dollar higher price for the same sized set is a consideration. Remember that he's buying now. Maybe 768p was OK in the past, as a stopgap until the higher definition sets came down in price, but for the future? There's no question in my mind which a person should buy.

    Yes, the size of the set and the viewing distance are important too, but Shibblet was looking at 50 inch sets, plenty big, if viewed from the right distance, to enjoy the higher definition.
  28. edDV (Member, joined Mar 2004, Northern California, USA)
    I'll say it again: too many people confuse pixel resolution with picture quality, when bit rate is more predictive of perceived image quality than resolution alone. Resolution becomes a factor only for immersive viewing.

    Blu-ray 24p MPEG-2 at 1920x1080p @ 25 Mb/s may look better than 29.97 fps ATSC 1920x1080i @ a typical 16 Mb/s, but that is because ATSC sends 20% redundant telecined frames* and suffers network/OTA transmission losses.

    HDCAM 1440x1080i or 1280x720p @ 144Mb/s looks far better than BluRay. DVCProHD 1280x1080i at 100Mb/s is superior as well unless you nitpick diagonals in detailed graphics or blow it up to theater screen size.

    One can argue that at low 16-25 Mb/s MPeg2 bit rates, we would have superior picture quality with ~1440x1080 or 1366x768 encoding and 1:1 display panels at normal viewing distances.
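    A rough bits-per-pixel comparison of the formats mentioned above. Back-of-envelope only; interlaced formats are treated as full frames at the field-pair rate:

```python
# Bits per pixel = bitrate / (pixels per frame * frames per second).
# Higher values mean less compression strain for a given codec.

def bits_per_pixel(mbps, width, height, fps):
    return mbps * 1e6 / (width * height * fps)

print(round(bits_per_pixel(25, 1920, 1080, 24), 2))     # 0.5  Blu-ray MPEG-2
print(round(bits_per_pixel(16, 1920, 1080, 29.97), 2))  # 0.26 ATSC 1080i
print(round(bits_per_pixel(144, 1440, 1080, 29.97), 2)) # 3.09 HDCAM
```

The roughly 12x gap between ATSC and HDCAM is the point: the broadcast bit rate, not the pixel grid, is the binding constraint.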

    1920x1080 was chosen in anticipation of much larger home screens and improved compression quality at broadcast sub-19 Mb/s rates. Five years into ATSC, VC-1 and H.264 promise a 2-3x compression efficiency improvement over MPEG-2. Maybe by 2012 these standards will be added to ATSC as optional formats. This would of course force us all onto new ATSC tuners or cable boxes.

    My bet is that the addition of VC-1 and/or H.264 to ATSC will not be used to improve picture quality at 1920x1080, but instead to cram more program streams into 19 Mb/s. This is already the case where AVC is being applied to DirecTV satellite transmission and European terrestrial broadcasting.


    * I agree that redundant telecined frames do not consume much additional MPEG-2 bitrate.
  29. FulciLives (Member, joined May 2003, Pittsburgh, PA in the USA)
    I sit about 6 - 7 feet away from a 51 inch 16x9 WS HDTV that is a 1080i CRT RP by Hitachi.

    Looks super great to me *shrugs*

    I'm used to it now but when I first got it I put KILL BILL VOL. 1 on via component 480p from a DVD player and WOW I remember being BLOWN AWAY by it.

    Biggest TV I had prior to this was a 27 inch CRT SDTV.

    - John "FulciLives" Coleman
    "The eyes are the first thing that you have to destroy ... because they have seen too many bad things" - Lucio Fulci
    EXPLORE THE FILMS OF LUCIO FULCI - THE MAESTRO OF GORE
  30. Member (joined Sep 2008, United States)
    Will your TV display 1080p if the service provider is sending a 1080i signal?

    I have a Sony Bravia 1080p TV (HDMI hookup) but only get 1080i reception.