VideoHelp Forum
  1. Hi guys, I have a Panasonic HDC-SD5.
    This cam has two record sizes, 1920x1080 and 1440x1080. Logically, the 1920 recording should look wider.
    But when I watch both videos in Media Player Classic, they look exactly the same.
    So there seems to be no distortion in the width or height at all.
    But I noticed that the 1440 video's properties show 1920 in one place and 1440 in another.

    What is going on here? I am trying to figure it out but cannot. Please see the files below.

    Advice please.
    KR
    KC



    Last edited by khan.cross; 12th Aug 2010 at 09:54.
  2. Frame size does not equal display aspect ratio. Many codecs and containers support non-square pixels or display aspect ratio flags. With those any frame size can encode any aspect ratio.
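    To see this concretely: the displayed shape is the stored frame size multiplied by the pixel aspect ratio (PAR) flag. A minimal sketch in Python (the function name is mine, just for illustration):

    ```python
    from fractions import Fraction

    def display_aspect_ratio(width, height, par):
        """Display aspect ratio = (stored width/height) x pixel aspect ratio."""
        return Fraction(width, height) * par

    # 1920x1080 with square (1:1) pixels displays as 16:9
    print(display_aspect_ratio(1920, 1080, Fraction(1, 1)))  # 16/9
    # 1440x1080 with 4:3 (wider-than-square) pixels also displays as 16:9
    print(display_aspect_ratio(1440, 1080, Fraction(4, 3)))  # 16/9
    ```

    Same 16:9 result from two different frame sizes, which is exactly why both clips look identical in the player.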
  3. Originally Posted by jagabo View Post
    Frame size does not equal display aspect ratio. Many codecs and containers support non-square pixels or display aspect ratio flags. With those any frame size can encode any aspect ratio.
    what?
  4. Originally Posted by khan.cross View Post
    what?
    The video file includes a flag that tells the player what shape to display the video.
  5. Member edDV
    The same way a 720x480 or 720x576 DVD displays 16:9.
    Recommends: Kiva.org - Loans that change lives.
    http://www.kiva.org/about
  6. Ray The Video Guy (http://www.iwantvideo.tv)
    To keep it simple: for display purposes, 1440 and 1920 are the same. One uses pixels that are square, so it takes 1920 of them to fill the width of the picture. The other uses pixels that are rectangular (wider than square), so the video only needs 1440 of them to fill the same width. When you play back the video, most players and TVs recognize which format it is and adjust how they display the video to fill the screen.
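    Working backwards, you can compute the pixel shape the player has to use. A small sketch (function name is mine, for illustration):

    ```python
    from fractions import Fraction

    def required_par(width, height, dar):
        """Pixel aspect ratio a player must apply so width x height fills dar."""
        return dar / Fraction(width, height)

    # 1440x1080 needs 4:3 (wider-than-square) pixels to fill a 16:9 screen
    print(required_par(1440, 1080, Fraction(16, 9)))  # 4/3
    # 1920x1080 needs plain square pixels
    print(required_par(1920, 1080, Fraction(16, 9)))  # 1
    ```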

    Ray The Video Guy - Host of 'I Want Video!' http://www.iwantvideo.tv
  7. So in plain words, both record modes are the same? It does not matter whether it's 1920 or 1440, the output is always the same.
    Correct?
  8. aedipuss
    not quite the same. the output display is the same size, but the source files are different. 1920 uses square pixels, 1440 uses wide 4:3 shaped pixels. the recorded bitrate is probably lower also. for the best overall quality use the camera's highest 1920 setting. the file size will also be larger.
    --
    "a lot of people are better dead" - prisoner KSC2-303
  9. Member Alex_ander
    Display proportions are the same, 1920 version gives more detailed, horizontally sharper image.
  10. Ray The Video Guy (http://www.iwantvideo.tv)
    Relatively the same. They will both fill up your HDTV and look identical. As mentioned above, the extra pixels will make the video a tiny bit sharper, but I can't imagine it being noticeable on any TV.

  11. Originally Posted by Alex_ander View Post
    Display proportions are the same, 1920 version gives more detailed, horizontally sharper image.
    That's why I am asking: I cannot see that much difference in quality. The 1920 recording has more sharpness, but for me it is too sharp, because the dots are much more visible than at 1440.
  12. Originally Posted by khan.cross View Post
    the 1920 resolution has more sharpness but for me it is too much sharpness
    That's more likely a difference between the camcorders and type of compression used.
    Last edited by jagabo; 13th Aug 2010 at 09:49.
  13. Member edDV
    I hope this doesn't complicate matters further but when you start relating 1920x1080 vs. 1440x1080 to picture quality,
    you also need to include bit rate and compression codec in the discussion.

    Most AVCHD camcorders use a 24 Mbps bit rate for 1920x1080 and 16 or 17 Mbps for 1440x1080. 24/16 = 1.5,
    close to the 1920/1440 ≈ 1.33 pixel ratio, so the bit rate per horizontal pixel is roughly equivalent (in fact
    slightly higher for 1920). The result is proportionally increased horizontal resolution with similar
    compression artifacts.
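    The per-pixel comparison above is easy to check numerically, using the typical AVCHD figures quoted (24 Mbps at 1920 wide, 16 Mbps at 1440 wide):

    ```python
    # Bit rate per horizontal pixel (kbps per pixel column) for each mode.
    def kbps_per_column(bitrate_mbps, width):
        return bitrate_mbps * 1000 / width

    print(round(kbps_per_column(24, 1920), 2))  # 12.5
    print(round(kbps_per_column(16, 1440), 2))  # 11.11
    ```

    So the two modes land within about 10% of each other per horizontal pixel, with 1920 slightly ahead.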

    As implied above, many camcorder optics and/or HDTV sets will show little difference between 1920 and 1440 settings.
    But a camcorder with high quality optics/sensor and processing will show a difference when displayed on a high end
    1920x1080 HDTV.

    Home camcorder recordings often have long-term value. In five years, when you get a better HDTV, you are more
    likely to see the difference. The situation is similar to shooting EP vs. SP on VHS.
  14. Member edDV
    Next level of discussion is whether horizontal resolution or bit rate is more important to picture quality. Increasing bit rate
    will decrease compression artifacts.

    So in this context many producers would rather shoot 1440x1080 at 24 Mbps (50% more bit rate per
    horizontal pixel than the usual 16 Mbps) for fewer compression artifacts.
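    The 50% figure comes straight out of the bits-per-pixel arithmetic. A quick check, using an assumed 30 fps (the frame rate cancels out of the ratio anyway):

    ```python
    # Bits available per stored pixel per frame for the two 1440x1080 choices.
    def bits_per_pixel(bitrate_bps, width, height, fps):
        return bitrate_bps / (width * height * fps)

    hi = bits_per_pixel(24e6, 1440, 1080, 30)  # 1440 shot at 24 Mbps
    lo = bits_per_pixel(16e6, 1440, 1080, 30)  # 1440 at the usual 16 Mbps
    print(round(hi / lo, 2))  # 1.5 -> 50% more bits per pixel
    ```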