VideoHelp Forum
  1. I'm looking at specifications of full HD AVCHD consumer camcorders, and I cannot understand why they all have interlaced modes, which are useless, since every LCD display is progressive.

    Some don't have any progressive modes at all, only interlaced full HD, HD, and SD modes. With such camcorders, I have to deinterlace the footage after editing and before compression. For me, these camcorders don't make sense. Why shoot interlaced and then deinterlace to progressive, instead of shooting progressive to begin with?

    Some have 1080p and 720p modes, but only at 24 or 25 fps. I mean, I was expecting at least 30 fps.
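    For reference, the deinterlace-then-compress step I mean looks something like this - a minimal sketch only, assuming ffmpeg (with its yadif deinterlacer) is installed, with placeholder filenames:

        import subprocess

        # Deinterlace 1080i footage (one output frame per field pair),
        # then encode to progressive H.264. Filenames are placeholders.
        subprocess.run([
            "ffmpeg", "-i", "edited_1080i.avi",
            "-vf", "yadif=mode=0",              # mode 0: one frame per pair of fields
            "-c:v", "libx264", "-crf", "18", "-preset", "slow",
            "-c:a", "copy",
            "final_1080p.mp4",
        ], check=True)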
  2. Most of them don't have full 1080p50 or 1080p60 because there are no broadcast standards for those formats; the broadcast standards are 1080i25 and 1080i30. People who have camcorders with 1080p50 or 1080p60 support often have trouble dealing with the video. Blu-ray doesn't support those formats, for example. This forum is full of threads about how to get such video onto Blu-ray discs (by converting to 720p50/60 or 1080i25/30).
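    As a rough illustration only, those conversions can be sketched with ffmpeg (filenames are placeholders and assume a 1080p60 source):

        import subprocess

        # Option A: fold 1080p60 into Blu-ray-legal 1080i30 using ffmpeg's interlace filter.
        subprocess.run([
            "ffmpeg", "-i", "in_1080p60.mp4",
            "-vf", "interlace=scan=tff",                  # pairs of frames become TFF fields
            "-c:v", "libx264", "-flags", "+ildct+ilme",   # interlaced-aware encoding
            "-crf", "18", "-c:a", "copy", "out_1080i30.mp4",
        ], check=True)

        # Option B: stay progressive but drop to Blu-ray-legal 720p60.
        subprocess.run([
            "ffmpeg", "-i", "in_1080p60.mp4",
            "-vf", "scale=1280:720",
            "-c:v", "libx264", "-crf", "18",
            "-c:a", "copy", "out_720p60.mp4",
        ], check=True)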
  3. Originally Posted by codemaster View Post
    For me, these camcorders don't make sense.
    For you, they don't. Well said.

    The legacy of broadcast formats will be with us for a while. Perhaps you should expand your search to DSLRs or other options.
  4. Originally Posted by codemaster View Post
    I'm looking at specifications of full HD AVCHD consumer camcorders, and I cannot understand why they all have interlaced modes, which are useless, since every LCD display is progressive.
    They are progressive - they only produce a stream that could be called "fake interlace" in order to provide a BD-compliant video stream.

    Originally Posted by codemaster View Post
    Some have 1080p and 720p modes, but only at 24 or 25 fps. I mean, I was expecting at least 30 fps.
    More or less all consumer camcorders will provide 1080p30 video (it can be 1440x1080p30); some can provide only 720p, but then it should be 720p60.
  5. Originally Posted by pandy View Post
    More or less all consumer camcorders will provide 1080p30 video (it can be 1440x1080p30); some can provide only 720p, but then it should be 720p60.
    No, there are many that shoot 1080p50 or 1080p60 now.
  6. Originally Posted by jagabo View Post
    Originally Posted by pandy View Post
    More or less all consumer camcorders will provide 1080p30 video (it can be 1440x1080p30); some can provide only 720p, but then it should be 720p60.
    No, there are many that shoot 1080p50 or 1080p60 now.
    However, those will not create interlaced 1080i30; they will provide 1080p60 video.
  7. It is quite simple: do not record interlaced footage; get a 50p/60p camcorder.
  8. Originally Posted by _Al_ View Post
    It is quite simple: do not record interlaced footage; get a 50p/60p camcorder.

    In this day and age, for sure

    Even many point-and-shoot models (not "camcorders") shoot 1080p60 now.
  9. Originally Posted by _Al_ View Post
    It is quite simple: do not record interlaced footage; get a 50p/60p camcorder.
    And then come here and read all the threads about how to deal with it. Blu-ray players won't play it. Most standalone media players can't handle it. Even many computers aren't up to it.
  10. Well, we have to tell them, then.
    Media players will play it, I bet; the chipsets can do that now, unless the firmware is screwed up. There is this assumption that because my media player did not handle the original 50/60p transport stream, it will not handle 50/60p in general. Re-encode it to an easier profile and wrap it up in MP4, or better MKV, and it might work. For Blu-ray we have to encode anyway. For somebody insisting on using Blu-ray there is AVCHD 2.0.

    Computer playback - yes, a somewhat beefier PC is needed to play the original files.
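    A minimal sketch of that re-encode with ffmpeg and libx264 (the profile/level and filenames are only guesses at what a given player's chipset accepts - check the player's specs):

        import subprocess

        # Re-encode a 1080p60 AVCHD 2.0 transport stream to a less demanding
        # H.264 profile/level and remux into MKV for a standalone media player.
        subprocess.run([
            "ffmpeg", "-i", "camera_1080p60.mts",
            "-c:v", "libx264", "-profile:v", "high", "-level", "4.2",
            "-crf", "20", "-preset", "medium",
            "-c:a", "ac3", "-b:a", "384k",
            "player_friendly.mkv",
        ], check=True)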
  11. Next we'll hear: "Why won't consumer cameras do 4K?"
  12. Why only 4K? Go Super Hi-Vision, 8K (7680x4320) http://en.wikipedia.org/wiki/Ultra-high-definition_television. And any gamer will tell you 60 Hz isn't good enough. We need 120 Hz or more.
  13. Just no interlace. The broadcast and home spheres have little connection now; that consensus from the past seems to be gone, yet most people think it isn't. You can chop progressive up to make it interlaced, but to go the other way some magic is needed - and they don't tell you that to get better performance (better magic) you have to pay more for it...
  14. Originally Posted by _Al_ View Post
    just no interlace
    There are no interlaced sensors (CCDs and CMOS sensors are progressive). Previously TV also used photosensitive tubes (vidicons and others - https://en.wikipedia.org/wiki/Video_camera_tube ), and those tubes could scan the image in a natively interlaced way; nowadays this is not possible (OK, in theory you could emulate interlaced readout, but there is no point in doing so).

    Interlace is usually created virtually, by syntax manipulation. Sometimes, on older cams, it was there because they (Sony?) used reduced horizontal resolution: they could capture anamorphic 1440x1080p60, which was then converted to anamorphic 1440x1080i30.

    Originally Posted by jagabo View Post
    Why only 4K? Go Super Hi-Vision, 8K (7680x4320) http://en.wikipedia.org/wiki/Ultra-high-definition_television. And any gamer will tell you 60 Hz isn't good enough. We need 120 Hz or more.
    UHD, and especially Super Hi-Vision, looks OK to me. 60 Hz is OK, but the difference between 100 Hz and 60 Hz can be seen without any trouble (the threshold for the human visual system is usually around 80-86 Hz) - we can perceive refresh rates higher than 60 Hz.
  15. sanlyn
    Originally Posted by codemaster View Post
    I'm looking at specifications of full HD AVCHD consumer camcorders, and I cannot understand why they all have interlaced modes, which are useless, since every LCD display is progressive.
    Not everyone uses an LCD.
  16. Originally Posted by sanlyn View Post
    Originally Posted by codemaster View Post
    I'm looking at specifications of full HD AVCHD consumer camcorders, and I cannot understand why they all have interlaced modes, which are useless, since every LCD display is progressive.
    Not everyone uses an LCD.
    Yes, 0.1 percent of the HDTV population still uses interlaced CRT. So there should be no progressive modes at all.
  17. sanlyn
    Nevertheless, not everyone uses LCD's. The official percentage of CRT's is low, I realize, but my folks still have 32", 24" and 37" CRT's (one is HD), my brother-in-law in NY, several members of our local movie club, a number of my PC customers, a great many friends and relatives in 4 states -- excuse me, 6 states -- every hotel and B&B I stayed in when I went to the UK, three rest homes I've visited in my home area, the Broadcasting Museum in NY City, a pro video processing lab in Corona, NY, that I toured, which works in both SD and HD, and on and on. And I still have a 24" I use often. From what I've seen, I think 0.1% is a little low. My own home has the CRT, a plasma, and one LCD. So in my house the percentage is 66.6% non-LCD.

    No, not all of those are HD. Not everyone cares that they aren't. And, yes, I know what you mean.
  18. Your argument is stupid. There are still people who use horses for transportation too. So there should be no automobiles? Should all our roads be designed to accommodate horse-drawn buggies?

    The only benefit of interlacing is reduced bandwidth. All HD should have been progressive from the start. It was shortsighted for the industry to adopt interlaced modes for HD broadcast and recording. Interlaced HD made sense in Japan's early analog HD system, but not in the USA, where HD broadcast was MPEG-2 compressed from day one. Once that was decided, it became necessary for every TV to have a computer and frame buffer inside it. Once you have that, it's much easier to convert progressive to interlaced than the other way around. Frame rate conversion also becomes trivial.
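    The progressive-to-interlaced direction really is just a weave of line pairs from consecutive frames. A minimal sketch of the idea in Python/NumPy (dummy frames, not a real capture pipeline):

        import numpy as np

        def weave_to_interlaced(frame_a, frame_b):
            """Fold two consecutive progressive frames into one interlaced frame:
            even lines (top field) from frame_a, odd lines (bottom field) from frame_b."""
            interlaced = frame_a.copy()
            interlaced[1::2] = frame_b[1::2]
            return interlaced

        # Two dummy 1080-line frames from a 60p source become one 30i frame (top field first).
        a = np.zeros((1080, 1920, 3), dtype=np.uint8)
        b = np.full((1080, 1920, 3), 255, dtype=np.uint8)
        field_pair = weave_to_interlaced(a, b)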
  19. Originally Posted by jagabo View Post
    It was shortsighted for the industry to adopt interlaced modes for HD broadcast and recording.
    And many of us were hoping that HD would once and for all eliminate the PAL/NTSC frame rate nonsense. Boy were we naive.
  20. I'd like to say that having a CRT has basically nothing to do with choosing an interlaced or progressive HD camcorder. It does not matter.

    I could use an interlaced SD mode on a particular camcorder model, but then I do not need an HD camcorder - no HD is involved.
  21. sanlyn
    I understand all that, jagabo. But a lotta people still own, use, and want CRT's. Yep, I think most new material looks fantastic on my HDTV, I'm in favor of all the advancements (except LCD, which has to be someone's idea of a video joke), and I'm with you on all the tech issues, but ...a large part of the planet hasn't gone with it. Yet.
  22. Originally Posted by sanlyn View Post
    a lotta people still own, use, and want CRT's.
    As _Al_ pointed out, the choice of CRT has nothing to do with interlaced vs. progressive. There have been both interlaced and progressive CRT displays. Progressive CRTs became common in the 80's. The TV industry just went with what it knew, interlaced video.
  23. How does interlaced or progressive broadcasting prevent me from having a CRT? I receive digital HD from an antenna, run it through a converter box, and watch it on a CRT. Some football is 720p and it is just fine.
  24. sanlyn
    Yes, I know. Looks fine on all the good-quality CRT's I see, as well. Eventually the conventions of old broadcast standards will pass on and the world will be at peace again.
  25. Overlooked in this conversation are bandwidth limitations. Even my camcorder from 2-3 years ago records 1080i60 at 25 Mbps and shows H.264 artifacts. 1080p60 means you're compressing twice as much information into the same 25 Mbps. Obviously a lot of that is semi-redundant information, so it doesn't scale exactly 2:1, but the point is that several years ago the recording bandwidth and the compression engines on the chips couldn't handle 1080p60 without producing something worthy of YouTube instead of a Blu-ray. That's why the cameras were 1080p24 or maybe 1080p30. IMHO both of those formats are terrible for capturing "real life" because of stutter, which left us with 1080i60 back then and 1080p60 now.
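    Rough numbers behind that "twice as much information" point (raw pixel rates only - the 25 Mbps recording bitrate stays fixed):

        width, height = 1920, 1080

        pixels_1080i60 = width * height * 30   # 60 fields/s cover 30 full frames of picture area per second
        pixels_1080p60 = width * height * 60   # 60 full frames per second

        print(pixels_1080i60 / 1e6)            # ~62.2 Mpixels/s
        print(pixels_1080p60 / 1e6)            # ~124.4 Mpixels/s
        print(pixels_1080p60 / pixels_1080i60) # 2.0 - twice the raw data into the same 25 Mbps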
  26. Cornucopia
    Originally Posted by codemaster View Post
    I'm looking at specifications of full HD AVCHD consumer camcorders, and I cannot understand why they all have interlaced modes, which are useless, since every LCD display is progressive.

    Some don't have any progressive modes at all, only interlaced full HD, HD, and SD modes. With such camcorders, I have to deinterlace the footage after editing and before compression. For me, these camcorders don't make sense. Why shoot interlaced and then deinterlace to progressive, instead of shooting progressive to begin with?

    Some have 1080p and 720p modes, but only at 24 or 25 fps. I mean, I was expecting at least 30 fps.
    AVCHD 1.0 does NOT support 1080p60/50, only AVCHD 2.0 does, and machines compatible with that spec are still rare.

    Cameras have interlaced mode for bandwidth budgeting and historical/legacy reasons, plain & simple. But hell, while it isn't optimal, well-done Interlaced isn't THAT bad. Personally, I find it easier to watch than 1080p30 - I need my motion to be finer than low-framerate-progressive can give me.

    You are wrong about 720p, though. Almost all cameras that do 720p do so at the full 50/60 fps (though some have the option of film-style 24p). And if you have a camcorder made for Romania, you SHOULD be using 25/50 formats, not 30/60. It's not surprising you wouldn't find many 30p cameras where you are...

    Finally, and very importantly, you DO NOT HAVE TO DEINTERLACE BEFORE COMPRESSION. Most popular codecs (incl. h.264 & MPEG2) fully support Interlaced encoding. With those, a full end-to-end interlaced stream chain will probably maintain a better quality overall than one where deinterlacing has been applied. That should only occur when one MUST deinterlace because of the medium (youtube, for example).

    Scott
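    A minimal sketch of such an end-to-end interlaced encode with ffmpeg/libx264 (field order and filenames are assumptions - match them to the actual source):

        import subprocess

        # Encode an interlaced source as interlaced H.264 instead of deinterlacing it.
        # +ildct/+ilme enable interlaced DCT and motion estimation; tff=1 marks top field first.
        subprocess.run([
            "ffmpeg", "-i", "camera_1080i.mts",
            "-vf", "setfield=tff",
            "-c:v", "libx264", "-flags", "+ildct+ilme", "-x264opts", "tff=1",
            "-crf", "18",
            "-c:a", "copy",
            "interlaced_out.mkv",
        ], check=True)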
  27. 2Bdecided
    Originally Posted by jagabo View Post
    The only benefit of interlacing is reduced bandwidth. All HD should have been progressive from the start. It was shortsighted for the industry to adopt interlaced modes for HD broadcast and recording. Interlaced HD made sense in Japan's early analog HD system, but not in the USA, where HD broadcast was MPEG-2 compressed from day one.
    Using interlacing improves the efficiency of MPEG-2. At a given (reasonable) bitrate, at a given resolution, interlaced footage with 60 fields per second looks better than progressive footage with 60 frames per second when encoded to MPEG-2.

    4K will not be interlaced.
    HEVC does not support interlacing properly (there are no dedicated coding tools for it - just separately coded fields).
  28. 2Bdecided
    Originally Posted by Cornucopia View Post
    Finally, and very importantly, you DO NOT HAVE TO DEINTERLACE BEFORE COMPRESSION. Most popular codecs (incl. h.264 & MPEG2) fully support Interlaced encoding. With those, a full end-to-end interlaced stream chain will probably maintain a better quality overall than one where deinterlacing has been applied. That should only occur when one MUST deinterlace because of the medium (youtube, for example).
    ...or where the player has no deinterlacing, or poor deinterlacing, or deinterlacing that's switched off by default and the intended audience may not know how to switch it on.

    You're safe with DVDs, and also when using MPEG-2 software players designed for DVDs to play back MPEG-2 files. However, with anything else on a PC you're taking pot luck, especially when playing back H.264. IME - YMMV!

    Cheers,
    David.
  29. Originally Posted by 2Bdecided View Post
    Using interlacing improves the efficiency of MPEG-2. At a given (reasonable) bitrate, at a given resolution, interlaced footage with 60 fields per second looks better than progressive footage with 60 frames per second when encoded to MPEG-2.
    Oh... I've read that interlace hurts the compressibility of MPEG-2 by approximately 10-30% - which is quite easy to explain, as interlace is itself a way of compressing video signal bandwidth, based on the characteristics of the human visual system, so MPEG-2 encoding is already operating on a compressed signal.
    http://www.stephanepigeon.com/Docs/pg.pdf

    Even this paper favors the progressive chain over the interlaced one in some cases (mostly at medium and high bitrates)...
    http://www.ics.ele.tue.nl/~dehaan/pdf/44_ICCE99_coding.pdf
  30. 2Bdecided
    pandy,

    There are no real eyeballs in that first paper - it's all PSNR with certain deinterlacers.

    In the second paper, the bitrate isn't mentioned.


    "Everyone says" interlacing will hurt compression. With H.264, depending on the encoder and the deinterlacer, you can demonstrate that this is true sometimes. There are plenty of EBU papers showing that.

    However, at real-world broadcast bitrates, you'd be struggling to deliver 50p using MPEG-2 and have something watchable with complex content. TBH it's not always watchable with 50i either.

    MPEG-2 just isn't smart enough to render interlacing a redundant piece of compression technology.

    Cheers,
    David.


