I'm looking at the specifications of full HD AVCHD consumer camcorders, and I can't understand why they all have interlaced modes, which seem useless to me since every LCD display is progressive.
Some don't have any progressive modes at all, only interlaced full HD, HD and SD modes. With such camcorders I have to deinterlace the footage after editing and before compression. For me these camcorders make no sense: why shoot interlaced and then deinterlace to progressive, instead of shooting progressive to begin with?
Some have 1080p and 720p, but only at 24 or 25 fps. I was expecting at least 30 fps.
-
Most of them don't have full 1080p50 or 1080p60 because there are no broadcast standards for those formats. Broadcast standards are 1080i25 and 1080i30. People who have camcorders with 1080p50 and 1080p60 support often have trouble dealing with the video. Blu-ray doesn't support those formats, for example. This forum is full of threads regarding how they get their video on Blu-ray discs (convert to 720p50/60 or 1080i25/30).
-
They are progressive - they only produce a stream that can be called "fake interlace" to remain compliant with the BD video spec.
More or less all consumer camcorders will provide 1080p30 video (it may be 1440x1080p30); some can provide only 720p, but then it should be 720p60.
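A minimal sketch of why "fake interlace" costs nothing for progressive sources: if the two fields come from the same progressive frame, weaving them back together reconstructs that frame exactly, with no deinterlacing artifacts. Frames are modeled here as simple lists of scanlines; the function names are illustrative, not from any real library.

```python
def split_fields(frame):
    """Split a progressive frame into (top_field, bottom_field):
    even and odd scanlines respectively."""
    return frame[0::2], frame[1::2]

def weave(top, bottom):
    """Interleave two fields back into one full frame."""
    frame = []
    for t, b in zip(top, bottom):
        frame.append(t)
        frame.append(b)
    return frame

# Tiny 6-line "frame" standing in for 1080 scanlines
frame = [f"line{i}" for i in range(6)]
top, bottom = split_fields(frame)
assert weave(top, bottom) == frame  # perfect reconstruction, nothing lost
```

Because both fields sample the same instant in time, a player that simply weaves gets the original progressive frame back, which is exactly what these camcorders rely on.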
-
Last edited by jagabo; 14th Dec 2012 at 10:18.
-
Well, we have to tell them then
Media players will play it, I bet; the chipsets can do that now, unless the firmware is screwed up. There is this notion that "my media player did not handle the original 50/60p transport stream, so it will not handle 50/60p in general." Encode it to an easier profile and wrap it in MP4, or better MKV, and it might work. For BD we have to encode anyway, and for somebody insisting on using BD there is AVCHD 2.0.
Computer playback, yes - a somewhat beefier PC is needed to play back the original.
Why only 4K? Go Super Hi-Vision, 8K (7680x4320) http://en.wikipedia.org/wiki/Ultra-high-definition_television. And any gamer will tell you 60 Hz isn't good enough. We need 120 Hz or more.
-
Just no interlace. Broadcast and the home sphere have little connection now; that consensus from the past seems to be gone, yet most people think otherwise. You can chop progressive apart to make it interlaced, but to go the other way some magic is needed - and they don't tell you that to see better performance (better magic) you have to pay more for it...
-
There are no interlaced sensors (CCD and CMOS are progressive). Previously TV also used photosensitive tubes (vidicons and others - https://en.wikipedia.org/wiki/Video_camera_tube ), and those tubes could scan the image natively in an interlaced way; nowadays this is not done (OK, in theory you could emulate interlaced readout, but there is no point).
Interlace is usually created virtually, by syntax manipulation. For some older cams it was present due to the fact that they (Sony?) used reduced horizontal resolution: they could capture anamorphic 1440x1080p60, which was then converted to anamorphic 1440x1080i30.
UHD, especially Super Hi-Vision, looks OK to me. 60 Hz is OK, but the difference between 100 Hz and 60 Hz can easily be seen (the threshold for the human visual system is usually around 80-86 Hz), so we can perceive refresh rates higher than 60 Hz.
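The 1080p60-to-1080i30 conversion mentioned above can be sketched like this: each interlaced frame takes its top field from one progressive frame and its bottom field from the next, so the two fields sample motion 1/60 s apart (true interlace, unlike the "fake interlace" case). Field order and parity conventions vary between formats; this sketch assumes top-field-first and is illustrative only.

```python
def p60_to_i30(frames):
    """Weave pairs of progressive frames into interlaced frames.

    Each output frame takes even scanlines from frame 2k (top field)
    and odd scanlines from frame 2k+1 (bottom field), halving the
    frame rate while keeping 60 motion samples per second.
    """
    out = []
    for k in range(0, len(frames) - 1, 2):
        top, nxt = frames[k], frames[k + 1]
        woven = [top[y] if y % 2 == 0 else nxt[y] for y in range(len(top))]
        out.append(woven)
    return out

# 4 progressive "frames" of 4 scanlines each, each line tagged (time, line)
frames = [[(t, y) for y in range(4)] for t in range(4)]
i30 = p60_to_i30(frames)
assert len(i30) == 2  # frame rate halved: 60p in, 30i out
assert i30[0] == [(0, 0), (1, 1), (0, 2), (1, 3)]  # fields from different instants
```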
Last edited by sanlyn; 25th Mar 2014 at 06:26.
-
Nevertheless, not everyone uses LCD's. The official percentage of CRT's is low, I realize, but my folks still have 32", 24" and 37" CRT's (one is HD), my brother-in-law in NY, several members of our local movie club, a number of my PC customers, a great many friends and relatives in 4 states -- excuse me, 6 states -- every hotel and B&B I stayed in when I went to the UK, three rest homes I've visited in my home area, the Broadcasting Museum in NY City, a pro video processing lab in Corona, NY that I toured that works in both SD and HD, and on and on. And I still have a 24" I use often. From what I've seen, I think 0.1% is a little low. My own home has the CRT, a plasma, and one LCD. So in my house the percentage is 66.6% non-LCD.
No, not all of those are HD. Not everyone cares that they aren't. And, yes, I know what you mean.
-
Your argument is stupid. There are still people who use horses for transportation too. So there should be no automobiles? Should all our roads be designed to accommodate horse drawn buggies?
The only benefit of interlacing is reduced bandwidth. All HD should have been progressive from the start. It was shortsighted for the industry to adopt interlaced modes for HD broadcast and recording. Interlaced HD made sense in Japan's early analog HD system, but not in the USA, where HD broadcast was MPEG-2 compressed from day one. Once that was decided, it became necessary for every TV to have a computer and frame buffer inside it. Once you have that, it's much easier to convert progressive to interlaced than the other way around. Frame rate conversion also becomes trivial.
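The asymmetry described above can be made concrete: going interlaced-to-progressive means half of each frame's scanlines are simply missing and must be synthesized. Here is a minimal "bob" deinterlacer sketch that fills the gaps by duplicating lines; real deinterlacers interpolate or motion-compensate instead, which is the "magic" that costs money, but every one of them must invent the missing lines somehow. The function name is illustrative.

```python
def bob_deinterlace(field, parity):
    """Expand one field to a full-height frame by duplicating lines.

    parity 0 = top field (occupies lines 0, 2, 4, ...),
    parity 1 = bottom field (lines 1, 3, 5, ...).
    """
    height = 2 * len(field)
    frame = [None] * height
    for i, line in enumerate(field):
        frame[2 * i + parity] = line          # the lines we actually have
    for y in range(height):
        if frame[y] is None:                  # the lines we must invent
            src = y - 1 if y > 0 else y + 1
            frame[y] = frame[src]
    return frame

top_field = ["A", "B", "C"]                   # lines 0, 2, 4 of the frame
print(bob_deinterlace(top_field, 0))          # ['A', 'A', 'B', 'B', 'C', 'C']
```

Compare that with the progressive-to-interlaced direction, which is just discarding alternate lines (`frame[0::2]` or `frame[1::2]`) - no reconstruction, no guesswork.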
-
I'd like to say that having a CRT has basically nothing to do with choosing an interlaced or progressive HD camcorder. It does not matter.
I can use an interlaced SD mode on a particular camcorder model, but then I do not need an HD camcorder - no HD is involved.
I understand all that, jagabo. But a lotta people still own, use, and want CRT's. Yep, I think most new material looks fantastic on my HDTV, I'm in favor of all the advancements (except LCD, which has to be someone's idea of a video joke), and I'm with you on all the tech issues, but ...a large part of the planet hasn't gone with it. Yet.
-
How does interlaced or progressive broadcast forbid me from having a CRT? I receive digital HD from an antenna, have a converter box, and watch it on a CRT. Some football is 720p and it is just fine.
-
Yes, I know. Looks fine on all the good-quality CRT's I see, as well. Eventually the conventions of old broadcast standards will pass on and the world will be at peace again.
-
Overlooked in this conversation were bandwidth limitations. Even my camcorder from 2-3 years ago records 1080i60 @ 25 Mbps and has H.264 artifacts. 1080p60 means you're compressing 2x as much information in the same 25 Mbps bandwidth. Now obviously a lot of that will be semi-redundant information, so it doesn't exactly scale 2:1, but the point is that several years ago the recording bandwidth + compression engines on the chips couldn't handle 1080p60 without producing something worthy of YouTube instead of a Blu-ray. That's why the cameras were 1080p24 or maybe 1080p30. IMHO both of those formats are terrible for capturing "real life" because of stuttering, which leaves us 1080i60 back then and 1080p60 now.
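The back-of-envelope arithmetic behind that point: 60 full frames per second carry exactly twice the raw pixel rate of 60 half-height fields per second, so at the same 25 Mbps the encoder has to compress twice as hard. The numbers below assume 8-bit 4:2:0 sampling (12 bits per pixel) and are illustrative only.

```python
BITS_PER_PIXEL = 12  # 8-bit 4:2:0: 8 luma + 4 chroma bits per pixel on average

def raw_mbps(width, height, rate, interlaced=False):
    """Raw (uncompressed) bitrate in Mbps; interlaced sources
    deliver half-height fields instead of full frames."""
    lines = height // 2 if interlaced else height
    return width * lines * rate * BITS_PER_PIXEL / 1e6

p60 = raw_mbps(1920, 1080, 60)                   # 60 full frames/s
i60 = raw_mbps(1920, 1080, 60, interlaced=True)  # 60 fields/s
assert p60 == 2 * i60
print(f"1080p60 raw: {p60:.0f} Mbps, 1080i60 raw: {i60:.0f} Mbps")
print(f"compression needed at 25 Mbps: {p60/25:.0f}:1 vs {i60/25:.0f}:1")
```

Roughly 1493 Mbps vs 746 Mbps raw, i.e. about 60:1 vs 30:1 compression into a 25 Mbps recording budget, which is why the early chips took the easier half of that problem.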
-
AVCHD 1.0 does NOT support 1080p60/50, only AVCHD 2.0 does, and machines compatible with that spec are still rare.
Cameras have interlaced mode for bandwidth budgeting and historical/legacy reasons, plain & simple. But hell, while it isn't optimal, well-done Interlaced isn't THAT bad. Personally, I find it easier to watch than 1080p30 - I need my motion to be finer than low-framerate-progressive can give me.
You are wrong about 720p, though. Almost all cameras that do 720p do so at the full 50/60FPS (though some have option for doing film-style 24p). And if you have a camcorder made for Romania, you SHOULD be using 25/50, not 30/60 formats. It's not surprising you wouldn't find many 30p cameras where you are...
Finally, and very importantly, you DO NOT HAVE TO DEINTERLACE BEFORE COMPRESSION. Most popular codecs (incl. h.264 & MPEG2) fully support Interlaced encoding. With those, a full end-to-end interlaced stream chain will probably maintain a better quality overall than one where deinterlacing has been applied. That should only occur when one MUST deinterlace because of the medium (youtube, for example).
Scott
Using interlacing improves the efficiency of MPEG-2. At a given (reasonable) bitrate, at a given resolution, interlaced footage with 60 fields per second looks better than progressive footage with 60 frames per second when encoded to MPEG-2.
4k will not be interlaced.
HEVC does not support interlacing properly (there is no dedicated coding tool for it - just separate field coding).
...or where the player has no deinterlacing, or poor deinterlacing, or deinterlacing that's switched off by default and the intended audience may not know how to switch it on.
You're safe with DVDs, and also when using MPEG-2 software players designed for DVDs to play back MPEG-2 files. However, anything else on a PC, and you're taking pot luck. Especially playing back H.264. IME - YMMV!
Cheers,
David.
Oh... I've read that interlace hurts the compressibility of MPEG-2 by approximately 10-30% - which is quite easy to explain: interlace is itself a way of compressing the video signal bandwidth, based on the capabilities of the human visual system, so MPEG-2 encoding already operates on a compressed signal.
http://www.stephanepigeon.com/Docs/pg.pdf
Even this paper favors the progressive chain over the interlaced one in some cases (mostly at medium and high bitrates)...
http://www.ics.ele.tue.nl/~dehaan/pdf/44_ICCE99_coding.pdf -
pandy,
There are no real eyeballs in that first paper - it's all PSNR with certain deinterlacers.
In the second paper, the bitrate isn't mentioned.
"Everyone says" interlacing will hurt compression. With H.264, depending on the encoder and the deinterlacer, you can demonstrate that this is true sometimes. There are plenty of EBU papers showing that.
However, at real world broadcast bitrates, you'd be struggling to deliver 50p using MPEG-2 and have something watchable with complex content. TBH it's not always watchable with 50i either.
MPEG-2 just isn't smart enough to render interlacing redundant as a piece of compression technology.
Cheers,
David.