My view is that 1080 is much better than 720, and that interlaced always sucks.
1080/60p is what should be broadcast.
Not some half backed 720 progressive or interlaced video.
And eh, you do realize that the article compares 720/30p, not 60p, with 1080/30i?
In fact, the undoubtedly brilliant mind warns against using 720/60p.
Also, the expression is "half-baked" not "half-backed".
Why 1080i30 is used for TV instead of 1080p30: https://www.youtube.com/watch?v=Es_QgmiBBMM
The vertical resolution of each field is lower, but with proper deinterlacing applied during playback, 1080i30 delivers the same 60 motion samples per second as 1080p60. Some people in this forum will be aware that 1080p30 video is less smooth at normal playback speeds. When slowed down, 1080p30's relative lack of smoothness becomes obvious even to someone like me who is not particularly sensitive to low frame rates.
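A toy illustration of why 1080i30 plays back at 60 motion samples per second: each interlaced frame carries two fields sampled at different instants, and even a crude "bob" deinterlacer (line-doubling each field) produces two full-height frames per interlaced frame. This is a minimal sketch for intuition only, not how real motion-adaptive deinterlacers are implemented:

```python
# Toy "bob" deinterlace: an interlaced frame holds two fields
# (even/odd lines) captured at different instants. Line-doubling each
# field recovers one full-height frame per field, so 30 interlaced
# frames per second yield 60 output frames per second.
def bob_deinterlace(interlaced_frame):
    top = interlaced_frame[0::2]     # even lines (field 1, earlier instant)
    bottom = interlaced_frame[1::2]  # odd lines (field 2, later instant)
    double = lambda field: [row for row in field for _ in (0, 1)]
    return [double(top), double(bottom)]  # two full-height frames

frame = [["t"], ["b"], ["t"], ["b"]]  # tiny 4-line frame, 2 lines per field
out = bob_deinterlace(frame)
print(len(out))     # 2 output frames per interlaced input frame
print(len(out[0]))  # 4 lines: the field, line-doubled to full height
```

Real players use motion-adaptive or motion-compensated methods rather than plain bob, which is why good deinterlacing looks far better than this sketch suggests.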
“I am amazed that anybody would consider launching new services based on interlace. I have spent all of my life working on conversion from interlace to progressive. Now that I have sold my successful company, I can tell you the truth: interlace to progressive does not work!” -Yves Faroudja
Thanks in part to Mr. Faroudja's excellent work in this area, 1080i29.97 works well enough to satisfy the general public, even if it does not work perfectly enough to satisfy him.
Enough people can perceive jerkiness when watching fast motion captured as progressive video at 30 frames per second that it isn't good enough for broadcasting live sports, the type of programming that generates the most advertising revenue. That is why progressive video at 30 frames per second isn't used for TV, and likely never will be. When we go through another expensive (to both broadcasters and the public) digital TV changeover, this time to AVC or HEVC, 1080p60 is more likely, although the bitrate used may not be enough to do it justice.
In the meantime, you can watch all the 1080p60 streaming video you like over the Internet. For some reason that seems to make you happy, even though it isn't all it should be either, on account of missing detail and compression artifacts that result from encoding at overly low bitrates.
Last edited by usually_quiet; 5th Apr 2015 at 14:01.
Allan Tépper, now "firmly" on the high tech HD bandwagon, supporting the big 720p!
"Blackmagic studio cameras finally support 720p!"
Man, man, thousands of consumers are already on UHD while broadcast engineers are cheering support for 720p.
480>1080=Big Deal. 1080>4K=Meh.
Anyway, there is absolutely nothing wrong with 720p video, and it will continue to be used by a lot of people, simply because it still offers great quality, even when up-scaled to 1080p when played on a full HD TV.
I shoot 1080/50p video, and I would never dream of shooting anything in 720p mode unless I only had a camera that shoots in 720p mode.
Every single 1080/50p MTS video that I edit from my cams is smart-rendered back to 1080/50p MP4, and I also output another conversion to 720/50p, using the x264 encoder in my editing software at 45% of the source file bitrate, and I can assure you that these videos turn out at very good quality.
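Incidentally, that 45% figure lines up almost exactly with the ratio of pixel counts between 720p and 1080p, which makes it a reasonable first-order rule of thumb for keeping bits-per-pixel roughly constant when downscaling. A minimal sketch (the 28 Mbps source bitrate below is an assumed example, not a value from the post):

```python
# Rule-of-thumb target bitrate when downscaling, keeping bits-per-pixel
# roughly constant. The 28 Mbps source bitrate is an assumed example.
def target_bitrate(src_bitrate_mbps, src_res, dst_res):
    src_pixels = src_res[0] * src_res[1]
    dst_pixels = dst_res[0] * dst_res[1]
    return src_bitrate_mbps * dst_pixels / src_pixels

ratio = (1280 * 720) / (1920 * 1080)   # pixel-count ratio, 720p vs 1080p
print(round(ratio, 3))                 # 0.444, close to the 45% used above
print(round(target_bitrate(28, (1920, 1080), (1280, 720)), 1))  # 12.4
```

In practice codecs scale a bit better than linearly with pixel count, so 45% of the source bitrate at 44% of the pixels leaves a small quality margin, consistent with the good results described.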
The main reason I down-convert my 1080/50p videos to 720/50p MP4 is the lack of playback support for 1080p video.
Where I live there are still many, many people with older HD TVs (such as those with a 1366x768 resolution) that cannot play 1080p content via the built-in media player (USB), so I generally give people a copy of my videos in both 1080/50p and 720/50p MP4, just in case their HD TV does not support 1080/50p.
I have two 55" full HD TVs whose built-in media players (via USB) will not play my 1080/50p video files at 50p; they play them at 25p, which makes panning scenes jerky. Rather than put up with that, I can select the 720/50p video instead, which these TVs will play in 50p mode, upscaled to 1080p resolution.
This is why I now have a Western Digital Live HDD media player connected to all four HD TVs in my home (two 55" full HD and two 42" 768p TVs), so the two full HD TVs play my 1080/50p MP4 videos in 50p mode, and the two 768p TVs handle 720/50p mode properly.
There will always be a need for lower-resolution video, just as the DVD format will never die. Most people I know still play DVDs on their HD TVs (1080p and 768p) via a DVD player, and they play back perfectly, and while these people can get DVD support, there is no way they will ever move to Blu-ray, and personally I don't blame them.
When I move to 4K video, I will shoot 4K @ 50p at the highest possible bitrate, then down-convert that video to 1080/50p MP4 so I can watch it on my current 1080p TVs in amazingly good quality, because I will not be buying a 4K TV any time soon just so I can watch my 4K video.
My focus in shooting 4K will be on those who do have 4K playback support, while keeping the ability to down-convert to 1080/50p for those who don't.
Everything is relative; there will always be a need to down-convert for various reasons, but there is nothing wrong with 720p video.
The irony is that I still get asked by some people to output my 1080/50p videos in DVD format so they can play them in their DVD player on their HD TV, and I refuse to do it, simply because it looks like crap when my software does the down-conversion to 720x576 @ 8 Mbps. In many cases these people then play the DVD back on their 768p or 1080p HD TVs, meaning the video is upscaled again, which destroys the already poor quality even further.
That is why I give them the 720/50p MP4 file on a USB stick and tell them to play it via the USB port on their HD TV.
Last edited by glenpinn; 22nd Jul 2015 at 21:04.
Last edited by newpball; 22nd Jul 2015 at 21:31.
If I shoot my video in 1080/25p mode, I will get jerky playback wherever there is a reasonable amount of fast panning, or when fast-moving objects (cars, etc.) pass close to my lens, but this is mostly resolved when I shoot in 50p mode.
Also, not all playback devices support 50p or 60p video, and that is a fact of life.
Oh, by the way, I watched your 4K/24p video from your new Panasonic G7 camera, so why would I be surprised that you made that amazing comment: "Oh I see, so now 1080/30p stutters!"
It appears that all your video was shot on a tripod, with very minimal or slow panning, so of course your video will not show any of the jerkiness you would see if you were panning fast or had fast-moving cars zipping past your camera at close range.
I dare you to put that camera on a handheld rig, go out and shoot some roaming video at 4K/24p or 30p, and bring it back in here or put it on YouTube, and let's see just how smooth it plays back. It ain't gonna be as good as 60p, that is for sure, especially if you pan faster than you did in your 4K video, or if you're shooting fast-moving action.
And for what it's worth, the 4K/24p video you shot with that camera is horrible.
Last edited by glenpinn; 22nd Jul 2015 at 21:41.
I don't know how they're going to broadcast 4K. The reason they broadcast interlaced is to cut the transmission bandwidth in half: it takes half the data to send fields and let the TV hardware do the grunt work of assembling them.
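To put rough numbers on the "half the bandwidth" point: at the uncompressed level, 1080i sends one 540-line field every 1/60 of a second, which is half the raw pixel rate of full 1080p60 frames. Actual broadcast bitrates depend on the codec, but the raw-rate comparison is the intuition; a quick sketch:

```python
# Raw (uncompressed) pixel rates, ignoring codec efficiency.
# 1080i at 60 fields/s sends 1920x540 pixels per field, which is the
# same raw pixel rate as 1080p30 and half that of 1080p60.
def pixel_rate(width, height, rate):
    return width * height * rate

p60 = pixel_rate(1920, 1080, 60)  # 1080p60: full frames, 60/s
i60 = pixel_rate(1920, 540, 60)   # 1080i: 540-line fields, 60/s
print(i60 / p60)  # 0.5 -> interlacing halves the raw pixel rate
```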
And if you prefer a Roku or other internet player, you had better have 100 Mbps fiber service to get 4K. Most people don't.
4K has a lot of downsides in the real world, good for Cinema though.
Seriously, are people really so shortsighted that they record only up to the maximum they can currently broadcast? Material is not going to be used in the future? Sheesh!
Broadcast engineers are cheering because Blackmagic can now record 720p! What the heck? They should want to record 4K even if, for now, they can only put a 720p down-convert downstream. It is comparable to the idiocy of the seventies, when "brilliant engineers" thought it was a good idea to videotape valuable programs useful for future generations instead of using film, because 'we don't need the resolution', so now we have a heritage of faded, edge-enhanced '70s rubbish.
The totally shortsighted attitude is just unfathomable!
Last edited by newpball; 22nd Jul 2015 at 22:18.
Another addition: do you understand that engineers only put together what managers and salesmen tell them to do? Do you comprehend this?
Yeah, Al's right. NeutBalls is trolling. hahaha.
I have shot lots of video in 50p mode where a car has driven past me very close while I am slowly panning, and when I play back that video I do see some mild jerkiness as the car goes past, so even 50p has its limits sometimes. But it sure as hell beats shooting my video in 25p mode, that's for sure, especially with the type of video I usually shoot, which is roaming, using one of my wonderful custom-made handheld camera rigs or my monopod.
If I were to shoot video on a tripod where I am not going to be panning, or not panning very fast, then I could get away with 24p/25p or 30p if I really had to, but my cams don't shoot 24p or 25p anyway, nor do they shoot 720/50p either. If I had to, though, I would have no hesitation shooting 720/50p on a decent quality camera.
And yes, I do shoot all my video in 50p mode, and yes, 50p mode helps eliminate the issues you get when shooting roaming video in 25p or 30p mode.
You have lost all credibility with me now, so my time with you and anything else you have to say is done; you're just wasting my time, and I dare say most of the others would agree.
Last edited by glenpinn; 23rd Jul 2015 at 00:20.
Usually cinematographers try to hide the effects of the low frame rate, but among many other examples there is an infamous panning shot in the Bruce Willis movie RED (at 4:05 apparently).
The above is newpball again demonstrating his ability to jump to a conclusion and decide it's ironclad fact when it has very little to do with reality.
Videotape was used because it was more economical/re-usable. Nothing to do with quality as such, although it was certainly good enough for broadcast in the '70s. Before videotape, programs often weren't recorded at all. They went out live to air and that was it. FFS, in the '70s there was no such thing as re-runs. Often programs were lost because tape was re-used or eventually discarded. Nobody ever expected a TV program to be shown more than once. Yet here's newpball, with all the benefit of years of hindsight, still managing to once again get it completely wrong. It's almost impressive.
I'm sure sometime in the future, when our children are watching holographic images in 3D at 128K resolution, someone will be posting in a forum complaining about the lack of foresight shown by only broadcasting in 4K. It's inevitable.
Last edited by newpball; 23rd Jul 2015 at 10:17.
Because there's still 4:3 TVs out there.
The camera doesn't record. It does however support another format that could be used for broadcasting. It's all economics. Something you don't seem to be able to comprehend. Nobody's going to broadcast or record in any newpball approved resolution with a newpball approved frame rate if it's going to cost a lot of money and without potentially increasing their profit. Nobody's going to use a format if it'll cost so much today they mightn't be broadcasting tomorrow. The progression to HD and UHD will take time for economic reasons. Get over it.
I read the article and was struck by the fact that as I was being told to browbeat myself for having used the term FullHD in the past on this forum and in other places, the sidebar banner ad was for a 4K video capture device. I suppose I should just save my money and stick with my thin raster, non-AVID, HDV camcorder.
Two things to note: 1) Companies resist spending money to upgrade hardware. It doesn't matter if 4K hardware is readily available (debatable) or if it's cheap; if the current hardware is working, they will keep it. 2) Most broadcasters are concerned with bandwidth. They would rather offer more channels with less quality than fewer channels with better quality. If you increase the broadcast resolution, you eat bandwidth (or you can upgrade all the hardware to use a better codec, and again, see item 1).
It doesn't matter what you, or I, or some survey determines is "better". It boils down to money. And they won't be in any hurry to spend money and upgrade until the hardware to support it has hit the mainstream and dominated the market. Aside from store demo models, I have yet to see a 4K display, and I don't know anyone that actually has one yet.
I don't think the OP realizes that most cameras don't capture the full detail of their maximum resolution. Ever noticed that most Blu-rays look so blurry and lowpassed at 1080p that they look like half the resolution upscaled? Exactly.
Only Blu-rays that have been mastered at 4K deliver real 1080p detail and look good. Ones mastered at 1080p look good at 720p, and so on.
It's the same BS as with high-megapixel photo cameras (unless you need that resolution for special purposes).
Same for UHD, which is only useful for theaters with big screens but completely useless for a normal home TV.
"Another 720p advantage for news networks is that they could often use an existing microwave link."
Tepper also states "...many sports and some news networks preferred (and continue to prefer) to have a very high temporal resolution (50 or 59.94 progressive frames) with less spatial resolution (less pixels). Many of them insisted that with a screen under 60" measured diagonally, test subjects couldn’t perceive any difference between 1280x720 and 1920x1080 at couch distance."
That is truly a disingenuous argument if I have ever heard one.
OTOH, movie studios are pushing for higher and higher resolutions for exactly the same reason: money. They need us, the consumer, to get off our couches and go to the movie theater. But I am not going to do so if the experience is only marginally better than my TV.
As far as your not knowing anyone that has a 4K TV, I am afraid you are straying into selection bias which is best avoided.