No serious broadcaster would cover sports at 30 fps. Remember, 1080i has 60 field-per-second motion increments, as does 720p at 60 fps (59.94 actually).
ABC, ESPN and FOX Sports always use 59.94p cameras for HD. DVCPRO HD, HDCAM and XDCAM HD camcorders are all switchable between 1080i and 720p.
Typical truck for ABC or ESPN is one of the NEP 'Supershooters' with Thomson LDK 6000 cameras, all switchable between 1080i and 720p. They can also do 1080p at 23.976, 24, 25 or 29.97 fps.
Originally Posted by edDV
Originally Posted by jagabo
Originally Posted by yahoobuckaroo
Originally Posted by olyteddy
No, I'm not fooling myself. I'm saying that a 480i source looks better on my VGA monitor than on a TV. I'm sure a lot of the reason for that is the finer dot pitch of the monitor. I'm also saying that I can see flicker on the VGA at 60 Hz and not at 85 Hz. I just slowed the refresh down to 60 to verify that. I am aware that the source is 480i and don't expect 80-character-per-line text, but interpolated and de-interlaced 480i, to me, is an improvement. As a former video engineer (public access TV, but TV just the same) I have, at times, worked with high-end NTSC monitors at close distances and this is easier to look at. No fooling.
yahoobuckaroo wrote: That would be fine for video. But, while I can't recall the details anymore, I remember reading once upon a time about a test done in the 1960's by one of the big motion picture studios, where they shot a short dramatic movie on either 65 or 70mm film at a very high frame rate (I'm thinking this might have been Showscan, which used 70mm at 60 fps), and people actually walked out on it. They hated the way it looked. It was too realistic and very much like video. It would be great for sports though.
There comes a point where things just don't look any better with film just like with audio. We were already past the range of human hearing with 24 tracks on 2-inch tape in the old days. Putting 24 tracks on 4-inch tape wouldn't have sounded any better to anyone but dogs. I don't know where the point of diminishing returns is with video or film, but I wouldn't think that 72fps would look much better to anyone but Superman.
Have you got a camera that will take 4:4:4 color photos? Try bringing a snapshot through Photoshop or whatever, and saving a copy as a 4:1:1 picture. I've never met anyone that could tell the difference.
"Then why all the rush to interpolated 120Hz (100Hz PAL) for home movie display?"
So they can sell us more stuff I suppose.
"Live big screen video using hand held cameras would be inadequate at 48p."
I can't imagine why. Man that's a very fast frame rate. Nearly everything we see is shot at either 24p or 60i/30p in the USA, handheld or otherwise. It looks great. Maybe I'm just not thinking straight today, but I can't think of a reason that a faster frame rate would make handheld footage look better unless you were panning the shot and from very close range. Like if you were watching a car come flying by ten feet in front of you. But how often do we see something like that? Generally, if we're watching race cars, the cam is gonna be pretty far back. About the only time we get in close is with humans like in a football game, and then only with slow-mo close-ups for a catch or something. Humans move so slow that I can't see why anything faster than 30p would help. But like I say, maybe I'm just not thinking of something obvious.
"Today 72p video would take excessive bitrate..."
A Thomson Viper could handle the kind of datarate you're talking about. It can create datarates two or three times higher than even film, but good luck finding a storage device capable of keeping up.
"Is that why they all walk out on 60 Hz HDTV? Oh wait, they don't."
What does the 60 Hz playback refresh have to do with the recorded frame rate, which could be anything? And what dramatic movie did you ever see that was recorded at over 30 fps?
"I thought the audience walked out on the high film frame rate because it was too realistic. Now you're saying they can't tell the difference? Must have been an audience of Supermen, eh?"
A little common sense please. Going from 24fps to 30 is going to make film look a lot like video. Going beyond 30fps isn't going to make it look any better or worse. If you had actually ever seen anything recorded over 30fps then you would know it doesn't look very different from 30fps. Again, it's called diminishing returns....
"The point where you can't tell the difference anymore is somewhere over 72 fps."
No, it's 30fps. Not that you've ever seen anything over that anyway unless it was a sporting event, and even then it was probably just the slo-mo cams.
"Personally, I hate the jerky 24 fps film look. I'd take 60 or 72 fps any day."
Oh please... anything over 24fps simply doesn't look like film anymore. It looks like video. It's like saying you think Masterpiece Theatre is better looking than How Green Was My Valley. The slower speed of film is the biggest part of what puts you in another world. Only at 24fps can you believe in fantasy or the old romances. 24 fps makes you believe those other worlds exist. Anything 30fps and over eliminates that and turns your movie into the David Letterman Show.
Some of you are about two steps away from believing in Space Aliens secretly ruling the Earth.
I think you're fooling yourself. People do it all the time. I'm an old audio guy and a computer repair tech, and it's amazing the claims I've heard people make that I knew didn't have a word of truth in them. A great example: Several years ago when we first started using computers for DAW units, I belonged to a few DAW user groups. I bet I read a hundred messages during a 2 or 3 year period where people (and often so-called audio pros who owned large recording studios) talked up a particular recording app because they claimed it had a better "wave engine". It was all I could do not to laugh out loud. A wave file is a wave file. They are all recorded the exact same no matter what app you're using. Now some may do some things differently in manipulating that audio afterwards, as in dithering with slightly different algorithms, but in general even the post manipulation isn't different enough to be audible.
MOST people are absolutely convinced they can see and hear things they absolutely cannot.
Another example is 24-bit audio. I used to hear people claim all the time that they could hear a difference between 16 and 24-bit audio. But there's simply no difference to be heard between 16-bit with oversampling and 24-bit. Both will record to the extremes of human hearing and beyond. 24-bit offers a slightly lower noise floor, but we were already so close to dead quiet at 16-bit that the difference is generally not enough to hear most of the time. You get more headroom at 24-bit, but unless you're one of the few people recording a symphony orchestra with extremely dynamic passages, you won't benefit from it, and I'm not even sure that they do, given that I've got some old 16-bit Telarc recordings from the 80's with extreme volume dynamics that would compare with anything recorded at 24-bit today. The only thing 24-bit has going for it is that you can make several more destructive changes to the file before there's any noticeable degradation; however, I can generally make at least 6 changes to a 16-bit file without any degradation already. How many more do I really need? People are starting to wise up to the fact that we recorded with 16-bit ADAT workstations just fine for over a decade, before people at the manufacturing companies realized they weren't able to sell many more of these units after everybody already had them. Then suddenly they decided we needed more bits, and they could sell new workstations/recorders/software to the same old customers if they just told them it was something new and better....
BTW, we all know that humans can hear roughly from 20 Hz to 20 kHz. What you probably don't know is that this is true only of children. By the time you're 18 you can't hear much of anything past 16 kHz, and it only gets worse from there. But people have great imaginations as to what they think they can hear...and see.
"Flicker is noticed more with computer displays due to the white sheet of paper model. Flat white areas appear to flicker more than dark."
I don't see it so much, but I can feel it, sort of. That is, since I've gone to an 85 Hz refresh rate on my box, I have fewer headaches.
And that has what to do with the frame rate?
And what does a 60 "hertz" refresh have to do with 60 frames per second? It has nothing to do with the frame rate something was recorded at.
That's what they broadcast at--not what they run their cams at except for slo-mo. Moron....
You can think that if you want to, but you're generally watching 60i not 60p, and if it is progressive frames in a sporting event then it's generally gonna be 30p--not 60p except for slow-mo. Maybe 60p for every sporting event will happen eventually. You won't know the difference except during slow-mo playback unless you're Superman. I'm only Batman.
You know, as long as the slow-mo cams are shooting 60p, then you can claim to broadcast 60p and no one will think anything of it. Quite a sleight of hand.
That whole "Showscan 60fps test" business...
Did you even do a Google search to find out more about what that was about? A quick search easily gave me this info, which reads with a WHOLE lot more authority than your rumours:
"Another movie he directed, Brainstorm, was originally designed to highlight a projection technology he was promoting. The movie was about recording experiences; in the film, the 'recorded experiences' were going to be played at 48 frames per second (as opposed to the standard 24 frames per second). His feeling was that audiences had gotten used to the look of standard film; people were subliminally aware of the 24 frame per second flicker. By projecting the film at twice the standard rate, the flicker was harder to perceive, and the film would be given that much more immediacy. However, it was impossible to convince the theater owners to install the special projectors to show the 'recorded experience' sequences; the movie was released as a conventional film. Fortunately, the story stood up well on its own.
Partly correct: Showscan runs at 60fps--when he was doing test work for 2001 (I believe it was on the spacewalk footage, but it's been years since I read up on the process and things have blurred for me) he accidentally ran a sequence at 60fps and the uniform reaction in the booth was shock that he'd gotten a 3D effect with a conventional camera. He did dream up Brainstorm to use the process, but the studio he'd been working with didn't want to install the only projector on the market that would run 60fps for an extended period without breaking for just one film and he ended up taking the story idea with him to MGM--as I remember it he had to do a sizable payout to get it from his previous studio. Even then MGM wouldn't go for it. (Compare and contrast with the UA chain which installed a highly sophisticated 3D system in a number of their theaters [including Seattle] and ended up offering a couple of million dollars as a bounty to get someone, ANYONE to shoot a feature using it.)
There were several Showscan shorts shot (the one at Expo 86, the one on magic tricks starring Christopher Lee) and it looked like he'd finally get them in widespread distribution when he signed a deal with a major restaurant chain to add a Showscan theater to each of their restaurants--but the chain was Chuck E. Cheese and the deal was about two months before they imploded.
Of all the reviews I read about the process the one that stuck with me was the one in Cinefantastique that said makeup technology was going to have to improve greatly if the process was going to catch on--you could apparently see every pore on Christopher Lee's face."
4:4:4 vs 4:2:2 vs 4:1:1 etc...
Do you WORK in this industry? Yes, to consumers at the end there's not a WHOLE lot of difference, but if you deal with this kind of footage on a regular basis, you CAN tell the difference. Why do you think they have popular 4:2:2 and 4:4:4 recording formats and post houses dedicated to color correction? And I guarantee you nearly anybody can tell when 4:1:1 footage has been used in greenscreen compositing, as opposed to 4:2:2/4:4:4 material.
High FPS-recorded footage...
The GREAT MAJORITY of people can and do appreciate (---given all else being equal, especially money---) BETTER QUALITY. Whether it's Hi-Def vs. Std. Def, HDR vs. LDR, Uncompressed vs. Compressed, Hi FPS vs. Std. FPS, Stereo3D vs. 2D, Surround vs. Stereo vs. Mono, Hi Samplerate vs. Std. Samplerate, Hi Bitdepth vs. Std. Bitdepth, etc. It's usually just that people don't get a good, objective A/B comparison put in front of them, so there's no point of reference.
You don't understand why Hi FPS makes sense? Well, have you ever heard of motion aliasing and the wagon-wheel effect? Obviously, Std. FPS is still LOW in the global scheme of all things in motion.
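The wagon-wheel effect mentioned here is just temporal aliasing, and it's easy to sketch numerically. A minimal illustration (the function and the 8-spoke wheel are my own hypothetical example, not from any poster): a wheel's spokes advance some fraction of the spoke spacing each frame, and the eye perceives the wrapped remainder, which can be zero (wheel looks frozen) or negative (wheel looks like it's spinning backward):

```python
def apparent_speed(true_rps, fps, spokes=8):
    """Perceived rotation rate of a spoked wheel sampled at `fps` frames/sec."""
    # Fraction of one spoke spacing the wheel advances per frame.
    step = (true_rps * spokes / fps) % 1.0
    # Wrap to [-0.5, 0.5): aliasing can make the wheel appear to run backward.
    if step >= 0.5:
        step -= 1.0
    return step * fps / spokes  # apparent rotations per second

# At 24 fps, an 8-spoke wheel turning exactly 3 rev/s looks frozen,
# and one turning slightly slower (2.9 rev/s) appears to spin backward.
print(apparent_speed(3.0, 24), apparent_speed(2.9, 24))
```

Raising the capture frame rate pushes these alias artifacts to ever-faster real motions, which is the poster's argument for high-FPS acquisition.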
I know every time I watch the Indy 500, there are cameras right in there, and it sure would be great to have LESS motion blur and judder artifacting. I want to be able to feel like I'm the one actually at the wheel... Hi FPS recording/playback would do that.
Thomson Viper, data rates, etc...
Good camera. Why is it not in everyday use?----EXPENSIVE!!
Hard to find data drives that can keep up?---Not me, I use RAID systems.
60FPS (not Hz), 72FPS rates on cameras...
WRONG!!! They DO record at 60 or 72 fps, etc. If they expect to do slo-mos, yes, they will slow them down to slower rates, but HDTV DOES present at 60p for 1280x720 material. And if you really needed something faster, there's the RED camera, which can do 120 fps, and industrial FireWire cameras which (once set to lower rez or bitdepth) can do up to ~480 fps. You're the one getting record/playback/refresh rates mixed up...
Going 24FPS to 30FPS or beyond...
Going 24FPS to 30FPS doesn't make a huge difference. We notice a difference because video REALLY runs at 59.94i or 50i, which is (albeit interlaced, with lower resolution) over 2x the rate. Big change = quite noticeable (that's also the simple reason why ALL film projectors will double-flash the image = 48FPS = better).
MANY psycho-physiological tests have been done for MANY years that show the eye/brain's persistence of vision and where flicker and motion begin to look smooth, which is ~54 FPS and then finally gets close to imperceptible (at least as far as humans are concerned) at ~96-100FPS. Diminishing returns may enter into it after ~120-144FPS or so, but not before.
Dramatic movies recorded over 30FPS...
The obstacle here isn't artistic or technological, it's pure economics for the studios and theatres. (And yes, I have seen faster recorded/run movies at specialty venues and they look AMAZING! FLUID!!!)
What "looks like film vs. looks like video"...
There are SOOOO many variables here that affect the "look" of a piece toward either film or video stylings...
Dynamic Range/Latitude and particular Gamma curves
Telecine and its accompanying motion judder
Lighting (see the 1st item in this list)
Color subsampling or NOT
Resolution and/or "Grain"
Differences in Depth of Field
Whoopie, you chose one, which can easily be tweaked/cheated (see 24p HVX-200, or Todd-AO). Learn how the others contribute just as much to this value judgement...
Audio guy, computer repair tech, DAW claims...
Gee, I'm an old audio guy and computer repair tech too. And I understand what they were actually talking about in those "quality" claims. For example, many DAWs did (AND STILL DO) claim less sample clock jitter (because of hardware sync), many claim DSP calculation with 24/32/48/64-bit integer and 32/64-bit floating point mix engines, and they noticeably present a better quality output (easily A/B'd), often devoid of 16-bit's "grittiness" or "granularity". Sometimes it (16-bit) is less noticeable and only presents itself over a longer period of time as a form of listener fatigue. Also, keeping things in the 24-bit or greater domain greatly helps in avoiding noise and distortion buildup when combining multiple tracks (as in 32-96). Maybe you don't work with as many, but I do, and it's noticeable otherwise. Much, much better reverb tails as well.
Same thing with Hi Samplerate audio. Big difference if you work with audio a lot. Especially concert/symphonic/organ material and sound effects. Sound effects, I say? Yes, Hi sample rates have much better 3D imaging for surround/binaural effects. These things are easy enough to show in a double-blind test, with statistical regularity (which backs up all empirical and mathematical evidence). Yours is the kind of naysaying that would have kept oversampling (which you yourself advocate) from being advanced. BTW, OVERSAMPLING refers to keeping the SNR good and improving the linearity of the frequency response, not to bitdepth (aka dynamic range). You're thinking of Dithering (which is good also, and only the better DAWs had it at first; now almost all have jumped on the bandwagon of its benefits).
All those changes (even dithering, oversampling) are in the purest sense Destructive, and mixing and DSP all add to that. 6 max, hahahahaha! I do more than that (processing) for even pedestrian radio spots...
Telarc recordings...
I've got a number of those as well, and yes they sound good, but I can tell that their A-to-D, which was state-of-the-art then, is not so hot. You can hear that "grittiness" that I was referring to earlier. Other analog recordings sound better. Why? Because they're analog and thus have fundamentally "limitless" or "infinite" bit depth and frequency response (although the equipment and its inherent noise precludes them from having such capability functionally, or practically speaking).
20-20kHz capability in only children...
Where did you dream that one up? I, as an audio engineer, get my hearing tested every ~7 years, and up until I was 39-40 I had a range that went up to 17 kHz in my left ear and 19 kHz in my right ear. And most people under 25 have just as good hearing (unless they listen to their music too loud--BOOM! BOOM! BOOM!).
You can even test your own over the internet; there is a site with checkboxes for doing nearly the same thing audiologists do with tones. Boy, you should see some 20-somethings cringe when I turn on the loud "ultrasonic" tone--you'd think they were about to PUKE! Yet now I can no longer hear it, except as a "presence".
I think you'd better remove the LOG in your own eyes before you remark about the speck in others' eyes...
Originally Posted by yahoobuckaroo
I will, however, say that framerate increases and standardization are needed. I'm sick of watching the "better quality" HD channel that has a jerky version of the same show that looks fine on my "obsolete" non-HD TV.
Cornucopia, there is truth to audio fall-off at certain frequencies due to age, and it starts at about 20 years old. It's a medical fact. I have reason to know this. (Note -- I can still hear many pitches just fine, including the ones "old people" usually cannot.) BOOM!BOOM! is not required. In many cases, all the BOOM!BOOM! does is add sounds (tinnitus), not reduce/remove them.
I'm almost 35 and still listen to my music loud! Boom Boom Boom....
Anyway, indeed we need a standard that IMO will not be flexible (the way DVB is, for example). A worldwide standard with a specific framesize, framerate and audio.
If a standard has alternatives, we end up with "choices" based not on what is better, but what is cheaper for the broadcaster!
The majority of people don't care for quality, but for quantity. That's the whole problem.
"Cheaper" is a catastrophe for all video / audio related stuff. Even compression is a catastrophe from a point and beyond.
Of course, we're never gonna see worldwide solutions. There is a huge market out there based on format conversions. Not to mention that the big ones want the markets separated, or in zones!
This topic has raised some interesting points - I'm looking to buy a new 1080P TV and have been looking at various models which claim to be able to show 24FPS movies. Trouble is - they all seem to have 50 or 100Hz refresh rates, and as this is not divisible by 24 (as I understand this problem) this means there is some sort of pulldown effect going on. You can see the effects of this with panning shots and quite easily on the end titles - you get that juddering going on, not something you want to happen after splashing out nearly £1k on a TV.
I thought I'd have to buy a 120Hz TV to get rid of this (5:5 pulldown) - but they are as rare as rocking horse shit here in the UK!
Aren't all blu-ray discs 24FPS then? Regardless of whichever region you are in?
People are saying Hz & FPS aren't linked - but surely the refresh rate of a TV needs to be a multiple of the FPS otherwise you get the judder effect? Or am I such a noob I ought to spend more hours on Google?
Originally Posted by me-eyes
Some players add "true" 1080p 24 fps output. The progressive HDTV must support direct 24 fps input. The TV will then apply various proprietary techniques to generate a display. Among the options are
- frame repeat 2x to 48 fps
- frame repeat 3x to 72 fps (see Pioneer Kuro http://www.pioneer.eu/eur/products/62/63/tech/PlasmaTV/24FPS.html )
- frame repeat 4x to 96 fps
- frame repeat 5x to 120 fps
"NTSC" BluRay models slow 24 fps to 23.976 fps. A "120 Hz" HDTV can then frame repeat 5x to 119.88 fps. Conveniently 119.88 is also 4x 29.97 and 2x 59.94 so all inputs display at 119.88 fps.
Instead of simple frame repeat, advanced displays interpolate intermediate frames through motion analysis.
Most HDTV sets refresh at a multiple of frame rate (e.g. 59.94 or 119.88Hz for "NTSC", 50 or 100Hz for "PAL")
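The rate relationships described above work out exactly if you keep the 1001 denominators instead of rounding to 23.976/29.97/59.94. A quick sketch using Python's `fractions` module (variable names are mine, for illustration):

```python
from fractions import Fraction

FILM = Fraction(24000, 1001)     # 23.976... fps ("NTSC"-slowed 24p film)
NTSC_I = Fraction(30000, 1001)   # 29.97 fps (60i frame rate)
NTSC_P = Fraction(60000, 1001)   # 59.94 fps (60p)
PANEL = Fraction(120000, 1001)   # 119.88 Hz, marketed as "120 Hz"

# One panel rate is an exact integer multiple of every common "NTSC"
# source rate, so all of them display without pulldown judder:
assert PANEL == 5 * FILM == 4 * NTSC_I == 2 * NTSC_P
print(float(PANEL))  # the exact refresh a "120 Hz" NTSC-family set uses
```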
It doesn't matter how many times you repeat each frame: 24 fps, i.e. seeing only 24 different images per second, is jerky. So even if you repeat each frame 5 times on a 120 Hz display, it will be jerky. To get smooth motion you have to motion-interpolate intermediate frames. Sometimes this works and sometimes you get visible errors and distortions.
When the repeat rate isn't a multiple of the frame rate you get judder in addition to the jerkiness. For example, 3,2 frame repetition (24 fps film to 60 Hz TV) adds another layer of judder on top of the inherent jerkiness of 24 fps film.
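The 3:2 cadence described above is easy to write out explicitly. A minimal sketch (function name mine, for illustration) showing why 24 fps film on a 60 Hz display holds alternate frames for unequal lengths of time:

```python
def pulldown_32(n_frames):
    """Map 24 fps film frame indices to 60 Hz fields using the 3:2 cadence."""
    fields = []
    for i in range(n_frames):
        # Even film frames are held for 3 fields, odd frames for 2.
        fields += [i] * (3 if i % 2 == 0 else 2)
    return fields

# Four film frames fill exactly ten fields (4/24 s = 10/60 s):
# frames are on screen for alternating 3/60 s and 2/60 s, hence judder
# on top of 24 fps's inherent jerkiness.
print(pulldown_32(4))
```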
No, 24 fps is jerky. Period.
Tobberian - American/Japanese video doesn't have that "chipmunk" problem. That only happens in places that use multiples of 25 fps, like, oh I don't know, SWEDEN
There are plenty of good reasons to dislike Americans (I'm one, so I'm allowed to say that), but speeding up the audio and video isn't one of them.
Sigh. I didn't realize there was a Page 2, hence my out of date reply to something Tobberian said a week ago. My bad. Sorry.
Originally Posted by jagabo
Originally Posted by Tobberian
Originally Posted by Tobberian
- Heavy tripods or Steadycam
- Minimal pans, and then only at mathematically calculated rates.
- Avoidance of zooms. If a zoom, dolly or truck is required, it is often computer controlled to respect 24p.
- Use of shallow depth of field lenses to keep the background blurred, thus avoiding encoding artifacts and allowing lowered bit rates.
- Action footage is shot from angles that minimize 24p motion stepping.
- Most action scenes for movies or TV series are post crafted frame by frame for encoding efficiency.
All this contributes to the "film look" but this is done for surrealistic "art" not realism.
The history of 24p is interesting. Before sound any frame rate could be used. Sync sound required a fixed frame rate. In the 1930's tests were run to determine the minimum frame and flicker rate that a cinema audience would tolerate. This was done to save on film stock costs. People stopped vomiting at around 24 fps with a projector gate that repeated frames 2x or 3x (i.e. 48 or 72 Hz refresh). Production technique minimized the worse 24p artifacts and audiences adapted to 24 fps.
When color TV started in the 1950's, most programming consisted of telecined film or live-to-air. There were no video recorders other than "kinescope" film cameras. As demands for international program distribution grew in the 1960's and much of the world went PAL, it was determined that 24p could be extended to all world markets using the 2:3 and 2:2 pulldown tricks. By then more programs were distributed on NTSC or PAL 2" Quadruplex videotape rather than film.
As a result of all this, 24p was locked in as a world production standard even though it remains the minimum acceptable frame rate for viewing. As home screens grow larger, the limitations of 24p become easier to notice, especially in amateur productions. Live production motion update remains at the 50 or 59.94 rate, while display refresh rates are moving up to 100 or 119.88 Hz for large screens.