Sorry for my bad English. Hi there from Brazil!
OK, so I spent all this money on a 52" 1080p TV and a 2.0 Blu-ray player.
Let's leave sound out of this.
First of all, what was wrong with the PAL / NTSC standards? Resolution was not good enough, but that wasn't all.
Frame rates were an issue too: PAL with 25 fps and NTSC with 30 fps.
The current HD standard for movies is something like 1080p / 24. This means only 24 fps.
It looks like it only solves the resolution issue and does nothing about the fps issue.
With PAL and NTSC, the frames were of course smaller compared to 1080p, but they were also nowhere near as sharp as 1080p frames. Blur is the word for a PAL / NTSC frame when compared to a 1080p frame today.
With those blurry frames (compared to 1080p) on PAL and NTSC, transitions from frame to frame felt smoother even at 25 and 30 fps, because your eyes couldn't detect much difference from one frame to the next, thanks to the blur pattern of the frames. So the movie was still running at 25 or 30 fps, but in your head it felt smooth. Of course this breaks down in high-speed scenes.
Now, the rule is: assuming you don't change the fps, the sharper the frame, the more "laggy" the film will look. That is because with a very sharp picture like current 1080p, when one frame moves to the next, your eyes can tell a LOT of difference between the frames, meaning you can really feel every frame that is changing.
The idea behind frame rates is that motion only looks smooth when your eyes / brain can no longer detect the frames changing from one to another, so your head assumes something continuous is going on.
But what frame rate is enough to fool us? That varies from person to person. You need to forget the stupid idea that humans can't see more than 30 or 60 fps. I can easily see a big difference between 60 and 120 fps, for example. If you have a CRT monitor, just change the refresh rate from 60 to 120 Hz and see if you can tell the difference. I am sure you can too.
When talking about low frame rates (20 - 60 fps), the sharper the picture, the more fps you will need to make it feel continuous. Of course, past some 120 or 200 fps, your brain couldn't care less, since that is fast enough to fool you with pretty much any type of frame.
So my point is, the current HD standard of 1080p / 24 fps is a joke, because now movies look more "laggy" than NTSC and PAL movies, for two reasons:
1) 24 fps < 25 / 30 fps (I know there is some pulldown going on for NTSC movies, but forget about it for now)
2) 1080p has much sharper frames, and that makes it more "laggy" at a slow 24 fps.
I know interlaced material can look smoother, but movies are pretty much progressive, so forget about that too.
Now when I turn on my 1080p TV and Blu-ray player, I can enjoy the much better picture quality (no doubt about that), but on the other hand the movie just looks "laggy", for the reasons I mentioned before.
This is a fault in the current HD standards: the hardware can't handle 1080p at 60 fps, partly because of processing power, but more importantly because the hardware and the optical tech can't handle the bitrate needed for 1080p / 60 fps progressive if you were to keep every frame at the same quality as current 24 fps frames.
But the other part of the problem is that cinema is still filming with its stupid 24 fps cameras.
24 fps has been the cinema standard since forever, and it should have been updated by now.
Cinema really needs to start filming with 60 fps progressive cameras.
Anyway, this is just my view of it.
Current HD movies look "laggy" as hell with only 24 fps of progressive frames.
You can easily see that in fast action scenes.
I wouldn't call it HD at all. HD would be something like 1080p / 60, keeping every frame at the same quality as the 24 fps frames. But for that you would need 2.5× the bitrate, and current HD hardware can't handle it, so it is a fail.
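For what it's worth, the 2.5× figure is simple arithmetic. A minimal Python sketch (the 40 Mbit/s base figure is an assumption for illustration, not a measured Blu-ray number):

```python
# Back-of-envelope check: going from 24 fps to 60 fps while keeping
# per-frame quality constant scales the bitrate by 60/24 = 2.5x.
# The 40 Mbit/s base figure below is an assumption for illustration.

def scaled_bitrate(base_mbits, base_fps, target_fps):
    """Bitrate needed at target_fps if every frame costs the same bits."""
    return base_mbits * target_fps / base_fps

print(60 / 24)                     # 2.5
print(scaled_bitrate(40, 24, 60))  # 100.0 -- well past a ~50 Mbit/s cap
```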
Assuming cinema starts filming at 60 fps tomorrow, current HD hardware wouldn't be able to handle its content without decreasing frame quality to keep the bitrate under 50 Mbit/s or so.
This means, in that case, your $3,000 setup is useless, and if this kind of frame-rate update happens, you will need to buy something else.
That is why current HD is a fail.
Thread: Current HD standard is a FAIL
Hear, hear! I personally don't want to watch everything in that jerky Saving Private Ryan style; it just looks dumb. I'm all for technology, don't get me wrong, but what we consumers are dished out is obsolete rubbish. Blu-ray's a joke I will never touch, nor will anyone I know at work or outside of it. Same with the LCD/plasma fiasco. LED/OLED/SED/whatever looks good, but it's years away. So I'm sticking with DVD and CRT!
No, you can't see the difference in frame rates. What you can see, however, are the errors caused by the slow frame rates.
It's sort of like astronomy. We can't actually see extrasolar planets, but we can observe the effect they have on bodies that we can see (stars).
Originally Posted by lordsmurf
The fact is, current HD is a joke, and LAGGY.
The reason current HD is more "laggy" than SD is that the motion blur in HD movies is just not as effective as the motion blur in SD movies. Why? Because the HD frames are too damn sharp to begin with, so even the motion blur there is not enough.
And there's the fact that HD is 24 fps while SD is 25 / 30.
25 fps is some 4% more than 24 fps.
30 fps is some 25% more than 24 fps.
Yeah, I know about the pulldown; forget it for a moment.
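Those percentages are easy to verify; a quick Python sketch:

```python
# The fps comparisons above as plain arithmetic.

def percent_more(a, b):
    """How much bigger a is than b, in percent."""
    return (a - b) / b * 100

print(round(percent_more(25, 24), 1))  # 4.2  (PAL vs film)
print(round(percent_more(30, 24), 1))  # 25.0 (NTSC vs film)
```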
But this problem starts with cinema using 24 fps cameras, I know. Still, the current HD standard is not smart at all, and it doesn't see that in the near future we might get cinema at 48 fps or more. If that happens, your HD setup becomes a doorstop.
Now, if current HD were compliant with some 120 Mbit/s, then, my friends, a simple firmware update would make your hardware ready for the future, meaning compliant with 48 fps or even 60 fps cinema.
Why does HD fail? Because it only works up to 50 Mbit/s or so. This is the problem. Current HD fails to anticipate a possible change in cinema fps, meaning if that change comes, your setup becomes a doorstop and you will have to buy a new $3,000 setup.
Now, we all agree 24 fps is not enough (it's a joke, in fact). So isn't it reasonable to think it might change in the near future? That is the smart way to see the current situation, and current HD just doesn't account for it.
Motion blur is not applied to movies. It is inherent in the physical process and happens at the time of filming, not after the fact.
All tech is obsolete long before it hits the shelves for the consumer to buy. If you want to keep waiting, do so. Cinema isn't going to change its fundamental approach in a hurry, and there is over 100 years of history sitting in the libraries - hundreds of thousands of films that were all shot at 24 fps.
Originally Posted by guns1inger
And no, I don't want to wait. If you read my first post, I said I bought a full HD setup.
But I will not fool myself just because I own a setup like that, believing it is a very good thing. It is not. Movies are laggy as hell on Blu-ray. The frames look awesome, but the movie is laggy, so it is far from good.
You need to learn more about the fundamentals of photography, both still and motion. What you're saying here is more or less nonsense.
Faster frame rates and better cameras are needed, yes. We can agree on that.
Originally Posted by lordsmurf
Don't we all agree that HD is laggy? That 24 fps is not enough? I guess this is the big point. And really it is more cinema's fault than the current HD standard's. The problem with the current HD standard is that it won't be compliant with an fps upgrade if one happens in cinema. And I think it is very reasonable to expect that, since 24 fps is indeed a JOKE.
As I write this, I am looking at my 52" 1080p TV running The Dark Knight Blu-ray, and it is just sad how jerky the movie is. It is pretty, no doubt, but laggy as hell.
Another good way to see it is to download some 1080p trailers here.
It is unbelievable how laggy those trailers are. This is a real bad joke.
The motion blur is exactly that of the film itself. It is not better or worse on DVD, and the extra detail does not make it so. Cinema is not as smooth as we might like, but making HD compatible with standards that don't exist yet simply results in no standards, competing formats, and everybody loses.
Would I rather watch Casablanca on Blu-ray or DVD? DVD doesn't stand up any more. My Dark Knight Blu-ray is no jerkier than what I saw in the cinema, but the overall image quality is better than what most local cinemas can project (and no *******s talking in the seat behind me, either).
Surely you would have seen the same "laggy" effect (perhaps you mean jerky / unsmooth?) in a cinema, at 24 fps, with a resolution of effectively 3,000 or more lines?
Anyway, it's not the data on the disc that's inadequate, but the player not using it to create a smoother experience.
It should be possible to interpolate and make an acceptable 120 fps stream, for instance.
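As a sketch of the idea (not how any actual player implements it), here is a toy Python version that turns a 24 fps sequence into roughly a 120 fps one by inserting four blended frames between each pair. Frames are stand-in numbers; real sets use motion estimation rather than a plain crossfade:

```python
# Toy 24 -> 120 fps interpolation: insert 4 blended frames between each
# pair of source frames. Frames are stand-in numbers here; a real TV
# would estimate motion instead of crossfading pixel values.

def interpolate_stream(frames, factor=5):
    out = []
    for a, b in zip(frames, frames[1:]):
        for i in range(factor):
            t = i / factor                   # 0, 0.2, 0.4, 0.6, 0.8
            out.append(a * (1 - t) + b * t)  # blend between neighbours
    out.append(frames[-1])                   # keep the final source frame
    return out

src = [0.0, 1.0, 2.0]           # three consecutive 24 fps "frames"
print(interpolate_stream(src))  # 11 values stepping smoothly from 0 to 2
```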
I just watch my movies on a humble CRT and have no desire to "upgrade" until it's unavoidable.
I watch movies and TV shows for the plots, scripts, and acting, not to obsess over pixels.
A bigger screen makes the inherent jerkiness of 24 fps more obvious. The upscaling fuzziness of SD material does raise the jerkiness threshold (how big the motions have to be before you really start to notice the jerkiness) a bit. But this is not motion blur, it's scaling blur.
Try watching some live 720p (60 Hz) sports broadcasts. They're nice and smooth.
Is your 52" 1080p TV LCD or Plasma? What is its Make/model number?
I think what you refer to as "laggy" is often called "judder" or "stutter", or just jerky. If so, an option you might want to have a look at is a TV with motion compensated frame interpolation; most manufacturers have it as a feature on their higher end models these days, under various names. On my Samsung LN40A650, it's called AMP (Auto Motion Plus).
I was in the market for a new LCD TV a couple months ago. Unfortunately, I had to return the set I bought (which had pretty good reviews) due to excessive judder/jerkiness. It was driving me nuts, so I took it back the next day. I hadn't observed this problem on my old 2005 model 27" Westinghouse 720p LCD set. Looking at other models at the store, they all seemed to be just as juddery.
I did a little research, went to Best Buy and watched two Samsung sets side-by-side, with and without AMP, and was sold. (I later found out that the Samsung has a feature where you can use AMP on the left side of the screen, no AMP on the right (or vice-versa, I forget) to see the effect on whatever source you want to try it on).
The process interpolates additional frames to smooth out motion. Some people don't like the effect, which, depending on the AMP level and the source, can make the video look "too real", and is sometimes referred to as a soap-opera effect. Me, I kind of like it, and usually leave AMP at about medium. The newer Samsung LNxxB650 models, which replaced the model I bought in March, supposedly offer a finer level of control over the AMP, but I haven't played with one.
One artifact of the process is an occasional halo, sometimes called "vapors", around certain moving objects. You see it sometimes when an object moves in front of a highly detailed background. First time I noticed it was when a character in "Lost" was walking through the jungle, for instance. But you won't see it very often.
For me, it's definitely a net gain. Some of us may just be over-sensitive to judder. I'm afraid I'd have had to stick with the old Westinghouse if I hadn't found this. It may or may not be right for you, but it's worth checking out if you haven't already looked at it.
An informative link:
(Opens with a discussion of Casablanca on Bluray, coincidentally. A lot of good info in the comments section)
Even lower-end manufacturers are starting to include this feature. Here's a review of a new Vizio model with a nice explanation of what Vizio refers to as MEMC (motion estimation, motion compensation) Smooth Motion Effect:
Thanks very much for the info. I will look into that right now.
I believe my TV just doesn't have this feature. Maybe I should have researched a little more before I bought my TV. And yes, on this 52", the stutter of 24 fps playback is not acceptable for me. Looking into it right now, and I will post back again.
Thanks very much for your help.
I have been looking around the web, and yes, TV manufacturers are introducing features to compensate for the jerkiness of 24 fps. I guess a lot of people must have been complaining about it. Really, on a 52" TV it is unacceptable.
I don't really know how it works, but it could be something like interpolating and creating a "virtual frame" (from information in the previous / next frames) that goes in between two real frames, so the motion of objects on the screen feels smoother.
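A minimal Python sketch of that "virtual frame" idea, assuming frames are just small grids of brightness values. Real TVs track motion vectors instead of averaging pixels in place, which is why a plain average ghosts on fast motion:

```python
# Crude "virtual frame": each pixel is the average of the matching pixels
# in the previous and next real frames. Frames are small brightness grids.
# Plain averaging like this ghosts on fast motion, which is why real sets
# use motion-compensated interpolation instead.

def midpoint_frame(prev_frame, next_frame):
    return [
        [(p + n) / 2 for p, n in zip(prev_row, next_row)]
        for prev_row, next_row in zip(prev_frame, next_frame)
    ]

frame_a = [[0, 0], [10, 10]]
frame_b = [[4, 8], [10, 30]]
print(midpoint_frame(frame_a, frame_b))  # [[2.0, 4.0], [10.0, 20.0]]
```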
They have different names for it.
Sony - MotionFlow
Samsung - Auto Motion Plus 120 Hz
LG - TruMotion 240Hz
And so on...
My TV is a Samsung 52" LCD Full HD, model LN52A610A1R.
This model DOES NOT have the Auto Motion Plus 120 Hz feature.
There is a model from Samsung, the LN52A650, which is the exact same TV as mine but WITH the Auto Motion Plus 120 Hz feature.
I bought my TV this weekend, so I hope I can work something out with the store where I bought it. I hope I can exchange it; I will try, and if everything turns out OK, I will post impressions of the new TV.
Again, thanks for the help, and if you are reading this and looking for an HD TV, don't repeat my mistake: go for the TVs with features that compensate for the jerkiness of 24 fps, unless you don't mind the jerkiness.
Could it be an LCD vs Plasma thing? I have a 50" Plasma and nothing seems out of the ordinary when watching various content.
Originally Posted by Phlexor
Notice how much jerkier the 24 (top) and 30 (middle) fps sections are.
In the 1980's, special effects wizard Douglas Trumbull ("Brainstorm", "2001") brought to market a 70mm 5 perf 60 fps process called Showscan.
Described by those who saw it as the most perfect image ever projected on a movie screen, Showscan was noted for its more realistic depiction of motion. Production and exhibition companies took a pass, and Showscan, the company, liquidated in 2001.
When the early motion picture companies were standardizing the frame rate, they picked 24fps as the lowest rate (to save money) that audiences wouldn't object to. After years of hand-cranked 9 - 15 fps, 24fps must have looked pretty good.
You aren't going to move the film community off 24fps. They consider the incremental motion part of the art that gives the "film look". Shooting 24fps requires special training (film school) and special stabilized cameras.
The video community shoots 50 to 60 motion increments per second allowing natural motion with hand held cameras.
Film is accommodated in HDTV exactly the same way it is in SDTV. NTSC uses 2:3 telecine to match frame rates*. This process introduces additional "judder" due to the 232323 sequence. The 120Hz HD sets get rid of "judder" but still motion increment at 24fps.** Frame interpolation adds intermediate frames allowing pseudo 48fps 72fps 96fps or 120fps motion smoothness but at the expense of interpolation artifacts. Each generation of this technology reduces the more obvious artifacts.
* PAL speeds 24fps film 4% to 25fps. 100Hz TV sets frame repeat 4x for less flicker. Some 100Hz HDTV sets add the frame interpolation feature for 96Hz or 100Hz motion display.
** Blu-Ray players "normally" connect to an NTSC HDTV with telecined 1080i/29.97 or 720p/59.94. This connection will include the 2:3 "judder", which may or may not be processed out in the TV. Some newer HDTV sets allow Blu-Ray connection at native 24 (23.976) fps. This removes the telecine judder and provides a better foundation for frame interpolation.
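The 2:3 cadence described above can be sketched in a few lines of Python. This only shows the field repetition pattern, not a real telecine process:

```python
# 2:3 pulldown cadence: four film frames (A B C D) become ten video
# fields, which maps 24 fps film onto ~60 fields/s NTSC. The uneven
# 2-3-2-3 repetition is what produces telecine judder.

def pulldown_fields(frames):
    fields = []
    for i, frame in enumerate(frames):
        repeats = 2 if i % 2 == 0 else 3  # 2, 3, 2, 3, ...
        fields.extend([frame] * repeats)
    return fields

print(pulldown_fields(["A", "B", "C", "D"]))
# ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']
print(len(pulldown_fields(list("x" * 24))))  # 24 frames -> 60 fields
```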
Originally Posted by Squash
Yes, he was referring to that effect of skipping frames in action scenes that don't play as smoothly as they should.
LCD and CRT are totally different; there is no CRT-style 100 Hz or 120 Hz refresh going on in LCD panels.
LCD panels are progressive by nature, and when one says 100 Hz or 200 Hz it is not referring to refresh rates but rather to frame interpolation.
For example, in a soccer game, if the ball moves 10 pixels from frame 1 to frame 3, the LCD processor will insert extra frames between 1-2 and 2-3 where the ball moves 5 pixels.
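The ball example above is just linear interpolation of position; a tiny Python sketch:

```python
# The ball example as linear interpolation: a motion-compensated
# interpolator places the ball partway along its path in the new frame,
# instead of blending two ghost images of it.

def interpolated_position(pos_a, pos_b, t):
    """Position at fraction t (0..1) of the way from frame A to frame B."""
    return pos_a + (pos_b - pos_a) * t

print(interpolated_position(0, 10, 0.5))  # 5.0 -- the midpoint frame
```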
Samsung, for example, introduced Motion Plus, and there are other modes, etc., that minimize that effect.
Plasma panels don't have that problem; the response time is much faster, and high-end series like Pioneer's KURO have the most surprising image, without jerkiness and with pure blacks. Just perfect.
But the latest generation of LCDs, like LED-LCD, is getting closer to plasma. In fact, Pioneer gave up plasma manufacturing; plasma is near death, because the market shows that for every plasma sold there are 8 LCDs. Pioneer's engineers moved to Panasonic.
LCDs are getting thinner, faster, cheaper, bigger... with better response, better blacks, etc.
I heard that in the near future they will have more resolution than 1080p.
The simple reason for HD motion problems is flat panel display manufacturers completely ignoring any basic knowledge about TV technology.
It's not difficult however, and I've explained this here already: http://www.codecpage.com/index_motion.html.
The original poster is right in a central point: motion blur is making most current HD sets obsolete. An expensive flat screen simply makes no sense if it works for stills only.
The OP is talking about the inherent jerkiness of 24 fps film and how it's more obvious on a big screen.
Guys, I have good news.
I will be able to exchange my TV. The store will take it back, and I will be getting the new Samsung 52" 1080p with AMP (Auto Motion Plus 120 Hz).
I will come back later to post whether or not it solves the jerkiness problem of the stupidly low 24 fps on Blu-rays. I hope so.
On my 52" the jerkiness was intolerable, and I am pretty sure all of you would agree if you were looking at my TV with me.
I hope this AMP 120 Hz feature can solve the problem on big screens. Let's see.
jagabo, yes, I was talking about that, but also about low 24 fps combined with the very sharp, crystal-clear frames of current 1080p Blu-rays.
The fact that the image (frame) of a 1080p film on Blu-ray is SO SHARP and crystal clear makes the jerkiness worse. The sharper the picture, the more fps you need, up to a point, to fool your eyes.
With DVD, you could run close to 24 fps and kind of fool your brain that something continuous was going on, because the DVD image (frame) is kind of blurry by comparison. So DVD wasn't SO jerky.
Don't get me wrong, cinema was always 24 fps, but with Blu-ray's SHARP picture and crystal-clear frames, it looks jerkier than DVD.
I hope my new TV will solve the issue. I will post back.
Very nice read. Thanks. I agree with all that.
Originally Posted by edDV
Look, we need to stop defending 24 fps. If it were to change to 48 fps tomorrow, then a few years from now we would find the idea of playing 24 fps ridiculous, because by then we would be used to 48 fps and 24 fps would just be too jerky. So please, there is nothing special about 24 fps, and nothing "art related" about it. Cinema was born at 24 fps (actually lower than that) not because it is a magic or "art related" number, but because anything else would have been too expensive, and that was what the technology allowed at the time. There is NOTHING special about 24 fps; we are just used to it. I hate the fact that people are so afraid of change. Don't be afraid, people. Let's stand for the facts: 24 fps is just stupidly jerky on HD, and it will change FOR SURE in the future. I don't know if the future is 5 years from now, 20 years, or 100 years, but it will change for sure. Don't be afraid of it; it will be a good thing, so don't cry about it, OK?
If you were to play a game like Doom or Quake or CS at 24 fps, you would FOR SURE say it sucks. But now you watch a movie at 24 fps and defend it? 24 is not enough.
I hate 24 fps and grain; the film industry needs to get into the 21st century. Several studies were done, and they concluded that 28+ fps eliminates judder.
The artificial grain in the film "300" looks ridiculous to me, and it's annoying.
What is up with that grain in 300?
I saw it in 1080p and it looked so crazy noisy. They overdid it; it was totally crazy.