Most game video makers these days have no trouble recording their games even at 1080p resolution, since the games they run are mostly rated at 60 or 30 FPS. However, when one records a game at 300 FPS and then converts it with a codec to 30 FPS, a problem occurs: the converted video feels a bit choppy. It's that "my eyes hurt watching it" feeling.
Is there any advice for making the best AVI video from raw game videos with a high FPS rate? Perhaps choose 50 FPS instead of 30 to reduce that choppiness, or is it all about choosing the proper video codec?
Which codec would you choose to encode raw 1920x1080 high-FPS game video files?
-
AFAIK, "AVI video" is a very generic expression, and the AVI container itself has no upper limit on the framerate value.
Anyway, simply DON'T capture video games at framerates higher than 60fps. Case solved.
And regarding YouTube specifically, its maximum framerate is 30fps, if I am not mistaken.
So if your goal is to upload to YouTube, for example, then the best approach is to capture at 30fps and forget framerate conversions. -
Damn, you are totally clueless and don't have any idea what you are talking about. There are games that are meant to be run at over 100 FPS, because 60 FPS is just too damn slow! And there are situations in games where as much as 300 FPS is needed in order to perform difficult trick jumps, like shortcuts where one needs to climb a steep wall.
-
Games do not need to run at 300fps to jump or climb walls; that's just plain BS. 30-60fps is enough as long as the system maintains it. Name one game where you need 300fps.
I think, therefore I am a hamster. -
Want to talk about totally clueless? When a game is running at a given frame rate, say 200 fps, it doesn't mean that 200 frames per second are being displayed; it means that the game engine is rendering 200 frames per second. The actual number of frames displayed is limited by the refresh rate of the monitor. What happens is that the excess frames past what the monitor is capable of displaying are buffered, first to the front buffer (which makes up half of the advertised RAM on a video card) and then to the back buffer, which makes up the other half. Eventually the buffers get full and need to be flushed, but you never notice that most of the rendered frames were discarded, because you are already past the point where the buffered frames would be needed.
The notion that any game "needs" to run at a given frame rate in order to perform some task is idiotic beyond belief. -
Derr louie!!!!!!
HELP!!!
I tried to think, and my brain hurts!!!!!!!
LOL!!! -
That's what I was trying to tell you, actually...
I need the game running at 200 FPS in order to capture some material...
You know nothing.
Take a look at this vid for example:
http://www.youtube.com/watch?v=AUJAKmhNqtY
This is Painkiller, a game that needs 120 FPS for a constantly smooth gaming experience, and over 200 FPS for climbing the walls. 100 FPS is the minimum requirement for competitive 1 vs 1 multiplayer gaming. It was run at 125 FPS in the CPL 2005 World Tour - standard settings. Players would feel lag at 60 FPS, but I guess you don't understand that, since you are not familiar with the game's physics and movement system.
Next time don't make a fool of yourself by saying "games are fine at 60 FPS" - certainly not all of them are.
Tell that to the whole Painkiller community who play that game - I guess all of them are idiots for setting 120 - 300 FPS in the game?
Last edited by Artas1984; 17th Jul 2013 at 18:14.
-
If you are re-encoding 300 fps video and reducing the frame rate, motion is going to be less smooth than the original. There is no way to change that. You cannot reduce the frame rate without throwing some frames away, which makes motion less smooth. I have never heard of an encoder that is smart enough to choose the optimum frames to keep.
If motion looks smoother when the video card displays the game at 30 frames per second than it does when you play back the file you re-encoded from raw video, you could try capturing the game at 30 frames per second instead of re-encoding raw video.
[Edit] There is a process called "frame blending", available in software like Adobe After Effects, which I have seen used to produce slow-motion effects or smooth out time-lapse photography. I don't know enough about it to say whether it could help with your problem too.
Last edited by usually_quiet; 17th Jul 2013 at 15:38. Reason: clarity
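The frame blending mentioned above can be sketched in a few lines of Python. This is not After Effects' actual algorithm, just the general technique under simplifying assumptions: frames are modeled as flat lists of pixel values, and `blend_frames` is a hypothetical helper name.

```python
def blend_frames(frames, group_size):
    """Average each run of `group_size` frames into one output frame.

    Frames are modeled as flat lists of pixel values (a stand-in for
    real image data). Per-pixel averaging across a group approximates
    the motion blur that frame blending produces, instead of simply
    discarding most of the source frames.
    """
    blended = []
    for start in range(0, len(frames) - group_size + 1, group_size):
        group = frames[start:start + group_size]
        blended.append([sum(px) / group_size for px in zip(*group)])
    return blended

# 300 fps -> 30 fps: blend every 10 source frames into one output frame.
source = [[i, 2 * i] for i in range(30)]  # 30 tiny two-pixel "frames"
print(len(blend_frames(source, 10)))      # 3 output frames
```

Because each output frame carries information from all ten source frames, motion reads as blur rather than as the jerky skipping that plain decimation causes.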
-
Just send the name of the O.P. to your Ignore List, case solved.
If he wants to capture a Hi-Def game at 300fps, fine. If he wants to see no difference between the capture framerate and the gameplay framerate, even better. -
-
As jagabo already said, the high frame rate is not the problem. The problem is that if your video really is 300 frames per second, then 9 out of 10 frames have to be thrown away to reduce the number of frames shown per second to 30. Encoders aren't made to analyze the video to figure out which combination of frames to keep to produce the smoothest possible motion. What they can do is frame decimate, which means keep one frame, throw out the next 9, keep one frame, throw out the next 9, and so on.
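The decimation pattern just described (keep one frame, throw out the next nine) can be illustrated with a short Python sketch; `decimate` is an illustrative name, not any encoder's actual API.

```python
def decimate(frames, keep_every):
    """Naive frame decimation: keep one frame, drop the next
    `keep_every - 1`, over and over. 300 fps in with keep_every=10
    gives 30 fps out -- but motion is less smooth, since 9 out of
    every 10 frames are simply thrown away."""
    return frames[::keep_every]

one_second = list(range(300))         # one second of 300 fps video
print(len(decimate(one_second, 10)))  # 30 frames survive
```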
-
yes, most hardcore gamers don't know what the **** they are doing, they lack even the most basic understanding of how computer hardware works. that's why you get so many retarded questions related to driver issues on gaming boards.
let me make this as clear as i possibly can, it doesn't matter how fast a game engine is internally rendering frames, the maximum speed that said frames can be displayed is limited by the refresh rate of the monitor.
now i'm going to answer your question, which, if you weren't so busy being a jerk off telling people that they are clueless about some half-assed game du jour, you would have figured out by now.
since the displayed frames per second is limited by the refresh rate of the monitor then you should be capturing at a frame rate equal to the display rate, assuming the following 2 conditions are true:
1) you are doing a screen capture.
2) the game engine is running fast enough so that the instantaneous frame rate is at least equal to the monitor refresh rate.
so to make this simple for you: if you are certain that said game is running at an instantaneous frame rate of at least 75 frames per second, but your monitor is set at 60Hz, then you should capture at... can you guess the correct answer?
if you want to raise your hand and say "75 frames per second!!!", then you are a dumb ass and should not be allowed anywhere near a computer, ever again. -
yeah, that's the feeling i got, but in all fairness you would be surprised to learn that most people are not aware that max possible displayed fps = monitor refresh rate.
and most will argue with you that their eyes can actually see 300 individual frames per second, despite the fact that numerous tests show that for the average person things start becoming a blur at around 60 fps, and even elite fighter pilots with years of training max out at about 120 fps before things become a complete blur. -
I am not a gamer, so what is going on here? Maybe the 120 fps switch means the engine renders twice as fast, making the game smoother in their eyes, while still putting out 60fps on screen? Some marketing gimmick yet again? Artas1984, is the game faster with that higher fps?
-
My god, I ain't that stupid... Why do you need to bring this up? I never said anything that would imply I do not know the difference between game engine frames and frames rendered on the monitor. I am being harassed just out of speculation? Not what I expected in this kind of forum...
You've been reported. Everyone who plays that game and tries to record something sets the engine frame rate at over 120 FPS. I guess you just like to visit posts and insult people randomly, because that makes you feel good?
The deal is this: with high FPS I am able to skip more polygons, which makes me able to climb textured surfaces faster. The game does not become faster by itself.
So I need the engine rendering 200 - 300 FPS. There are folks in this forum who don't understand that and think I am an idiot for needing 300 FPS... A recording program must also capture the frames at 300 FPS, and then that raw recorded file will be converted to 30 FPS.
Thank you for not insulting me; I hope I explained why I need to capture the video at a very high frame rate...
I never said anything here as stupid as the human eye seeing over 60 FPS, or that my monitor can display over 60 FPS..
WHY DO YOU KEEP IMAGINING THINGS I WAS NOT EVEN TALKING ABOUT? WHAT KIND OF HARASSMENT IS THIS?
The discussion is only about a suitable video codec for encoding raw video files with a 200 - 300 FPS cap, compressed down to 30 FPS. Don't bring me into your BS talk about what you "imagine"... It never was about monitors or eyes...
Last edited by Artas1984; 21st Jul 2013 at 17:32.
-
I do not quite understand that. I'd ask again: is the game faster then?
edit after your edit: how to convert higher fps to 30fps was explained here; it's no easy task. You have a choice to skip frames, or interpolate, or blur and interpolate them just right (to compensate for shutter speed), something like QTGMC does -- but that is for getting 30p out of 60p, nothing like 300fps to 30fps. Think of it: how do you want to put the spatial and temporal information from ten frames into one?
Last edited by _Al_; 21st Jul 2013 at 17:53.
-
It's not the job of a codec. A codec just compresses or decompresses.
You're looking for processing filters, such as adding motion blur or frame averaging techniques.
How are you recording the 200-300 FPS now? What hardware and/or software?
The 1st thing I would do is make sure you're capping 200-300fps properly. It's not trivial - you need a beefy system and adequate disk I/O (RAID-0 or SSDs) unless you are using some type of compression. Otherwise you will get frame drops. Sporadic frame drops are bad, because your frames will not be evenly spaced (jerks in motion).
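As a rough sanity check on the disk I/O point above, here is the arithmetic for uncompressed capture in a short Python sketch. It assumes 3 bytes per pixel (e.g. raw RGB24); `raw_capture_rate` is a hypothetical helper name.

```python
def raw_capture_rate(width, height, fps, bytes_per_pixel=3):
    """Sustained write rate (bytes/s) needed to store uncompressed
    video, assuming 3 bytes per pixel (e.g. raw RGB24)."""
    return width * height * bytes_per_pixel * fps

rate = raw_capture_rate(1920, 1080, 300)
print(rate)  # 1866240000 bytes/s, i.e. roughly 1.87 GB/s sustained
```

Roughly 1.87 GB/s sustained is far beyond a single hard drive of that era, which is exactly why RAID-0, SSDs, or lossless compression come up.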
While there might be a reason to play at high FPS, there is no reason to RECORD at 200-300 FPS if your desired output format is only 30fps or 60fps - unless you want slow motion or are doing some special editing. Recording at a lower FPS will minimize the frequency of frame drops because fewer resources are used. Also, the higher the recording settings, the more negative the impact on in-game FPS (i.e. the higher the recording settings and FPS, the more laggy the game will feel). -
It is not faster in terms of movement speed; it just becomes smoother - you can move with greater precision, don't get stuck, gain fast acceleration... One needs to feel it to understand. There are quite a few games like this which by default should be run at over 100 FPS. Sad that some people think 60 FPS is enough for all games and insult others for thinking otherwise...
-
This is true. There are perceptible control differences with the mouse (if you have an adequate mouse).
But the difference between 60 vs 120 FPS (even on a 60Hz or 120Hz monitor) is slight, even in these types of games. And I doubt anyone can discern >120 FPS, let alone 200+ FPS.
The most important parameter is actually the MINIMUM FPS. The lowest dip in FPS actually achieved on that system is way, way more important than what you set the FPS of any game to. Many competitive players and pros actually turn down the "eye candy" settings to achieve a higher minimum FPS. -
You still haven't said what refresh rate your monitor is running. No matter how fast the game is rendering, if you're running on a 60 Hz monitor, you're only seeing 60 frames per second. And your 300 fps screen caps only contain 60 different frames per second. You can easily verify that by opening the cap with an editor and stepping through it frame by frame. You'll see 60 different frames, each repeated 5 times, every second.
That said, I can see a scenario where 300 fps is more playable, even on a 60 Hz monitor: with the game making more in-between calculations than there are visible frames, it might have better collision detection. Of course, a well written game will do this even if it's not rendering all those in-between frames. But some may not.
Last edited by jagabo; 21st Jul 2013 at 18:37.
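jagabo's frame-by-frame check above (60 unique frames, each repeated 5 times, in one second of a 300 fps capture from a 60 Hz display) can be automated. A minimal Python sketch, with frames stood in by comparable values and `count_frame_runs` a hypothetical helper:

```python
def count_frame_runs(frames):
    """Collapse consecutive identical frames into [frame, repeat_count]
    runs, so a capture can be checked for duplicated display frames."""
    runs = []
    for frame in frames:
        if runs and runs[-1][0] == frame:
            runs[-1][1] += 1
        else:
            runs.append([frame, 1])
    return runs

# Simulate one second captured at 300 fps from a 60 Hz display:
capture = [i // 5 for i in range(300)]  # each display frame appears 5x
runs = count_frame_runs(capture)
print(len(runs), runs[0][1])  # 60 unique frames, each repeated 5 times
```

On real footage you would compare frame hashes or pixel data instead of integers, but the run-counting logic is the same.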
-
if the recording program is doing a screen capture, i.e. it's capturing what is being displayed on the screen, then you must capture at a frame rate that equals the monitor refresh rate assuming that that game engine is running at a rate that exceeds the monitor refresh rate. i don't know how much clearer i can make it for you.
if you are capturing with a program that actually records what the game engine is rendering then you must capture at a rate that matches the internal instantaneous rate of change of the game engine.
that is to say, if your game is running internally at 200-300 frames per second, you do not capture at 300 frames per second; you must capture at the minimum rate that the game engine is delivering each and every second. it's silly to capture at 300 frames per second just because the game engine is averaging 300 frames per second; to achieve that average, the instantaneous rate has to be higher at some points and lower at others. you need to find the low number and set that as the capture rate.
does the above make sense to you?
The deal is this: with high FPS i am able to skip more polygons, which makes me able to climb textured surfaces faster. The game does not become faster by itself.
So i need the engine rendering 200 - 300 FPS. There are folks in this forum who don't understand that and think i am an idiot for doing that (needing 300 FPS)...
i can assure you that no programmer would ever code a game engine like that, because it would marginalize a large portion of his customers that don't have setups capable of achieving such frame rates.
in fact, one of my pastimes is creating cheesy first person shooters using Dark Basic (i hope to finally get around to releasing a finished product late this year), and i can't even imagine any developer adding code to his game engine that limits jump height by the frame rate. if such a thing actually exists in some game engine, like Painkiller, then you should report it to the developer, because it has to be a bug. -
It's not about frames, but about motion blur. The human eye doesn't operate on framerates, but when your eyes are tracking motion, shorter frame display times are better (either via flicker like a CRT, or via more Hz for a flicker-free display). Flicker is a method of reducing motion blur without needing to increase framerate (both methods shorten the length of time an individual frame is displayed). Most motion blur on a modern LCD is now caused by sample-and-hold, rather than by the LCD's pixel speed limitations.
Animation of sample-and-hold motion blur (view TestUFO links in Google Chrome for perfect smooth animations).
www.testufo.com/#test=eyetracking
This demonstrates eye tracking motion blur caused by sample-and-hold, which will occur even on fast-pixel-transitions. Increasing Hz (or doing strobing) reduces this type of blur.
Animation of moving photo creating motion blur on most LCD displays
1. Moving photo www.testufo.com/#test=photo
2. Stationary photo www.testufo.com/#test=photo&pps=0
On a 60Hz LCD, there's 16 pixels of motion blur difference between the two
On a 120Hz LCD, there's 8 pixels of motion blur difference between the two
On a 240Hz LCD (e.g. Sony Motionflow), there's only 4 pixels of motion blur difference
On short-flicker displays (e.g. CRT or LightBoost), there's so little blur that (1) and (2) look exactly the same in clarity, since the flicker avoids the eye-tracking-based motion blur.
Animation of black-frame insertion (strobing to reduce blur)
www.testufo.com/#test=blackframes
Comparison of framerates, 30fps vs 60fps
www.testufo.com/#test=framerates
In high-end scientific vision tests of best-case scenarios, there were situations where a human could easily tell that 500fps@500Hz was more fluid because of motion blur effects. Vpixx.com sells a 500Hz-capable projector used for vision research, and elsewhere there are some monochrome DLPs capable of 1000Hz (at 1 bit per pixel), also used for vision research. There are always side effects to using a discrete framerate to represent moving objects, since eye tracking is analog. When pixel speed is not the limiting factor, mathematically: 1 millisecond of sample-and-hold translates to 1 pixel of motion blur for every 1000 pixels/second of panning motion -- similar to www.testufo.com/#test=photo ... The faster the motion, the more motion blur there is on a sample-and-hold display such as an LCD.
Anyway, fast pans in video games are a torture test case -- there are crisp edges, sharp graphics, no video filtering -- so it's easier to see differences between ultra-high framerates than it is with video.
Last edited by Mark Rejhon; 5th Aug 2013 at 16:26.
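The rule of thumb above (1 ms of sample-and-hold translates to 1 pixel of blur per 1000 px/s of panning motion) is easy to put into a formula. A tiny Python sketch, with `sample_and_hold_blur_px` a hypothetical name:

```python
def sample_and_hold_blur_px(hold_time_ms, pan_speed_px_per_s):
    """Rule of thumb: 1 ms of sample-and-hold produces 1 pixel of
    eye-tracking motion blur per 1000 px/s of panning speed, when
    pixel response time is not the limiting factor."""
    return hold_time_ms * pan_speed_px_per_s / 1000.0

# A 60 Hz LCD holds each frame ~16.7 ms; a 1000 px/s pan smears ~16.7 px.
print(round(sample_and_hold_blur_px(1000.0 / 60, 1000), 1))  # 16.7
```

Doubling the refresh rate halves the hold time, which matches the 16 px / 8 px / 4 px progression quoted for the 60Hz, 120Hz, and 240Hz displays above.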