I noticed something rather surprising recently. My encode speeds using x264 seem to be a strong function of the bit rate.
Most of my encodes are for web delivery and get re-encoded anyway. Rather than using 2-pass VBR, I prefer 1-pass CBR close to Blu-ray specs, like 35 Mbps, to maximize delivered quality.
However, I was recently playing around with a project where the only thing I changed was the CBR of the encode. I encoded it three times: 35 Mbps, 20 Mbps, and 5 Mbps. As the bit rate decreased, the fps got faster and faster. And this was not some minor difference. I didn't write down the speeds, but I want to say 5 Mbps was something like 8x faster than 20 Mbps.
This surprises me because I assumed, naïvely of course, that an encoder has to work harder at low bit rates versus high bit rates. Does anyone have any thoughts on this? Thanks.
-
You may imagine this like progressive quality: the encoder starts with a rather coarse quantization, and if the result does not yet exceed the bitrate or rate-factor limit, it tries again with a finer quantization ... and again. Very simplified; I hope a developer can explain it a little better but still just as descriptively.
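Very roughly, that search can be sketched as a loop. This is a toy model of my own, not x264's actual rate control, and the cost numbers are made up; it only illustrates why a tight bit budget stops the refinement earlier (fewer passes, so faster):

```python
# Toy model of the "coarse, then finer" idea: try ever finer quantization
# steps and stop once the frame's bit cost would exceed the budget.
# Purely illustrative; real x264 rate control is far more sophisticated.

def bits_for_quant(qstep: int) -> int:
    # Pretend cost model: a finer quantization (smaller step) costs more bits.
    return 100_000 // qstep

def pick_quantizer(bit_budget: int, steps=range(64, 0, -4)) -> int:
    chosen = None
    for q in steps:                      # coarse (64) down to fine (4)
        if bits_for_quant(q) > bit_budget:
            break                        # next refinement would bust the budget
        chosen = q
    return chosen

# A tight budget (think 5 Mbps) stops the search early at a coarse quantizer;
# a generous budget (think 20 Mbps) lets the loop run longer and refine further.
print(pick_quantizer(5_000))
print(pick_quantizer(20_000))
```

Again, just a sketch: the point is that the low-bitrate case exits the loop sooner, which lines up with the faster encodes observed above.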
-
Well, that makes sense. Maybe another way to visualize it: the encoder acts like a focus ring on a lens, starting with a blurry image and gradually bringing the frame into focus until it bumps up against the bit rate or rate-factor limit. Maybe not a perfect analogy, but it seems like what you have described. Thanks.
-
I too would think that the more you want to compress (lower bitrates), the harder the encoder has to work, so it would make logical sense that this would increase encode times.
Then again, it also makes sense that less data is being passed through at lower bitrates, hence faster encode times. If you encode to a smaller resolution, for example, you notice a speed increase due to the reduced amount of data being worked on.
Maybe both are true, but one effect is much greater in net result than the other.
Maybe with recent builds of x264 the compression algorithms have been optimized to lower complexity, so the compression itself isn't such hard work anymore, and the speed increase simply follows from moving less data at lower bitrates.
This is all things being equal, of course, in your other settings.
I haven't encoded at particularly low bitrates in a long time now, but this is something I'd like to play around with. -
The only reason I even stumbled across this was that I was trying to find a suitable bit rate to upload a video to Facebook, which has a 1.8 GB file size limit. It was Full HD content, but since I know most people only watch embedded video as-is rather than expanding to full screen, I figured, "What is the point of a high bit rate upload?"
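As an aside, the bitrate that fits a given size cap is simple arithmetic. The 1.8 GB limit is from the situation above; the 20-minute duration and 192 kbps audio figure are hypothetical numbers of mine for illustration:

```python
# Back-of-the-envelope: highest average video bitrate that fits a size cap.
# 1.8 GB limit from the Facebook example; clip length and audio rate assumed.

def max_bitrate_kbps(size_limit_gb: float, duration_s: float,
                     audio_kbps: int = 192) -> int:
    """Total kilobits that fit, minus audio, spread over the duration."""
    total_kbits = size_limit_gb * 8_000_000   # 1 GB = 8,000,000 kbit (decimal)
    return int(total_kbits / duration_s) - audio_kbps

print(max_bitrate_kbps(1.8, 20 * 60))  # max average kbps for a 20-minute clip
```

So for a hypothetical 20-minute clip, roughly 12 Mbps total is the ceiling before the file busts the limit, which is well below the 35 Mbps I'd normally use.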
But with all that said, it does seem somewhat counter-intuitive that, starting from say a 220 Mbps master, an encoder would have an easier time producing a low bit rate version than a high bit rate version. But it sounds like that is not how the programming works. -
Think of your chosen bit rate as a target for the encoder to reach. If it is given a demanding target (a low bit rate), it will work very hard on each frame to find the space savings needed to hit that rate, whereas if you give it an easy target (a high bit rate), it will spend less time on each frame because it can simply spend more bits and move on instead of carefully analysing the frame for patterns and other space-saving characteristics.
-
Oops, I read the OP completely wrong. Yeah, I thought it would increase with lower bit rates too.
-
I guess with a lower bit rate you're telling the encoder that it can only do so much work per frame, so lower bit rate = less work = more fps.
-
More bitrate = more data for CABAC to compress.
Btw, for maximum quality over a network, use --vbv-maxrate and --vbv-bufsize with 2-pass encoding. A maxrate of 35000 and a buffer size of 35000 (1 second) or 70000 (2 seconds) should do the trick. You could even reduce the bitrate a bit to save bandwidth. -
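To make the arithmetic concrete, here is a small helper of my own that derives those numbers (bufsize = maxrate x seconds of buffer) and assembles the matching x264 arguments. The --pass, --bitrate, --vbv-maxrate and --vbv-bufsize options are real x264 flags; the helper name and structure are just a sketch:

```python
# Derive x264 VBV arguments for a capped-rate 2-pass encode.
# bufsize is simply maxrate (kbps) times the desired buffer length in seconds.

def vbv_args(maxrate_kbps: int, buffer_seconds: int, npass: int) -> list:
    """Build a list of x264 rate-control arguments."""
    bufsize = maxrate_kbps * buffer_seconds
    return [
        "--pass", str(npass),
        "--bitrate", str(maxrate_kbps),
        "--vbv-maxrate", str(maxrate_kbps),
        "--vbv-bufsize", str(bufsize),
    ]

# 35 Mbps cap with a 2-second buffer, second pass of a 2-pass encode:
print(" ".join(vbv_args(35000, 2, 2)))
```

This would print the argument string to splice into the full x264 command line (the rest of the command, input/output files and so on, is up to you).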
I am not sure how rigorous this test is, but I decided to encode a 15-second clip from my project and see how the speed varies as a function of the bit rate. This is the x264 command I used:
Code:
x264.exe --bitrate XXXXX --preset slow --level 4.1 --ref 4 --tune grain --crf 17 --vbv-maxrate XXXXX --vbv-bufsize XXXXX -o "video.264" "signpost.avs"
Anyway, I am glad I figured this out. I didn't realize this was a way to speed up my encodes. Every little trick in the book helps.
-
Yes, buffer settings can be used for that together with CRF, e.g. to imitate CBR (for a steady internet stream, etc.) or when you don't want the bitrate to really drop for any scene: set CRF somewhat lower so the bitrate doesn't fall as far, and let the VBV buffers cut off the peaks. If 1200 kbps is needed, I'd set the max to 1000, for example.
-
x264 will use only one of CBR or CRF mode. And a (more or less) constant bitrate in only one pass is suitable for broadcasting over limited bandwidth, but will result in quite varying quality (nice during stills, worse during action).
"Haste makes Waste". Decide for yourself whether you prefer to whine about speed today or about loss of quality tomorrow. -
This could be a huge thread on its own.
What was the encoding saying in the older days? Don't quote me 100%, but the traditional encoding mindset was, "You have THREE options: speed, quality, and lower file size. Choose TWO." What was interesting is that x264 made it a point to offer all three on one encode. Somewhere around 2010-ish they seem to have achieved this, I believe, or so it is said.
At any rate, given a situation where I'd have to pick two, I'd always make sure quality was the first choice for the long term. Viewing satisfaction that could last for years beats whining about encode time, which you can plan around with an overnight encode here and there.
However, lately, when I want to watch something short-term, such as on my phone, and then delete it (while still retaining the source), high quality and slower settings aren't as important. One good quick-and-dirty small-file encode is good enough here.
But this is just me. -
-
He means the "older days", which I guess is relative... But if you reference something from the previous generation, like MPEG-4 ASP (Xvid/DivX) or MPEG-2, there was a point in time early on with x264 when you didn't have all three: the new kid on the block was actually worse than MPEG-2. Eventually there came a point where it produced faster, higher quality, and lower file size than both Xvid and MPEG-2. New CPU architectures, instructions, and better threading also helped.
We don't have that right now with HEVC vs. AVC either; HEVC is just too slow. But it's early on in HEVC development. Hopefully we will be able to say the same thing about x265 and have a déjà vu moment -
Originally Posted by poisondeathray
Actually, you can refer to that saying of picking two (among speed, file size and quality) with pretty much any codec or video format that is early in its cycle, which included x264. I do remember a time (was it somewhere between 2005-2009?) when x264 was S-S-S-SSSSSSSSSLLLLLLLLLLOOOOOOOOOOO-O-O-O-WWW and you had to compromise some form of quality or file size to speed it up. You couldn't pick all three.
But we all know x264 has matured way beyond that today. I can now get good quality with x264, and low file sizes, at speeds that have dramatically increased; speeds that are better today even on the same 2006 PC I used for my older x264 encodes. Even on the same platform, under similar circumstances, I notice the changes, which is what I meant by getting all three now, relatively speaking.
We've even achieved this with MPEG-2, with a good encoder like CCE: fast, high quality, and good file size compared to the MPEG-2 of the older days. And we have achieved this for MPEG-4 ASP formats like DivX/Xvid, and pretty much achieved it for H.264 with x264.
As for x265, I believe that's coming too (getting all three - speed, quality and low file size). And yeah, I'm getting déjà vu all over again going way back to the DivX/Xvid vs x264 days, or even the DivX vs MPEG-1/2 era.
At least this time around we don't have to deal with that (ugghhh!) ridiculous "fit-it-on-a-CD" mindset. -
After that DivX/Xvid vs x264 era, where I was much more actively involved (and, before that, watching the development of DivX from 2001-ish), I confess the pains of playing with a new codec (or whatever it's appropriately called today), and all that debate and claptrap, got wearisome, which is why I haven't really played with x265 more than a bit. I admit I sat this one out a little.
Ok, I'll stop whining. Although experimenting with a developing "next thing" is masochistic in nature, it can be pleasing. I can't say I don't love this stuff.
Will play around with x265 more now. -
-
Well, yes and no, and depends on what you're talking about, and which benchmarks.
Yes, it's always true in theory that you have only two, at any point in time, regardless of how mature a lossy codec is.
But in practice, over time, you can get significant improvements in all three, all on the same encode and on the same platform, which relatively means you can get all three when comparing benchmarks before and after.
The need to pick two still applies at any maturity level; it's just that all the benchmarks have been raised. -
All you're saying is that codecs get better with time. The bars move with time, but at any point in time you can only have two of the three goals.
-
I can also say that the advantage of choosing any two of the three goals diminishes with time, or produces less benefit overall.
Well, if you really want hardcore theory, as written on life cycles, maturity rates, and the law of diminishing returns: over an infinite amount of time you would have all three choices, since the rate of change (the deltas) asymptotically reduces to zero. In other words, over the timeline you get less and less benefit from choosing two: smaller and smaller speed increases, smaller and smaller bitrate savings, smaller and smaller quality gains.
Of course, this is ridiculous and will never happen in the real world, just as you can't choose an infinite bitrate to make a lossy codec produce a lossless encode, except maybe in theory.
In practice, since choosing two isn't as sensitive at the mature stage of a codec (maybe even insignificant in some cases), even against present benchmarks, which now arguably applies to current, proper usage of x264, you may as well say you have all three choices, regardless of the finite theory still saying you get two (on which you are still correct).
My head is spinning now.