Hi,
Can anyone tell me the difference between throughput and bitrate in simple words or with some example? Is there any correlation between the two?
Throughput is the number of bits per second arriving at your machine.
I'm watching a video that has been set to a 1000 kbps bitrate, but the throughput I'm observing is 2500-3000 kbps.
It is very difficult for me to understand what bitrate actually means.
Bitrate is the average number of bits used, per second (or other unit of time), to encode a video. So 1000 kbps means 1,000,000 bits are used every second. If the video is running at 25 frames per second, each frame gets 40,000 bits (1,000,000 / 25) on average (some will get more, some less). 40,000 bits is 5,000 bytes (one byte = 8 bits). A video may have a higher bitrate during some shots and a lower one during others. The average bitrate reported by a program should be the average over the entire video. If you are using a real-time bitrate monitor (like the Status window in MPC-HC) or a graphical bitrate viewer, it may show different bitrates at different times.
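The arithmetic above can be checked in a few lines of Python (a minimal sketch using the numbers from the post):

```python
# Sketch: spreading an average bitrate across frames (numbers from the post above).
bitrate_bps = 1000 * 1000        # 1000 kbps = 1,000,000 bits per second
fps = 25                         # frames per second

bits_per_frame = bitrate_bps / fps      # average bits available per frame
bytes_per_frame = bits_per_frame / 8    # one byte = 8 bits

print(bits_per_frame)    # 40000.0
print(bytes_per_frame)   # 5000.0
```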
Throughput is a measure of how much of something is passing through something. We need some context here. What program is reporting throughput? And to what is it referring? Your video file probably also has audio, which has its own bitrate. And there may be some container overhead. And, depending on what the program is referring to, there may be transmission overhead too.
Thanks jagabo.
I'm viewing an application similar to YouTube, so basically I'm seeing the throughput of both video + audio. I know that throughput is the number of bits coming in per second.
Coming back to bitrate:
1) When does this bitrate come into the picture? While uploading the video, or while downloading (streaming) it?
2) I'm assuming there should not be any relation between bitrate and throughput. Am I right?
3) I have heard people saying that the video is getting streamed at a 1000 kbps bitrate. Is this statement correct? Basically, when streaming comes into the picture, it should be throughput: the video is being streamed at, say, 2.5 Mbit/s.
Hi jagabo,
You say that a 1000 kbps bitrate means 1,000,000 bits are used every second. So that many bits (1,000,000) are transferred per second to the client machine. I understand that as throughput. This is where I'm confused. Kindly help.
Bitrate is simply a property of the video and audio streams. You specify the bitrate when you encode. The bitrate "comes into play" anytime you do something with the video that involves it.
There should be some relationship between bitrate and throughput. The higher the bitrate the higher the throughput.
Yes. As long as they're talking about the bitrate of the video.
It depends on what you are talking about. The payload or the packaging. And exactly what you mean by throughput. The term throughput is vague -- how much of what, through what -- you have to precisely define what you mean.
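One way to picture the bitrate/throughput relationship: for smooth real-time playback, throughput must at least cover the combined stream bitrates plus overhead. A rough sketch with hypothetical numbers (the 128 kbps audio figure and the 15% overhead are assumptions for illustration, not from the thread):

```python
# Hypothetical numbers: the minimum throughput needed to stream in real
# time is roughly the total stream bitrate plus transmission overhead.
video_kbps = 1000     # video bitrate (from the thread's example)
audio_kbps = 128      # assumed audio bitrate
overhead = 0.15       # assumed ~15% packet/container overhead

min_throughput_kbps = (video_kbps + audio_kbps) * (1 + overhead)
print(round(min_throughput_kbps, 1))   # 1297.2
```

The exact overhead fraction depends on the protocol and container, but the direction of the relationship holds: the higher the bitrate, the higher the required throughput.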
-
When I play the video (which is encoded at a 1000 kbps bitrate), I do a Wireshark capture. Wireshark is a sniffer; it captures all the packets coming into and going out of your computer. When I look at the throughput graph there, it says the bits are coming into my laptop at 2.5 Mbit/s = 2500 kbps. This is what I mean by the throughput graph.
In another test, when my internet speed is 2 Mbps, the throughput observed is 250 kbps for the same video (encoded at 1000 kbps).
So, answering your question, I'm getting the throughput value from the Wireshark throughput graph.
Isn't throughput effectively the speed your computer is downloading the video? Assuming the throughput is higher than the bitrate, the system simply buffers the extra.
I'm not sure how YouTube controls the throughput speed. I noticed when viewing some low-res videos that it barely buffers beyond the playback point. But if I switch to 720p, the video buffers very quickly and is completely downloaded after just a few seconds of playback; playback then continues to the end using the data that has been buffered in the PC.
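The buffering idea above can be sketched as a toy simulation (made-up numbers; real players cap the buffer and throttle the download):

```python
# Toy simulation: when download throughput exceeds the playback bitrate,
# the surplus accumulates in the player's buffer.
bitrate_kbps = 1000       # playback consumes this much per second
throughput_kbps = 2500    # network delivers this much per second

buffer_kbits = 0
for second in range(10):              # 10 seconds of simultaneous download + playback
    buffer_kbits += throughput_kbps   # data arriving from the network
    buffer_kbits -= bitrate_kbps      # data consumed by the decoder
print(buffer_kbits)   # 15000 kilobits buffered ahead
```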
Variable Bitrate
The instantaneous bitrate at any single point in time may vary, but over an average, say the entire length of the video, it might be 1000 kbps.
There might be "peaks" or "valleys". It might spike to 5000 kbps at some points and drop to 100 kbps at others, but the average might be 1000 kbps. (These are just made-up numbers for illustration.)
Same thing for transmission payload, or throughput. At times speed increases in bursts; other times it slows down. Many factors influence this, both client side and server side.
Payload will always be larger than the total video & audio bitrate, because of overhead and lost packets.
I'm not familiar with Wireshark, but most sniffers capture the entire transmission bandwidth, both up & down. So it might not be accurate for a single video, even if you configure your firewall and router to block all other activity.
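The peaks-and-valleys point can be illustrated in a few lines (the per-second samples are made up, as in the post):

```python
# Made-up per-second bitrates with a spike and some valleys; the figure
# a tool reports is the average over the whole clip, not any one sample.
per_second_kbps = [5000, 100, 300, 200, 250, 150]

average_kbps = sum(per_second_kbps) / len(per_second_kbps)
print(average_kbps)   # 1000.0
```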
Throughput measured at the packet level (I'm guessing that's what the network sniffer is showing you) should be higher than the video bitrate. TCP (used when playing videos across a LAN) has 10 to 20 percent overhead. Since you are seeing much more than that the player may be reading ahead, buffering the data locally. If that's the case you should see the network throughput drop to zero long before the video finishes playing.
Playing from a streaming server uses UDP, which has less overhead. But you should still see a throughput higher than the video bitrate; otherwise the player will have to stop playing and wait for more data. Depending on the type of video, it may contain multiple video streams at different bitrates. The server and player dynamically decide which stream is appropriate for the current network conditions and server load. In that case you may see a much lower throughput than the total bitrate of the video because you are only getting one of the streams.
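The adaptive-stream behaviour described above can be sketched like this (the bitrate ladder and the selection rule are assumptions for illustration; real servers use more elaborate logic):

```python
# Assumed bitrate ladder: the player picks the highest-bitrate variant
# that still fits under the currently measured throughput.
stream_ladder_kbps = [250, 500, 1000, 2500]   # hypothetical variant bitrates
measured_throughput_kbps = 800                # hypothetical measurement

chosen_kbps = max(b for b in stream_ladder_kbps if b <= measured_throughput_kbps)
print(chosen_kbps)   # 500
```

If something like this is happening, it could explain seeing a 250 kbps throughput for a 1000 kbps encode on a constrained connection: the server is sending only a lower-bitrate variant.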
For testing, I recommend you use a constant bitrate file you've made yourself so you know exactly what's in it. CBR encoding will eliminate a variable.
Also be sure the network sniffer is showing UDP and TCP traffic. Make sure protocols like RTMP aren't excluded.