ok so obviously,
Higher bitrate = Higher quality = Larger file size
So the question is which resolution to select. Say you capture 2 files with:
Bitrate = 1500
A1. With Resolution 320x200
A2. With Resolution 720x480
and then you capture 2 more with:
Bitrate = 8000
B1. With Resolution 320x200
B2. With Resolution 720x480
What are the pros and cons of these 4 files? Why would I choose B2 over B1, or A1 over A2 (obviously I would choose B over A)? From tests I have done, the only thing I 'think' I can see is that if the capture has high motion, selecting a lower resolution with a higher bitrate gives maybe less ghosting, yet selecting a higher resolution with a higher bitrate maybe gives a slightly sharper image in a slow-moving capture? I say maybe because it's really hard for me to see these differences. This is what I conclude should happen, so maybe I'm seeing what I want to see.
Would someone please open my eyes on this... what resolution should I select for what situation? Thanks
rhuala -
Bump... I thought this was going to be an easy question. Should this question be moved to a different area, moderator?
rhuala -
Well, the theory will be like this:
Depending on resolution, frame rate and color depth, there is a maximum useful bitrate. Raising the bitrate above this will not in any way improve quality; below this limit, the quality will suffer.
The medium used to render these pixels also has a resolution. Going above that will also be pointless.
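Putting rough numbers on that ceiling - a minimal Python sketch (the 29.97 fps and the 12 bits/pixel of 4:2:0 color are assumed figures, not anything from this thread):

# Past the uncompressed data rate, extra bitrate cannot add information.
def uncompressed_kbps(width, height, fps=29.97, bits_per_pixel=12):
    return width * height * bits_per_pixel * fps / 1000

for w, h in [(320, 200), (720, 480)]:
    print(f"{w}x{h}: ~{uncompressed_kbps(w, h):,.0f} kbps uncompressed")
# 320x200: ~23,017 kbps; 720x480: ~124,292 kbps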
In theory.
In reality - Trust your eyes!
/Mats -
Bump... I thought this was going to be an easy question. Should this question be moved to a different area, moderator?
Higher bitrate = Higher quality = Larger file size -
Well, we all thought that this was just a joke... After all, you already answered your own question. So, everything is a trade-off - if you are constrained to a particular file size, then you are going to have to lower the bitrate (and quality) to fit your video in.
To reiterate - the fundamental question is "which resolution to select and why". Again, the pros and cons of selecting different resolutions.
rhuala -
To reiterate - the fundamental question is "which resolution to select and why". Again, the pros and cons of selecting different resolutions.
Higher resolution = Higher quality = Larger file size
So, everything is a trade-off - if you are constrained to a particular file size, then you are going to have to lower the RESOLUTION (and quality) to fit your video in. -
Originally Posted by rhuala2
my best guess would be that if you encode one at 352x240, it would have to be stretched more to fit the screen than a 480x480 would... though I could be very wrong. -
Higher resolution = Higher quality = Larger file size
So, everything is a trade-off - if you are constrained to a particular file size, then you are going to have to lower the RESOLUTION (and quality) to fit your video in.
Resolution has nothing to do with file size. Bitrate is what matters for file size and quality. If you capture one file at 320x240 with bitrate 8000 and another file at 720x480 with bitrate 8000 for the same length of time, the files will be the same size... *sigh*
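A quick sanity check in Python (a minimal sketch; the one-hour running time is just an example):

# File size depends only on bitrate and running time, never on resolution.
def file_size_mb(bitrate_kbps, seconds):
    return bitrate_kbps * 1000 / 8 * seconds / 1_000_000  # bits -> MB

# One hour at 8000 kbps, whether captured at 320x240 or 720x480:
print(file_size_mb(8000, 3600))  # 3600.0 MB either way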
rhuala -
As far as just capturing-
It is usually best to use the highest bitrate and framesize possible, as long as space isn't an issue. That way, you can always adjust it down to the target medium (VCD, DVD, etc.).
As for authoring-
Usually, you'll pick the highest bitrate that the medium has space for. 1.5 Mbps might give you an hour of video, while 8 Mbps might give you 10 minutes on a (x)VCD.
Framesize is more complicated, especially at lower bitrates. At 8 Mbps, the 720x480 will look perfect, and it would be a waste to use 352x240.
At 1.5 Mbps, 352x240 will look near perfect, but 720x480 will be full of artifacts, especially in fast-moving scenes, although it might be suitable for slow scenes.
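The space trade-off is easy to work out - a small Python sketch (the 700 MB CD capacity is my round number, and audio/muxing overhead is ignored):

# Minutes of video that fit on a disc at a given video bitrate.
def minutes_on_disc(capacity_mb, bitrate_kbps):
    return capacity_mb * 8000 / bitrate_kbps / 60  # MB -> kbits -> minutes

print(minutes_on_disc(700, 1500))  # ~62 minutes at 1.5 Mbps
print(minutes_on_disc(700, 8000))  # ~12 minutes at 8 Mbps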
nick -
Okay, I see what you are asking... I'll try to redeem myself in your eyes.
To begin with, let's forget about video, and just examine a hypothetical static 8x10 picture. If we choose 320x200 as our resolution, then we will, of course, have 320 homogeneous squares in the horizontal direction, and 200 homogeneous squares in the vertical direction. NOTE that these homogeneous squares have ABSOLUTELY NO DETAIL in them - if you were to view them separately, they would just be a single colored square (like a paint chip at Home Depot).
If we choose 720x480 as our resolution, then these homogeneous squares would naturally be much smaller (roughly 1/4 the size of the other). But assemble these squares (like a mosaic) and much more detail can be shown.
Detail comes into play when the viewing screen is large, or when the viewer is close. From across the room, a 320x200 image can look quite acceptable. However, get close and the lack of detail is very noticeable. Also, on a large viewing screen, the lack of detail in the 320x200 image is very apparent. However, a 720x480 resolution image has better than 4 times the detail of the 320x200 image (so, I guess that you can sit 4 times as close, or view it on a screen 4 times as large, and get the same visual effect).
You are correct that the file size for a 8000kbps video will be the same for both. What I was thinking of was more like bits per pixel per second.
As you might suspect, a video at a higher resolution will need a higher bitrate just to represent all the individual pixels. But once the pixels have been adequately represented, throwing more bitrate at them will not make them any sharper or better. So, encoding a 320x200 video at 8000kbps will most likely be a waste of 6000kbps.
Here's something that you can do to demonstrate this effect to yourself. Take an image that is 720x480 at, say, 72 pixels/inch. Lower the pixels/inch to 32, then put the resolution back to 72 pixels/inch. Save this as a new image, reload the original image, and compare the two. As you can see, the detail in our modified image is significantly less than in the original - this is exactly what happens when it is displayed on a viewing screen (like a TV), because a TV will scale the image to fill the screen. If you were to look at a TV screen in terms of resolution and assume that its "maximum" resolution is 720x480, you would see, if you were to examine the pixels, that each pixel of the 320x200 picture is made up of roughly two pixels in both the horizontal and vertical.
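That experiment is easy to script - a minimal sketch using the Pillow imaging library (the file names are placeholders):

from PIL import Image

# Downscale to throw detail away, then upscale back - the lost detail
# never comes back, which is what a TV's scaler makes visible.
orig = Image.open("original_720x480.png")
small = orig.resize((320, 200), Image.BILINEAR)
back = small.resize((720, 480), Image.BILINEAR)
back.save("roundtrip_720x480.png")  # compare side by side with the original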
Now on to video. Let's assume that to acceptably encode a video stream, each PIXEL requires 8 bits to properly represent. So, a 320x200 resolution "frame" will require about 500,000 bits to represent. The 720x480 will require about 2,750,000 bits (or, > 4 times). And each second has about 24 frames in it (MPEG compression is not considered in these examples). So, if we encode at a minimally acceptable bits per pixel per second rate, our file size grows with our resolutions.
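Running that arithmetic over the four captures from the first post (a minimal sketch; it keeps the 24 fps figure and ignores MPEG compression, as above):

# Bits available per pixel per frame for each capture.
captures = {"A1": (320, 200, 1500), "A2": (720, 480, 1500),
            "B1": (320, 200, 8000), "B2": (720, 480, 8000)}
for label, (w, h, kbps) in captures.items():
    bpp = kbps * 1000 / (w * h * 24)
    print(f"{label} {w}x{h} @ {kbps} kbps: {bpp:.2f} bits/pixel/frame")
# A1: 0.98, A2: 0.18, B1: 5.21, B2: 0.96 - so A2 is starved, B1 is
# mostly wasted bits, and B2 has about the same per-pixel budget as A1.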
Now, this is an oversimplification, but I hope that this begins to explain what the resolution really means. Or maybe enough so that you can ask additional questions about the subject.
Or, you can spank me again!
-
Originally Posted by rhuala2
Originally Posted by SLK001
at a resolution of 720x480 @ 1500kbps you have 1500kbps to fill 2,750,000 bits.
so, all in all, less bitrate is available if there are more bits to fill
with fewer bits to fill, the bitrate gives better quality?
my head hurts
-
Okay, I see what you are asking... I'll try to redeem myself in your eyes.
Detail comes into play when the viewing screen is large, or when the viewer is close...
However, a 720x480 resolution image has better than 4 times the detail of the 320x200 image (so, I guess that you can sit 4 times as close, or view it on a screen 4 times as large, and get the same visual effect)
As you might suspect, a video with higher resolutions will need a higher bitrate just to represent all the individual pixels. But once the pixels have been adequately represented, throwing more bitrate at them will not make them any sharper or better.
if we encode at a minimally acceptable bits per pixel per second rate, our file size grows with our resolutions.
Here's something that you can do to demonstrate this effect to yourself. Take an image that is 720x480 at, say, 72 pixels/inch. Lower the pixels/inch to 32, then put the resolution back to 72 pixels/inch. Save this as a new image, reload the original image, and compare the two. As you can see, the detail in our modified image is significantly less than in the original
It is usually best to use the highest bitrate and framesize possible, as long as space isn't an issue.
I posted a question a while ago regarding the same thing, never really got a straight answer...
my best guess would be that if you encode one at 352x240, it would have to be stretched more to fit the screen than a 480x480 would... though I could be very wrong.
Anyone have any opinions/experiences with the second part of my original post regarding ghosting vs. sharpness at different resolutions? If I'm correct, I would add to the statement above: try choosing a lower resolution if you know your capture has fast-moving scenes (i.e. a hockey game)... With an equally high bitrate, would a hockey game look better at 480x480 than at 720x480, even on an HDTV?
cheers,
rhuala -
hmmm... would it be sorta like this..
if you have 1 can of paint, and you have to paint a wall..
if you paint a wall that is 320x240, you're going to get better quality than if you have that same can of paint and have to paint a wall that is 720x480 -
Yes, true. My examples were exaggerated for simplicity.
With a compliant SVCD, you are limited to a bitrate of 2520kbps and 2 framesizes: 480x480 and 352x480.
With most action-type movies, 2520kbps is NOT enough to encode the video at 480x480 without obvious compression artifacts. The video looks good, but macroblocks show up on fast movement.
OTOH, 2520kbps can usually do 352x480 without any obvious macroblocking, but loses a slight amount of detail.
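(Putting numbers on that, using the same arithmetic as the sketch above: at 2520kbps and 29.97 fps, 480x480 gets roughly 0.36 bits per pixel per frame while 352x480 gets roughly 0.50, which is why the smaller frame holds up better in motion.)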
Which is better? There is no clear answer, it all depends on which is more important to you- Detail or Artifacts?
For me, I'll usually encode action-type movies onto 352x480, and low-action onto 480x480.
For captures, I'll generally stay with 352x480. Not because of bitrate, but because I might want to burn it to DVD someday, and 352x480 is DVD compliant (and 480x480 is not).
So, I guess it comes down to: Detail vs Artifacts.
nick