VideoHelp Forum




  1. ok so obviously,

    Higher bitrate = Higher quality = Larger file size

    So the question is which resolution to select. Say you capture 2 files with:

    Bitrate = 1500
    A1. With Resolution 320x200
    A2. With Resolution 720x480

    and then you capture 2 more with:

    Bitrate = 8000
    B1. With Resolution 320x200
    B2. With Resolution 720x480

    What are the pros and cons of these 4 files? Why would I choose B2 over B1, or A1 over A2 (obviously I would choose B over A)? From tests I have done, the only thing I 'think' I can see is that if the capture has high motion, selecting a lower resolution with a high bitrate gives maybe less ghosting, while selecting a higher resolution with a high bitrate maybe gives a slightly sharper image in a slow-moving capture. I say maybe because it's really hard for me to see these differences. This is what I conclude should happen, so maybe I'm seeing what I want to see.

    Would someone please open my eyes on this... what resolution should I select for which situation? Thanks

    rhuala
  2. Bump... I thought this was going to be an easy question. Should this question be moved to a different area, moderator?

    rhuala
  3. Member mats.hogberg's Avatar
    Join Date
    Jul 2002
    Location
    Sweden (PAL)
    Well, the theory will be like this:
    Depending on resolution, frame rate and color depth, there will be a maximum useful bitrate. Raising the bitrate above this will not improve quality in any way. Below this limit, the quality will suffer.
    The medium used to render these pixels also has a resolution. Going above that will also be pointless.
    In theory.
    In reality - Trust your eyes!

    /Mats
  4. Member
    Join Date
    May 2001
    Location
    United States
    Bump... I thought this was going to be an easy question. Should this question be moved to a different area, moderator?
    Well, we all thought that this was just a joke... After all, you already answered your own question.

    Higher bitrate = Higher quality = Larger file size
    So, everything is a trade-off - if you are constrained to a particular file size, then you are going to have to lower the bitrate (and quality) to fit your video in.
  5. Well, we all thought that this was just a joke... After all, you already answered your own question.
    So, everything is a trade-off - if you are constrained to a particular file size, then you are going to have to lower the bitrate (and quality) to fit your video in.
    You thought it was a joke, you didn't even read the question? Then fire off a response having nothing to do with the thread? *Boggle*

    To reiterate - the fundamental question is "which resolution to select and why". Again, the pros and cons of selecting different resolutions.

    rhuala
  6. Member
    Join Date
    May 2001
    Location
    United States
    To reiterate - the fundamental question is "which resolution to select and why". Again, the pros and cons of selecting different resolutions.
    Sorry, thought you were joking again...

    Higher resolution = Higher quality = Larger file size

    So, everything is a trade-off - if you are constrained to a particular file size, then you are going to have to lower the RESOLUTION (and quality) to fit your video in.
  7. Member
    Join Date
    Sep 2002
    Location
    PA, USA
    Originally Posted by rhuala2
    To reiterate - the fundamental question is "which resolution to select and why". Again, the pros and cons of selecting different resolutions.

    rhuala
    I posted a question a while ago regarding the same thing, never really got a straight answer...
    My best guess would be that if you encode one at 352x240, it would have to be stretched more to fit the screen than a 480x480 would... though I could be very wrong.
  8. Higher resolution = Higher quality = Larger file size

    So, everything is a trade-off - if you are constrained to a particular file size, then you are going to have to lower the RESOLUTION (and quality) to fit your video in.
    Sorry, but you don't know what you're talking about and are giving bad advice. I don't mean to be rude here, but it is a bit frustrating.

    Resolution has nothing to do with file size. Bitrate is what matters for file size and quality. If you capture one file at resolution 320x200 with bitrate 8000 and another file at 720x480 with bitrate 8000 for the same time period, the files will be the same size... *sigh*
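    To put numbers on that, a rough sketch in Python (the file_size_mb helper is made up for illustration; constant bitrate assumed, container overhead ignored):

```python
# Demonstration that file size depends only on bitrate and duration,
# not resolution (constant bitrate assumed, container overhead ignored).

def file_size_mb(bitrate_kbps, duration_s):
    """Approximate size in MB of a constant-bitrate stream."""
    return bitrate_kbps * 1000 * duration_s / 8 / 1_000_000

ten_minutes = 10 * 60
low_res = file_size_mb(8000, ten_minutes)   # the 320x200 capture
high_res = file_size_mb(8000, ten_minutes)  # the 720x480 capture

print(low_res, high_res)  # 600.0 600.0 -- resolution never enters the formula
```

    Resolution only changes how those bits get spent, not how many of them there are.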

    rhuala
  9. As far as just capturing-
    It is usually best to use the highest bitrate and framesize possible, as long as space isn't an issue. That way, you can always adjust it down to the target medium (VCD, DVD, etc).

    As for authoring-
    Usually, you'll pick the highest bitrate that the medium has space for. 1.5MBPS might give you an hour of video, while 8MBPS might give you 10 minutes on a (x)VCD.
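    A rough sketch of that trade-off in Python (a 700 MB disc assumed; audio and filesystem overhead ignored, so real numbers will be somewhat lower):

```python
# Rough runtime that fits on a disc at a given constant bitrate
# (video only; audio and filesystem overhead ignored).

def minutes_on_disc(capacity_mb, bitrate_kbps):
    seconds = capacity_mb * 1_000_000 * 8 / (bitrate_kbps * 1000)
    return seconds / 60

print(round(minutes_on_disc(700, 1500)))  # ~62 minutes at a VCD-like bitrate
print(round(minutes_on_disc(700, 8000)))  # ~12 minutes at a DVD-max bitrate
```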

    Framesize is more complicated, especially at lower bitrates. At 8MBPS, the 720x480 will look perfect, and it would be a waste to use 352x240.
    At 1.5MBPS, 352x240 will look near perfect, but 720x480 will be full of artifacts, especially in fast-moving scenes, although it might be suitable for slow scenes.

    nick
  10. Member
    Join Date
    May 2001
    Location
    United States
    Okay, I see what you are asking... I'll try to redeem myself in your eyes.

    To begin with, let's forget about video and just examine a hypothetical static 8x10 picture. If we choose 320x200 as our resolution, then we will, of course, have 320 homogeneous squares in the horizontal direction and 200 homogeneous squares in the vertical direction. NOTE that these homogeneous squares have ABSOLUTELY NO DETAIL in them - if you were to view them separately, they would just be a single colored square (like a paint chip at Home Depot).
    If we choose 720x480 as our resolution, then these homogeneous squares would naturally be much smaller (roughly 1/5 the area of the others). But assemble these squares (like a mosaic) and much more detail can be shown.

    Detail comes into play when the viewing screen is large, or when the viewer is close. From across the room, a 320x200 image can look quite acceptable. However, get close and the lack of detail is very noticeable. Also, on a large viewing screen, the lack of detail in the 320x200 image is very apparent. A 720x480 resolution image, however, has more than 4 times the detail of the 320x200 image (so, I guess that you can sit correspondingly closer, or view it on a correspondingly larger screen, and get the same visual effect).

    You are correct that the file size for an 8000kbps video will be the same for both. What I was thinking of was more like bits per pixel per second.

    As you might suspect, a video with higher resolutions will need a higher bitrate just to represent all the individual pixels. But once the pixels have been adequately represented, throwing more bitrate at them will not make them any sharper or better. So, encoding a 320x200 video at 8000kbps will most likely be a waste of 6000kbps.

    Here's something that you can do to demonstrate this effect to yourself. Take an image that is 720x480 at, say, 72 pixels/inch. Lower the pixels/inch to 32, then put the resolution back to 72 pixels/inch. Save this as a new image, reload the original image, and compare the two. As you can see, the detail in the modified image is significantly less than in the original.

    This is exactly what happens when video is displayed on a viewing screen (like a TV), because the TV will scale the image to fill the screen. If you think of the TV screen as having a "maximum" resolution of 720x480, then, if you were to examine the pixels, each pixel of the 320x200 picture would be made up of roughly two pixels in both the horizontal and the vertical.

    Now on to video. Let's assume that to acceptably encode a video stream, each PIXEL requires 8 bits to properly represent. So, a 320x200 resolution "frame" will require about 500,000 bits to represent. The 720x480 will require about 2,750,000 bits (or, > 4 times). And each second has about 24 frames in it (MPEG compression is not considered in these examples). So, if we encode at a minimally acceptable bits per pixel per second rate, our file size grows with our resolutions.
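    That arithmetic can be checked with a quick Python sketch (8 bits/pixel and 24 fps assumed, as above; MPEG compression ignored):

```python
# The post's arithmetic: raw bits per frame at 8 bits/pixel,
# and per second at ~24 fps (no MPEG compression considered).

def raw_bits_per_frame(width, height, bits_per_pixel=8):
    return width * height * bits_per_pixel

small = raw_bits_per_frame(320, 200)  # 512,000 bits (~500,000)
large = raw_bits_per_frame(720, 480)  # 2,764,800 bits (~2,750,000)

print(large / small)           # about 5.4x more bits per frame
print(small * 24, large * 24)  # raw bits needed per second of video
```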

    Now this is an oversimplification, but I hope that it begins to explain what the resolution really means. Or maybe enough so that you can ask additional questions about the subject.

    Or, you can spank me again !
  11. Member
    Join Date
    Sep 2002
    Location
    PA, USA
    Originally Posted by rhuala2

    Bitrate = 1500
    A1. With Resolution 320x200
    A2. With Resolution 720x480
    So.. what you mean by...

    Originally Posted by SLK001
    Now on to video. Let's assume that to acceptably encode a video stream, each PIXEL requires 8 bits to properly represent. So, a 320x200 resolution "frame" will require about 500,000 bits to represent. The 720x480 will require about 2,750,000 bits (or, > 4 times). And each second has about 24 frames in it (MPEG compression is not considered in these examples). So, if we encode at a minimally acceptable bits per pixel per second rate, our file size grows with our resolutions.
    is that with resolution 320x200 @ 1500kbps, you have 1500kbps to fill 500,000 bits.

    At resolution 720x480 @ 1500kbps, you have 1500kbps to fill 2,750,000 bits.

    So all in all, the more bits there are to fill, the less bitrate is available for each one - and the fewer bits to fill, the better the quality that bitrate gives?
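    That intuition can be put into numbers with a short Python sketch (the bits_per_pixel helper is made up for illustration; ~24 fps assumed, compression ignored):

```python
# How thin a fixed 1500 kbps spreads over each frame's pixels
# (~24 fps assumed; uncompressed-style comparison as in the post).

def bits_per_pixel(bitrate_kbps, width, height, fps=24):
    bits_per_frame = bitrate_kbps * 1000 / fps
    return bits_per_frame / (width * height)

print(round(bits_per_pixel(1500, 320, 200), 2))  # ~0.98 bits per pixel
print(round(bits_per_pixel(1500, 720, 480), 2))  # ~0.18 bits per pixel
```

    Same bitrate, but each 720x480 pixel gets roughly a fifth of the bit budget.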

    my head hurts
  12. Okay, I see what you are asking... I'll try to redeem myself in your eyes.
    I'm glad I didn't lash out at you for calling my post a joke without fully understanding my question. It's good to see you were big enough not to backpedal with silly innuendos or petty arguing (as happens so often). Let's just say we got off to a bad start and ignore the first few posts.

    Detail comes into play when the viewing screen is large, or when the viewer is close...
    However, a 720x480 resolution image has more than 4 times the detail of the 320x200 image (so, I guess that you can sit correspondingly closer, or view it on a correspondingly larger screen, and get the same visual effect).

    As you might suspect, a video with higher resolutions will need a higher bitrate just to represent all the individual pixels. But once the pixels have been adequately represented, throwing more bitrate at them will not make them any sharper or better.
    Yes, true. My examples were exaggerated for simplicity. In reality we are choosing resolutions more like 352x480 or 480x480 or 720x480, and bitrates around 2000 to 6000. So at no point do we have nearly enough bitrate to fully represent the picture, and of course if we go beyond that point we are just throwing bits away.

    if we encode at a minimally acceptable bits per pixel per second rate, our file size grows with our resolutions.
    Not sure what you mean here - please clarify. Which program uses "bits per pixel per second" anyway?

    Here's something that you can do to demonstrate this effect to yourself. Take an image that is 720x480 at say 72 pixels/inch. Lower the pixels/inch to 32. Now, put the resolution back to 72 pixels/inch. Save this as a new image, then reload the original image, then compare the two. As you can see, the detail on our modified image is significantly less than the original
    I think your analogy to a static picture is somewhat misleading, because I am talking about equal bitrate. In your example this would hold true only if the 720x480 image had twice the bitrate (apples and oranges), jmo though...

    It is usually best to use the highest bitrate and framesize possible, as long as space isn't an issue.
    Of course space is always an issue, or else I would save 900GB files at 3000x2000 resolution... j/k. But seriously, my destination medium dictates everything, so it's a given that I'll select the bitrate that optimizes my medium.

    I posted a question a while ago regarding the same thing, never really got a straight answer...
    my best guess would be that if you encode one at 352x240, it would have to be stretched more to fit the screen than a 480x480 would... though I could be very wrong.
    I think you may be on to something there, johneboy. Maybe the key is to select the resolution that best fits your TARGET viewing device. This way, as you say, if you choose a resolution that's too low, stretching will have to be done, and if you choose one that's too high, pixels will be combined and the capture will be less efficient because bits will be wasted. So choosing 720x480, even with a high bitrate, may not be the best choice for a standard analog TV.

    Anyone have any opinions/experiences with the second part of my original post regarding ghosting vs. sharpness at different resolutions? If I'm correct, I would add to the statement above: try choosing a lower resolution if you know your capture has fast-moving scenes (e.g. a hockey game)... With an equally high bitrate, would a hockey game look better at 480x480 than 720x480, even on an HDTV?

    cheers,
    rhuala
  13. Member
    Join Date
    Sep 2002
    Location
    PA, USA
    hmmm... would it be sorta like this..
    if you have 1 can of paint, and you have to paint a wall..

    if you paint a wall that is 320x240, you're going to get better quality than if you have that same can of paint and have to paint a wall that is 720x480
  14. Yes, true. My examples were exaggerated for simplicity
    A better hypothetical example is the CVD vs SVCD.
    With a compliant SVCD, you are limited to a maximum bitrate of 2520kbps and 2 framesizes: 480x480 and 352x480.

    With most action-type movies, 2520kbps is NOT enough to encode 480x480 video without obvious compression artifacts. The video looks good overall, but macroblocks appear on fast movement.

    OTOH, 2520kbps can usually do 352x480 without any obvious macroblocking, but loses a slight amount of detail.

    Which is better? There is no clear answer; it all depends on which is more important to you - detail or artifacts.

    For me, I'll usually encode action-type movies onto 352x480, and low-action onto 480x480.
    For captures, I'll generally stay with 352x480. Not because of bitrate, but I might want to burn it on DVD someday, and 352x480 is DVD compliant (and 480x480 is not).

    So, I guess it comes down to: Detail vs Artifacts.
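    The trade-off can be put in rough numbers with a Python sketch (29.97 fps NTSC and constant bitrate assumed; compression efficiency ignored, so these are only relative figures):

```python
# The SVCD trade-off in numbers: the same 2520 kbps spread over the
# two framesizes (29.97 fps NTSC assumed; compression efficiency ignored).

def bits_per_pixel(bitrate_kbps, width, height, fps=29.97):
    return bitrate_kbps * 1000 / fps / (width * height)

full = bits_per_pixel(2520, 480, 480)  # more detail, more artifact-prone
half = bits_per_pixel(2520, 352, 480)  # fewer pixels, each gets more bits

print(round(full, 3), round(half, 3))  # 0.365 0.498
print(round(half / full, 2))           # 352x480 gives each pixel ~36% more bits
```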

    nick


