VideoHelp Forum




  1. Hello fellas. I would really appreciate some feedback regarding this issue I'm confused about. Thanks in advance.
    Suppose we have two MP4 files of exactly the same size, frame rate, and length, but different resolutions.

    File A: 1920x1080, 3 GB, 60 min, 30 fps, 6724 kbps (0.108 bits/pixel)

    File B: 1280x720, 3 GB, 60 min, 30 fps, 6724 kbps (0.243 bits/pixel)

    Which one would look better in practice on a standard 1080p monitor/TV? And will it depend on how the hardware upscales the video in the 720p case?
    (Consider normal footage without too much fast movement; the bits-per-pixel figures above are worked out in the sketch below.)
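
    For reference, the bits-per-pixel figures above are just the video bitrate divided by the number of pixels processed per second. A minimal Python sketch of that arithmetic, using only the numbers from the two files listed above:

    Code:
    def bits_per_pixel(width, height, fps, bitrate_kbps):
        """Average number of bits available for each pixel of each frame."""
        return bitrate_kbps * 1000 / (width * height * fps)

    print(round(bits_per_pixel(1920, 1080, 30, 6724), 3))  # File A: ~0.108
    print(round(bits_per_pixel(1280, 720, 30, 6724), 3))   # File B: ~0.243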


    Sent from my iPhone 6+ using Tapatalk
  2. That's not enough information to answer. Even assuming they are from the same direct source, the specific codec, encoder, and encoding settings used can make a significant difference. Yes, the method of scaling can also make a difference. If all other factors are the same and you are deciding how to encode, just do some representative tests, as sketched below.
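
    If you want to run that kind of test, here is a minimal sketch (assuming ffmpeg with libx264 is installed; the source filename, clip offset, sample length, and bitrate below are placeholders, not values taken from this thread):

    Code:
    import subprocess

    # Cut a short, representative sample without re-encoding (placeholder offset/length).
    subprocess.run(["ffmpeg", "-y", "-ss", "300", "-i", "source.mp4",
                    "-t", "30", "-c", "copy", "sample.mp4"], check=True)

    # Encode the sample at both resolutions with the same average video bitrate,
    # then compare the results on the display you actually plan to watch on.
    for out_name, scale in [("test_1080.mp4", "scale=1920:1080"),
                            ("test_720.mp4", "scale=1280:720")]:
        subprocess.run(["ffmpeg", "-y", "-i", "sample.mp4",
                        "-c:v", "libx264", "-b:v", "6724k", "-vf", scale,
                        "-c:a", "copy", out_name], check=True)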
  3. Even the properties of the video itself make a difference. Is it very sharp to begin with? Is there a lot of grain? And then there are your viewing habits: are you going to watch sitting very close to a large 1080p screen, or 10 feet away from a 32 inch screen? There's no general answer.
  4. Thanks for the feedback. Assume the files are encoded from the same RAW camera file using the same encoding settings, apart from the difference in resolution and bits per pixel.

    I'm mostly on the move, so I don't have access to a big screen at the moment, but I'm planning to buy an HDTV when I return home. I was asking because I come across many files on the web that have almost the same bitrate but different resolutions, and I download files to the cloud whenever I'm free, for future use. Long story short, I'd be happy to know the general view among professionals.


    Sent from my iPhone 6+ using Tapatalk
  5. Originally Posted by jagabo View Post
    There's no general answer.
    OK thanks.



    Sent from my iPhone 6+ using Tapatalk
  6. I'm not sure I understand the question, because to me the answer seems obvious. If both files are the same size, then they must be the same bitrate. This is always true if they are encoded using constant bitrate, because that is the ONLY thing that determines size. So, if this is the case, the file that is upscaled to higher resolution will, for most normal video, have more artifacts because that same number of bits per second must "chase around" more pixels.

    If you instead use a "constant quality" encoder, you end up with the same conclusion, because the only way to get the same file size with a higher resolution version of the same video is to use a lower quality setting on the higher resolution version.

    Finally, jagabo, as usual, is correct that in extreme cases, you could get similar results. For instance, if you took video of a building using a camera mounted on a tripod, then there would be no movement, and you could get really good results even with an absurdly low bitrate. However, this is what is sometimes called a "pathological case" and it doesn't represent the real world. And yes, if your video is noisy, that will make the quality differences show up even more.

    It seems like a lot of people in these forums are wanting to upscale their video. I realize that advanced software can do a better job deinterlacing, scaling, and denoising than what is built into some TV sets, but current-generation sets do these things pretty darned well, and I'll bet the differences are so small that 990 out of 1,000 people, including those in this forum, wouldn't be able to see the differences, and even then, only for a few frames.

    So, I'd encode at the native resolution, and not upscale. Just my 2-cents.
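
    To put a number on the "same size means same bitrate" point: the average bitrate follows directly from file size and duration, and resolution never enters into it; covering 1080p instead of 720p frames only changes how thinly that budget is spread. A quick sketch of the arithmetic, using the 3 GB / 60 min figures from the first post (the quoted numbers are rounded, so this only roughly matches the 6724 kbps figure given there):

    Code:
    # Average total bitrate implied by a fixed file size and duration.
    # Note that resolution appears nowhere in this calculation.
    size_bytes = 3 * 1000**3      # ~3 GB (decimal), as in the first post
    duration_s = 60 * 60          # 60 minutes

    avg_kbps = size_bytes * 8 / duration_s / 1000
    print(round(avg_kbps))        # ~6667 kbps for video + audio combined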
  7. Originally Posted by johnmeyer View Post
    I'm not sure I understand the question, because to me the answer seems obvious. If both files are the same size, then they must be the same bitrate. This is always true if they are encoded using constant bitrate, because that is the ONLY thing that determines size. So, if this is the case, the file that is upscaled to higher resolution will, for most normal video, have more artifacts because that same number of bits per second must "chase around" more pixels.

    If you instead use a "constant quality" encoder, you end up with the same conclusion, because the only way to get the same file size with a higher resolution version of the same video is to use a lower quality setting on the higher resolution version.

    Finally, jagabo, as usual, is correct that in extreme cases, you could get similar results. For instance, if you took video of a building using a camera mounted on a tripod, then there would be no movement, and you could get really good results even with an absurdly low bitrate. However, this is what is sometimes called a "pathological case" and it doesn't represent the real world. And yes, if your video is noisy, that will make the quality differences show up even more.

    It seems like a lot of people in these forums are wanting to upscale their video. I realize that advanced software can do a better job deinterlacing, scaling, and denoising than what is built into some TV sets, but current-generation sets do these things pretty darned well, and I'll bet the differences are so small that 990 out of 1,000 people, including those in this forum, wouldn't be able to see the differences, and even then, only for a few frames.

    So, I'd encode at the native resolution, and not upscale. Just my 2-cents.
    Well thanks for your valuable information!


    Sent from my iPhone 6+ using Tapatalk
  8. Originally Posted by johnmeyer View Post

    It seems like a lot of people in these forums are wanting to upscale their video. I realize that advanced software can do a better job deinterlacing, scaling, and denoising than what is built into some TV sets, but current-generation sets do these things pretty darned well, and I'll bet the differences are so small that 990 out of 1,000 people, including those in this forum, wouldn't be able to see the differences, and even then, only for a few frames.

    So, I'd encode at the native resolution, and not upscale. Just my 2-cents.


    I interpreted it as: the native resolution is 1080p30 from the camera, and he's deciding whether to downscale to 720p or keep 1080p for that ~6.7 Mb/s bitrate range. For some reason he needs a fixed 3 GB size target. I think the "upscaling" would only happen on playback, if he decided to downscale to 720p.

    By "RAW" he probably didn't mean actual camera raw; he probably meant the lossy compressed originals from the camera, something like AVCHD or another consumer format.

    6-7 Mb/s for 1080p is considered a lowish bitrate. Consider that typical consumer acquisition bitrates are 20-40 Mb/s for the lossy source. Even if you had a high quality source with low motion and perfect lighting, and used high-quality compression settings for the re-encode, it usually won't look very good compared to the original at that rate. You usually have to preprocess / selectively denoise to optimize 1080p for that bitrate range, as sketched below. So in those cases the question is more aptly reframed as: which won't look as "bad", 1080 or 720? And there is no general answer.

    Also, if you had a cheapish consumer camera, the original "1080" often resolves much less than 720 lines of actual detail to begin with. Very soft. There is no benefit to 1080 there; 720 will almost always look better in those cases at that bitrate. The point is, it really depends on source characteristics.
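
    As a concrete illustration of that kind of workflow, here is a sketch only (assuming ffmpeg with libx264; the filenames, the use of hqdn3d rather than a more selective denoiser, and the denoise strength are placeholder choices of mine, not settings recommended in this thread):

    Code:
    import subprocess

    # Light denoise + downscale to 720p, then a two-pass encode aimed at the
    # ~6.7 Mb/s video budget implied by a 3 GB / 60 min target.
    filters = "hqdn3d=2:1:2:3,scale=1280:720"
    common = ["-c:v", "libx264", "-b:v", "6700k", "-vf", filters]

    # Pass 1: analysis only, audio disabled, output discarded.
    subprocess.run(["ffmpeg", "-y", "-i", "source.mp4", *common,
                    "-pass", "1", "-an", "-f", "null", "-"], check=True)

    # Pass 2: the real encode, with the audio stream copied through.
    subprocess.run(["ffmpeg", "-y", "-i", "source.mp4", *common,
                    "-pass", "2", "-c:a", "copy", "out_720.mp4"], check=True)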
  9. Thanks for the explanation. I clearly did not fully understand what he was trying to do.


