If I'm re-encoding from 1080p h.264 to 480p h.264, that's lossy to lossy, right? So I'm taking a greater hit than just loss of pixels/resolution, correct?
How much data am I generally losing when I encode to 1080p h.264 and is it sufficient to make a considerable difference going down to 480p?
What about 4K? Is taking lossy 4K to 480p going to involve much of a hit in quality?
Any time you reduce an image to roughly 1/5th (from 1080p) or 1/20th (from 4K) of its original pixel count you can expect to lose some quality. If you're watching on a phone you may not see much of the difference.
As was pointed out, the resolution and size of the display you're viewing the video on will make a difference. Watching a sharp 2160p video on a tiny 480p screen may not look any different than watching a 480p version of the video on that screen. But watching that 2160p video on a big 2160p screen will be much sharper than watching the 480p video.
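To put numbers on the resolution drop being discussed, here's a quick sketch comparing pixel counts (854×480 is assumed for "480p"; actual widths vary by source):

```python
# Common 16:9 resolutions; the 480p width is an assumption, since
# real sources use 854, 720, or other widths.
RESOLUTIONS = {
    "480p": (854, 480),
    "1080p": (1920, 1080),
    "2160p": (3840, 2160),
}

def pixel_count(name):
    """Total pixels per frame for a named resolution."""
    w, h = RESOLUTIONS[name]
    return w * h

for name in ("1080p", "2160p"):
    ratio = pixel_count(name) / pixel_count("480p")
    print(f"480p keeps about 1/{round(ratio)} of the pixels of {name}")
```

So going to 480p throws away about 4/5ths of a 1080p frame and about 19/20ths of a 4K frame, before any encoder loss is even considered.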
If you are playing on a device with a native 480p display, just make sure the bitrate is high enough; scenes with more action need a higher bitrate. Just don't expect good results when playing it back on HD devices.
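As a very rough rule of thumb (my assumption, not a standard), you can scale bitrate with pixel count from a known-good anchor; the 8 Mbps anchor for 1080p and the 1.5× motion padding below are both hypothetical starting values to tune by eye:

```python
# Hypothetical rule of thumb: bitrate proportional to pixel count,
# anchored at an assumed ~8 Mbps for 1080p, padded for high-motion content.
ANCHOR_PIXELS = 1920 * 1080
ANCHOR_KBPS = 8000  # assumed anchor, tune to taste

def suggested_kbps(width, height, high_motion=False):
    """Scale the anchor bitrate by the pixel-count ratio."""
    kbps = ANCHOR_KBPS * (width * height) / ANCHOR_PIXELS
    if high_motion:
        kbps *= 1.5  # arbitrary padding for action-heavy scenes
    return round(kbps)

print(suggested_kbps(854, 480))                    # calm content
print(suggested_kbps(854, 480, high_motion=True))  # action-heavy content
```

In practice a quality-targeted mode (like CRF in x264) usually beats picking a fixed bitrate by hand, but the sketch shows why 480p needs far less bitrate than 1080p for comparable quality.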
One good thing, though.
Assuming you viewed each NATIVELY (no scaling), lossy errors/artifacts and blockiness are minimized, because the first-generation encoder error is averaged across roughly 5 source pixels per output pixel (the square of the downscale factor: (1080/480)² ≈ 5). Not as good as a direct downscale & encode from a lossless source, but the next best thing.
Of course, who's going to be viewing the SD natively?
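That averaging effect can be illustrated with a toy example; the error values below are made up, and real encoder error is not this well behaved, but the principle is that averaging errors of mixed sign shrinks their net magnitude:

```python
# Toy illustration: a downscale roughly averages groups of source pixels.
# 1080p -> 480p maps about (1080/480)**2 ≈ 5 source pixels per output pixel.
noise = [+1.0, -1.0, +1.0, -1.0, +1.0]  # hypothetical 1st-gen error values
averaged = sum(noise) / len(noise)
print(abs(averaged))  # much smaller than any individual error above
```

When the errors are correlated (as blocking artifacts often are), the benefit is smaller, which is why this is only "the next best thing" to encoding from a lossless source.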