VideoHelp Forum


  1. Hi

    I'm looking for a free and easy-to-use converter that can add black bars to videos to make them 16:9 aspect ratio (source videos are mainly 1920x800 MKV, x265/HEVC). My TV has Motionflow problems with other aspect ratios.
  2. Member (joined Mar 2008, United States)
    You can use VidCoder (or HandBrake). Here's an example where I added black bars top and bottom to a 720p video to increase the height to a full 16:9 aspect ratio. Just plug the desired width and height in under "Sizing" (e.g. 1920x1080), select "Fill" on the right, and it will calculate the values.
    [Attached thumbnail: resize.png, VidCoder "Sizing" settings]
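For reference, the arithmetic behind that "Fill" option is trivial; here's a minimal Python sketch (not VidCoder's actual code) that also emits the equivalent ffmpeg pad filter string:

```python
def pad_to_canvas(w, h, target_w=1920, target_h=1080):
    """Centre a w x h frame in a target canvas, like the "Fill" option does.

    Returns the equivalent ffmpeg pad filter string (for use as -vf "pad=...").
    """
    x = (target_w - w) // 2  # width of the left/right bars
    y = (target_h - h) // 2  # height of the top/bottom bars
    return f"pad={target_w}:{target_h}:{x}:{y}"

# The 1920x800 source from post 1 gets 140-pixel bars top and bottom:
print(pad_to_canvas(1920, 800))  # pad=1920:1080:0:140
```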
    Thank you so much! Could you please give me some advice: my source is a 10-bit 2160p x265 HEVC MKV. I tried to encode it the same way, but it took too long, so I made an x264 MP4 instead. The file is bigger (that's not a problem), but I'm not sure about the quality. What would be the best video settings (quality vs. time)? I don't want to lose too much.
  4. Member (joined Jul 2006, United States)
    Originally Posted by davexnet View Post
    You can use Vidcoder (or Handbrake). Here's an example where I added the black bars top and bottom for a 720p video to
    increase the height to full 16/9 aspect ratio. Just plug the desired width and height under "sizing" (eg. 1920x1080) , select
    "fill" on the right and it will calculate the value
    My version of Handbrake does not have the "Pad" checkbox. Did it get removed from stable builds?
  5. My version of Handbrake does not have the "Pad" checkbox. Did it get removed from stable builds?
    Been a long time, been a long time, been a lonely, lonely, lonely time since I used either of them, but it would seem the above screenshot, posted 1025 days ago (time flies, and flies don't fly any faster these days), was from VidCoder, not HandBrake.

    If the O.P. is still stuck in a Groundhog Day time loop waiting for an answer in order to move on to the next level: a CRF value of about 20 is generally considered nearly "transparent" to the source, i.e. quality loss should be visually negligible, but the file size will be quite large with a 2160p source. Then choose the slowest preset ("medium" / "slow" / "slower" / "veryslow"...) that is not considered too slow (this depends on the computer's power, and is quite subjective).

    It can be a good idea to run a few tests on a 1-minute sample (ideally representative of the whole thing: for instance, if it's a James Bond movie, the sample could be cut across a slow cosy talk scene and a fast action-packed scene, or a bed scene, which can be both at the same time), with various CRF values and various presets, then extrapolate the approximate total size and total encoding time for each combination, and decide on the best compromise.

    Usually, the slower presets result in smaller files at a similar quality, or better quality at a similar size, although I've read statements insisting that the "veryfast" preset is an outlier and tends to produce the smallest files with negligible quality loss, at least at low CRF values. The general consensus among trustworthy video experts is that it's rarely worth fiddling with the other settings, which are already optimized for each preset.

    A wise man once said :
    What's important to you? Speed, quality? or size? (pick two!)
    Said wise man also wisecracked :
    Your question (in real life terms) is the same as me saying: "I looked for a new girlfriend that is as close to my ex girlfriend as possible. I settled for the girl at my local bar but she didn't convince me!"
    And I tongue-in-cheeked :
    Well, was she more into speed and size, size and quality, or speed and quality ?
  6. Member (joined Jul 2006, United States)
    Originally Posted by abolibibelot View Post
    it would seem like the above screenshot posted 1025 days ago

    If the O.P. is still stuck in a Groundhog Day time loop waiting for an answer
    I revived the thread, not the OP. My question was just about the feature that allows padding a video with black bars, not speed or quality. And your answer was correct: it's VidCoder, not HandBrake. If you're wondering why I need the padding, it's to prevent YouTube from downgrading my uploads to 360p. If a video is even one line shorter than 480, YouTube no longer considers it 480p, so padding with black bars is a workaround to keep whatever quality the video had.
  7. Originally Posted by Knocks View Post
    If you wonder why I need the padding, it's to prevent YouTube from downgrading my uploads to 360p. If a video is even one line shorter than 480, YouTube no longer considers it 480p, so padding with black bars is a workaround to keep whatever quality the video had.
    Why not just resize it to 480p? So what if the width increases a little?
  8. Member dellsam34 (joined Jan 2016; member since 2005, re-joined in 2016)
    If YouTube is the issue, just resize to 1440x1080 if the original content is 4:3, or 1920x1080 if it is 16:9; YouTube will encode both as 1080p.
  9. Member azmoth (joined Jan 2010, Europe)
    I sometimes use WonderFox HD Converter (free version), but mostly, to add black bars, I've used Super Simple Video Converter, which also accepts HEVC as input. It doesn't crop anything out if the aspect ratio is already correct, and if it isn't (botched aspect), it will correct it.
  10. Member (joined Jul 2006, United States)
    Originally Posted by dellsam34 View Post
    If youtube is the issue just resize to 1440x1080 if the original content is 4:3 or 1920x1080 if the original content is 16:9, Youtube will encode both as 1080p.
    I hate uploaders who deceive viewers with fake HD uploads. Preventing YouTube from reducing your original resolution by adding minimal black bars is one thing, but trying to pass off shitty digital zoom as HD is not acceptable.
  11. Member dellsam34 (joined Jan 2016)
    Originally Posted by Knocks View Post
    I hate uploaders who deceive viewers with fake HD uploads. Preventing YouTube from reducing your original resolution by adding minimal black bars is one thing, but trying to pass off shitty digital zoom for HD is not acceptable.
    Is it acceptable to watch 360p at 0.6 Mbps? It's YouTube's fault for butchering videos below 720p, not the uploader's. If they didn't have such a rule, 480p would look decent, not like chicken shit, so take your complaint to YouTube. Besides, isn't that what your monitor/TV is doing anyway? Are you going to yell at your monitor for upscaling to 1080/2160?
  12. Member (joined Jul 2006, United States)
    You're making too many assumptions. On my PC, I watch videos at original resolution in windowed mode without zooming or maximizing to full screen. I also don't use full screen on YouTube if a video's resolution is less than my monitor's resolution. Watching videos pixel to pixel always looks better than any kind of digital zoom or upscaling (unless you're using the newer kind of AI-assisted upscalers, but that's off topic here).

    So yeah, two wrongs don't make a right. YouTube does reduce the bitrate and sometimes reduces the resolution in outlier cases, but that's absolutely not an excuse to upscale VHS videos to 1080p.
  13. Member Cornucopia (joined Oct 2001, Deep in the Heart of Texas)
    What dellsam34 is trying to say, and what you still aren't getting, is that if you do NOT upscale your 480p stuff, YT will re-encode it to truly poor quality. Whereas if you upscale it, say 2x, you control the scaling AND you prevent YT from re-encoding it in such a crappy way, instead forcing them to encode it to merely mediocre.

    I think that preventative correction of a mistake is the exact definition of two wrongs making a right.


    Scott
  14. What dellsam34 is trying to say, that you still aren't getting, is that if you do NOT upscale your 480p stuff, YT will re-encode to truly poor quality. Whereas if you upscale to, say 2x, you can control the scaling AND you can prevent YT from re-encoding in such crappy way, instead forcing them to encode it to merely mediocre.
    Is that a fact? I checked three videos I downloaded lately:
    – DBenRlPzb_s -f 22 (1280x720) => 1110kbps, Bits/(Pixel*Image) = 0.040
    – mbPu3aJXJDk -f 135+140 (640x480) => 864kbps, Bits/(Pixel*Image) = 0.113
    – TVNL8bBjrZ4 -f 18 (560x320) => 384kbps, Bits/(Pixel*Image) = 0.072
    0.113 > 0.072 > 0.040
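    For what it's worth, that metric is just bitrate divided by pixel throughput; a quick Python sketch reproducing the 640x480 figure above (the 25 fps frame rate is my assumption, the listing doesn't state it):

```python
def bits_per_pixel_frame(bitrate_bps, width, height, fps):
    # MediaInfo's Bits/(Pixel*Image): bits spent per pixel per frame
    return bitrate_bps / (width * height * fps)

# mbPu3aJXJDk: 864 kbps at 640x480, assuming 25 fps
print(bits_per_pixel_frame(864_000, 640, 480, 25))  # ≈ 0.11, in line with the 0.113 above
```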
  15. Member dellsam34 (joined Jan 2016)
    Here, tell me which one you want to watch, Both are my videos from the exact same crappy source (S-VHS recording):

    https://youtu.be/oOC71r5lBGk

    https://youtu.be/tbYhKvok5N4
    Last edited by dellsam34; 28th Dec 2020 at 18:26.
  16. Member (joined Jul 2006, United States)
    Originally Posted by Cornucopia View Post
    What dellsam34 is trying to say, that you still aren't getting, is that if you do NOT upscale your 480p stuff, YT will re-encode to truly poor quality. Whereas if you upscale to, say 2x, you can control the scaling AND you can prevent YT from re-encoding in such crappy way, instead forcing them to encode it to merely mediocre.
    Care to back up anything that you wrote?
  17. Member dellsam34 (joined Jan 2016)
    Why would he back up facts? Even 720p has now been added to the list of shitty compression, so enjoy the state-of-the-art SD compression:
    https://micky.com.au/youtube-reclassifies-720p-as-sd-quality-amid-4k-trend/
  18. Member dellsam34 (joined Jan 2016)
    Originally Posted by abolibibelot View Post
    Is that a fact ? I checked three videos I downloaded lately :
    – DBenRlPzb_s -f 22 (1280x720) => 1110kbps, Bits/(Pixel*Image) = 0.040
    – mbPu3aJXJDk -f 135+140 (640x480) => 864kbps, Bits/(Pixel*Image) = 0.113
    – TVNL8bBjrZ4 -f 18 (560x320) => 384kbps, Bits/(Pixel*Image) = 0.072
    0.113 > 0.072 > 0.040
    Those are all SD videos, what's your point?
  19. Member (joined Jul 2006, United States)
    Originally Posted by dellsam34 View Post
    Why would he back up facts? Even 720p now has been added to the list of shitty compression, So enjoy the sate of the art SD compression:
    https://micky.com.au/youtube-reclassifies-720p-as-sd-quality-amid-4k-trend/
    Oh OK. For a minute I actually thought that maybe you knew what you were talking about. But then you quoted an article that's literally just about a change in labeling and has zero discussion about quality, compression, bitrate or anything else that would be on the topic of this discussion.
  20. Member dellsam34 (joined Jan 2016)
    Originally Posted by Knocks View Post
    Oh OK. For a minute I actually thought that maybe you knew what you were talking about. But then you quoted an article that's literally just about a change in labeling and has zero discussion about quality, compression, bitrate or anything else that would be on the topic of this discussion.
    If you don't believe that YouTube applies different compression schemes to SD, HD and UHD, just enlighten yourself by googling it, or ask the coding experts here; they will clue you up.
  21. Video Restorer lordsmurf (joined Jun 2003, dFAQ.us/lordsmurf)
    Originally Posted by dellsam34 View Post
    Here, tell me which one you want to watch, Both are my videos from the exact same crappy source (S-VHS recording):
    https://youtu.be/oOC71r5lBGk
    https://youtu.be/tbYhKvok5N4
    My eyes!
    That 2016 has bad AR, and interlace in progressive. Yuck.
  22. @dellsam34
    Those are all SD videos, what's your point?
    Well, 1) "SD" is just a meaningless buzzword; YouTube actually decided to change its meaning overnight. 2) The point expressed above (and which I've seen expressed here previously, in particular in reply to a question I had asked some time ago) was: videos with a resolution lower than "720p" (which is not even very clear, as, for instance, a 1280x528 video in 2.40:1 AR is generally labeled "720p"; when such content gets uploaded to YouTube, is it indeed considered "720p" and left unchanged, or downscaled to 1152x480?) should be upscaled to "720p" to improve their quality (or not degrade it as much), because presumably YouTube's encoder allocates proportionally more bitrate to "720p" and higher resolutions than to "480p" and below. I was merely challenging that assertion.

    I can add one in bona fide "HD", whatever that means, if it makes the point more to the point:
    7bgndW7enwE -f 137+140 (1920x1080) => 2282kbps, Bits/(Pixel*Image) = 0.037

    Another test, with a video I downloaded both in "720p" and "1080p" (warning: not for the faint-hearted! that's why you shouldn't run with a refrigerator on your back...):
    AZmdw9_vAX8 -f 22 (1280x720) => 1551kbps, Bits/(Pixel*Image) = 0.067
    AZmdw9_vAX8 -f 137+140 (1920x1080) => 2757kbps, Bits/(Pixel*Image) = 0.053
    0.053 < 0.067

    Side question: is this "Bits/(Pixel*Image)" metric provided by MediaInfo relevant at all?
    If not, what would be a better metric to assess the "proportional" bitrate allocation of videos at different resolutions?


    @Knocks
    Oh OK. For a minute I actually thought that maybe you knew what you were talking about. But then you quoted an article that's literally just about a change in labeling and has zero discussion about quality, compression, bitrate or anything else that would be on the topic of this discussion.
    Indeed, that was beside the point; thank you for pointing it out.
    Now the point has left the room I'm afraid and all that's left is the elephant.
  23. Bitrate for a given resolution is more meaningful than Bits/(Pixel*Image), which is only really useful when comparing similar non-16:9 resolutions or SD resolutions. At the time this was added to MediaInfo, exotic aspect ratios and resolutions were common.

    It's preferable to stick to bitrate when comparing 720p vs 1080p. The standard bitrate-vs-resolution curve is available for various codecs/encoding settings. A constant Bits/(Pixel*Image) at different resolutions doesn't mean the same quality.
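    The idea that constant bits per pixel does not mean constant quality is often captured with a sub-linear scaling rule; a Python sketch (the 0.75 exponent is a common rule-of-thumb assumption, not a value from this thread):

```python
def scaled_bitrate(base_kbps, base_pixels, target_pixels, exponent=0.75):
    # Sub-linear scaling: multiplying the pixel count does not require
    # multiplying the bitrate by the same factor for comparable quality.
    return base_kbps * (target_pixels / base_pixels) ** exponent

# 720p at 2000 kbps -> suggested 1080p bitrate (2.25x the pixels)
kbps_1080 = scaled_bitrate(2000, 1280 * 720, 1920 * 1080)
print(round(kbps_1080))  # ≈ 3700 kbps, well under the linear 4500
```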
  24. Member dellsam34 (joined Jan 2016)
    Originally Posted by abolibibelot View Post
    Side question would be : is this “bits/(pixel*image)” metric provided by MediaInfo relevant at all ?
    If not, what would be a better metric to appreciate the “proportional” bitrate allocation of videos in different resolutions ?
    No, not relevant at all; it's like measuring oil in linear feet. The compression artifacts are more visible as motion blur on fast-moving objects, transitions and dark scenes are more blocky, and sharp edges and lines are crushed by the heavier compression. No one is claiming upscaled 480 is better than 480 itself; it's just that YouTube decided to work that way. Again, just look at the examples I posted, the difference is clear (besides the first example, which doesn't have the aspect ratio flag).
  25. Member dellsam34 (joined Jan 2016)
    Originally Posted by lordsmurf View Post
    Originally Posted by dellsam34 View Post
    Here, tell me which one you want to watch, Both are my videos from the exact same crappy source (S-VHS recording):
    https://youtu.be/oOC71r5lBGk
    https://youtu.be/tbYhKvok5N4
    My eyes!
    That 2016 has bad AR, and interlace in progressive. Yuck.
    The 2016 sample does not have the aspect ratio flag, and the second sample is de-interlaced using QTGMC and upscaled to 1440x1080 square pixels. It just shows the difference between feeding a shitty video to YouTube and letting them handle the de-interlacing and compression, versus properly de-interlacing it yourself, setting the aspect ratio, and upscaling it to avoid the aggressive compression they force on 480-line files.

    The whole point of my first post was to show the OP how to properly set the aspect ratio without needing to add black bars in a widescreen environment, but it got sidetracked by an irrelevant discussion of upscaling and YouTube compression algorithms.
  26. Just for info: adding black bars (padding) can also be done with clever FFmpeg-GUI.
  27. @dellsam34
    The whole point of my first post is to show the OP how to properly set the aspect ratio without needing to add black bars in a widescreen environment but it got side tracked by an irrelevant discussion of upscaling and youtube compression algorithms.
    Well, the O.P. was last active on 8th May 2018... é_è
    Or do you mean the one who goes by the nickname Knocks?

    The 2016 sample does not have the aspect ratio flag and the 2nd sample is de-interlaced using QTGMC and upscaled to 1440x1080 square pixel, It just shows that when you feed a shitty video to youtube and let them handle the de-interlacing and compression vs when you properly de-interlace it yourself, set the aspect ratio and upscale it to avoid their aggressive compression forced on 480 files.
    If you applied a completely different pre-processing it's hardly a valid test (even if you happen to be right).

    No, not relevant at all, It's like measuring oil in linear foot, The compression artifacts are more visible in fast objects as motion blur, transitions and dark scenes are more blocky, sharp edges and lines are crushed due to higher compression, No one is claiming upscaled 480 is better than 480 itself, it's just youtube decided to work that way, Again just look at the examples I posted the difference is clear (besides the first example that doesn't have the aspect ratio flag).
    Not sure I understand the first part. You mean, in those example videos you mentioned? I didn't download them to check their bitrate, but anyway, as noted above, you didn't upload strictly identical videos to begin with (there are more differences than merely the resolution, and there are other potential explanations for the quality discrepancy than a different compression "aggressiveness" for the lower-resolution one), so not much can be concluded from that. It's like saying that the elephant in the room for improvement is bigger than the elephant in the oil-producing snake, who is not the same elephant, is not even a relative of said elephant, and obviously didn't get "uploaded" into the snake the same way, because it takes a ladder to move an elephant into the room in the first place, and "Ladders give / Snakes take" (AC/DC, "Sin City"). (It's a 1034-day-old thread, so making sense is optional.)

    @butterw
    Bitrate for a given resolution is more meaningful than Bits/(Pixel*Image), which is only really useful when comparing similar non-16/9 resolutions or SD resolutions. At the time when this was added to mediainfo exotic AR/resolutions were common.
    It's preferable to stick to bitrate when comparing 720p vs 1080p. The standard bitrate vs resolution curve is available for various codecs/encoding settings. Constant Bits/(Pixel*Image) at different resolutions doesn't mean same quality.
    Alright then. So what would be the expected bitrate ratio between, say, 640x360, 854x480, 1280x720 and 1920x1080 to get a similar quality from the exact same source footage? Is it a fact that higher resolutions require less bitrate per pixel to achieve the same level of quality preservation?
    I just checked two videos I encoded with ffmpeg libx264 -crf 18, one at the native 1920x1080, the other downscaled to 960x540 with "-vf "scale=iw/2:ih/2"":
    – 1920x1080 => 12.4mbps, Bits/(Pixel*Image) = 0.119
    – 960x540 => 1981kbps, Bits/(Pixel*Image) = 0.076
    So here it's the other way around: the higher-resolution version got proportionally more bitrate.
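    The comparison can be made without MediaInfo's metric at all: at the same CRF, the bitrate ratio here exceeds the pixel ratio, which is exactly what "proportionally more bitrate" means (a minimal check using the figures above):

```python
# Figures from the two -crf 18 encodes above
ratio_bitrate = 12_400 / 1_981               # 12.4 Mbps vs 1981 kbps, ~6.3x
ratio_pixels = (1920 * 1080) / (960 * 540)   # 4x the pixels
print(ratio_bitrate > ratio_pixels)  # True: 1080p got more bits per pixel
```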
  28. At higher output resolutions, you can get away with compressing more while maintaining subjective quality. Ex: 1080p has 2.25x the number of pixels of 720p, but encoding at 2x bitrate for instance should be OK. The effect will be more significant between 1080p and 4K (4x the pixels).

    I don't know if this applies to lossless / near lossless. In your example, the downscaling may have removed noise/grain making the content more compressible. It does seem like a big difference for the same source though.
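    A rule like "2.25x the pixels, ~2x the bitrate" implies bitrate scaling with pixel count raised to a power below 1; the implied exponent is just arithmetic on the numbers in the post above:

```python
import math

# "2.25x the pixels, ~2x the bitrate" implies bitrate ~ pixels**k with:
k = math.log(2) / math.log(2.25)
print(round(k, 3))  # ≈ 0.855, i.e. sub-linear in pixel count
```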
  29. Member dellsam34 (joined Jan 2016)
    Originally Posted by abolibibelot View Post
    I just checked two videos I encoded with ffmpeg libx264 -crf 18, one in native 1920x1080, the other downscaled to 960x540 with “-vf "scale=iw/2:ih/2"” :
    – 1920x1080 => 12.4mbps, Bits/(Pixel*Image) = 0.119
    – 960x540 => 1981kbps, Bits/(Pixel*Image) = 0.076
    So here it's the other way around, the higher resolution version got proportionally more bitrate.
    We're talking about YouTube's encoding and compression, not the offline stuff you do on your computer. I will never throw away my 480i masters captured from tape, but for YouTube I'm forced to de-interlace and upscale to 1080p to get a decent visual experience for the viewers. You don't have to take my word for it, but that's just how YouTube works, and that's how I was able to get around their quality grading.
    The numbers you're posting are irrelevant; it's just like saying MPEG-2 has more bitrate per pixel than H.264, so how come H.264 looks better? Well, the algorithms are different.
  30. @butterw
    At higher output resolutions, you can get away with compressing more while maintaining subjective quality. Ex: 1080p has 2.25x the number of pixels of 720p, but encoding at 2x bitrate for instance should be OK. The effect will be more significant between 1080p and 4K (4x the pixels).
    I don't know if this applies to lossless / near lossless. In your example, the downscaling may have removed noise/grain making the content more compressible. It does seem like a big difference for the same source though.
    Don't know either. The difference in this case is that the rendered video (the source for both encodes) was indeed 1920x1080, so some detail was inevitably lost when downscaling to 960x540. Whereas, starting from a 960x540 source and encoding one version at 960x540 and another at 1920x1080 (with the exact same settings again), the upscaled version couldn't possibly have more detail; I wonder how it would affect the resulting bitrates in that case. I could run the test, but I'm probably not bored enough to do it right now, or don't care enough about the outcome.

    @dellsam
    We're talking about youtube encoding and compression not off line stuff that you do on your computer
    Here I was replying to another consideration, distinct although adjacent: what should the expected bitrate ratio be, when encoding the same video at different resolutions, to effectively get a similar quality? Since x264 at the same CRF setting (and with all other settings identical as well) is meant to produce a constant level of quality, comparing the bitrates of its output for the same video at different resolutions should provide a good answer to that question.
    Then, going back to the previous subject: if the consensus is that YouTube doesn't allocate enough bitrate to lower-resolution videos, how much more would it have to allocate for those videos to look as good as their upscaled counterparts? Or is there more to it than just bitrate?

    The numbers you're posting are irrelevant, just like saying MPEG-2 has more bitrate per pixel than h.264 but how come h.264 look better? Well the algorithms are different.
    Those numbers were obtained with the same encoder at the exact same settings, so this analogy is somewhat disingenuous.
    Last edited by abolibibelot; 29th Dec 2020 at 14:24.


