So I have been encoding for a little over a year now, and it has come to my attention that I might have been overlooking something. Just so you know, I use MeGUI and x264 for all my videos. They are almost all 23.976 fps, and I use my own custom settings that include 5 reference frames (this will be important later when talking about encoding levels).
It was explained to me a while back that it has become the norm to encode 1920x1080 Blu-ray movies to 1920x800 without any real loss, because when you crop out the black bars from the top and bottom of the screen, you are simply left with the video itself, which is essentially 1920x800. So I have been following this method with all of my 1080p movies.
I have NOT noticed any problems with "stretching" in any of my movies, but that is most likely thanks to my TV, which appears to automatically correct the picture so it fits my screen with no stretching effect.
However, when I view the encoded 1920x800 movies on my computer (simply to TEST them after encoding, NOT to actually watch them), I SOMETIMES notice the "stretching" effect. I typically just ignored it, because it still looked perfect on my TV, which is where I always watch my encodes anyway. Considering this, I could probably just continue doing exactly what I am doing and never experience any problems, thanks to my TV.
BUT, if I ever choose to actually watch my encoded movies on my computer, a tablet, a phone, or even another TV, that "stretching effect" MIGHT present a problem for me. I don't know.
So I looked into the issue and realized that I should probably start paying attention to each movie's actual aspect ratio and determine what each movie's dimensions should be individually, as opposed to simply encoding them all to 1920x800.
I think part of my problem was that I assumed all black bars were created equal. Meaning, I thought every Blu-ray movie was actually 1920x800 in size and the black bars at the top and bottom of the screen were always 1920x140 each (top bar 1920x140 + movie 1920x800 + bottom bar 1920x140 = 1920x1080). This was the mistake I came to believe, based on the idea that it was the "norm" to encode movies to 1920x800.
From the research I have done, it looks like these black bars vary in height based on the movie's actual aspect ratio, and if I want to encode my movies properly then I need to take that into account.
For example:
If a movie is 1920x1080 with an aspect ratio of 2.35:1, then the actual picture should be between 1920x816 and 1920x818, so I can probably get away with encoding it to 1920x800, because that is VERY close to the actual dimensions and any "stretching effect" would be so small you could barely even notice it. Is that right?
However in this example it is completely different:
A movie is 1920x1080 with an aspect ratio of 1.85:1. Using an aspect ratio calculator, I can determine that the actual picture should be somewhere between 1920x1036 and 1920x1040, and the rest is black bars. So I assume I should encode my movie to those dimensions instead, because 1920x800 would cause that "stretching effect". Is that right?
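For reference, the arithmetic above can be sketched in a few lines (a rough estimate only; the mod-8 rounding is an assumption on my part, and the crop should always be verified visually):

```python
# Rough sketch: given a 1920-wide Blu-ray frame and the film's aspect
# ratio, estimate the active-picture height and snap it to a mod-8
# value that encoders are happy with. The ratios below are the ones
# mentioned in this thread; treat the results as starting points.
def crop_height(width, aspect_ratio, mod=8):
    exact = width / aspect_ratio          # e.g. 1920 / 2.35 = 817.0
    return int(round(exact / mod)) * mod  # snap to nearest multiple of mod

for ar in (2.35, 1.85, 1.78):
    print(ar, crop_height(1920, ar))
```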
If so, then I can easily do that. However, it also presents me with a NEW problem. The settings I use to encode include 5 reference frames (which I am VERY happy with and don't want to change). With 5 ref frames and dimensions bigger than 1920x800, an x264 encode ends up at Level 5.0 or higher. With the exception of computers, most electronic devices (including the TV I have) will NOT play a movie encoded at a level above 4.2.
Now, I do have another piece of software that can manually change the level flag of an encoded video, and I have used it many times to trick my TV into thinking a Level 5.0+ video was really Level 4.1. It works on my TV, but there is no guarantee it will trick ALL the devices I may eventually choose to play my videos on, especially since the software obviously does not really change the level; it just flags the file as a lower level than it really was encoded at.
So that is another big reason I am still encoding at 1920x800 with 5 ref frames: pretty much all my movies end up at Level 4.0, which is fine for me, and again, they all still look perfect when played on my TV.
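A quick sanity check on that level jump, sketched in Python: each H.264 level caps the decoded picture buffer in macroblocks (MaxDpbMbs, values taken from the level table in the H.264 spec), and that cap limits how many reference frames fit at a given frame size.

```python
# Why 5 reference frames push a taller encode past Level 4.1/4.2:
# each H.264 level caps the decoded picture buffer in macroblocks
# (MaxDpbMbs). Max ref frames = MaxDpbMbs / (frame size in 16x16 MBs),
# capped at 16. MaxDpbMbs values are from the H.264 spec's level table.
MAX_DPB_MBS = {"4.0": 32768, "4.1": 32768, "4.2": 34816, "5.0": 110400}

def max_refs(width, height, level):
    mbs = -(-width // 16) * -(-height // 16)  # ceil to whole macroblocks
    return min(MAX_DPB_MBS[level] // mbs, 16)

print(max_refs(1920, 800, "4.1"))   # 5 refs fit at 1920x800
print(max_refs(1920, 1040, "4.1"))  # only 4 refs fit at 1920x1040
```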
On one hand:
- My 1920x800 movies are all at Level 4.0, so they should be compatible with nearly all devices.
- My 1920x800 movies look perfect when played on my personal TV, which is the only place I currently watch them.
- Using software to flag Level 5.0+ videos as Level 4.1 might NOT trick ALL devices into playing them.
On the other hand:
- I should probably be encoding videos to their proper aspect ratio anyway.
- Not all devices will automatically correct the picture like my TV does, so some of my 1920x800 movies might look stretched when played on another device.
- The software I use to flag videos as a lower level DOES seem to work fine on my TV.
- I could always lower the number of reference frames I use in order to keep a lower level (but I REALLY DON'T WANT TO).
- If my TV ever breaks or gets replaced, again, my next one might not do as good a job of correcting the picture for me.
- If I switch to encoding based on aspect ratio, I will most likely feel obligated (to myself) to go back and re-encode every movie I have done over the last year, which would be a LOT of work.
So ultimately, I would FIRST like to confirm that my research above regarding encoding by aspect ratio is CORRECT. Secondly, I would like to hear some opinions on whether I should continue to encode my movies at 1920x800 or start encoding them based on their individual aspect ratios instead, given the circumstances I mentioned.
Sorry for the LONG post, and THANK YOU in advance for reading it all and hopefully responding.
-
Last edited by manofsteel31; 15th Mar 2015 at 05:29.
-
Your main error is to put any value in the ratio between width and height of the "visibly interesting" part of the video content. That ratio has the least meaning among all the aspect ratios one could quote for a video.
The most important "aspect ratio" is the answer to this question: is the encoded material anamorphic (i.e., does the player need to deskew the decoded content at all)? HD video is either not anamorphic at all (sample aspect ratio = 1:1, most usually) or squeezed in its width by 4:3 (usually only for material recorded by some AVCHD cameras). For DVD-Video there is a similarly small set of standardized ratios; DVD content will always be anamorphic, more or less obviously.
The second important aspect ratio is that of the deskewed video in the player. All commercially produced videos for home entertainment fit in rectangles with a Display Aspect Ratio of either 16:9 or 4:3; the "visually interesting" part may have to be padded with borders to fill up this area.
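As a minimal illustration of how the two ratios relate, the DAR is just the SAR times the stored frame's width:height (the 64:45 SAR used below is the common 16:9 PAL DVD value):

```python
from fractions import Fraction

# DAR = SAR x (stored width / stored height). A player "deskews" by
# scaling the stored frame until its pixels are effectively square.
def display_aspect(width, height, sar=Fraction(1, 1)):
    return sar * Fraction(width, height)

# Blu-ray: square pixels (SAR 1:1), nothing to deskew.
print(display_aspect(1920, 1080))                  # -> 16/9
# 16:9 PAL DVD: anamorphic, SAR 64:45.
print(display_aspect(720, 576, Fraction(64, 45)))  # -> 16/9
```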
After you crop the borders off the "visually interesting" content, you have lost any reliable standard. No matter which aspect ratio you read on the cover of your media, don't count on it; it will most probably not be exact enough to calculate with, and sometimes it may even be plain wrong.
On a Blu-ray disc, the video will not be anamorphic but will have a SAR of 1:1. Never try to stretch or squeeze it. Whether or not you crop it, to a multiple of 8 or 16, is your decision. But avoid deskewing where no deskewing is necessary.
Last edited by LigH.de; 15th Mar 2015 at 05:39.
-
Just set the player to keep the aspect ratio, i.e., not to stretch.
Not all movies are 2.35:1. Some are 1.85:1, 1.78:1, 1.37:1, and others. If you crop all of those to 1920x800 you will be cutting off part of the picture.
Encode with fewer reference frames if the result is going to exceed the Blu-ray profile@level spec.
You should avoid doing this [flagging a lower level] because, as you suspect, it only changes the profile@level values in the header. The video will still have 5 reference frames, and some players will choke when they hit that. Just set x264 to encode with 4 reference frames (--ref 4). Also check for other settings that might exceed Blu-ray specs: max bitrate, max consecutive B-frames, vbv-maxrate, vbv-bufsize, etc.
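A minimal sketch of the relevant x264 switches (illustrative values only, not a vetted Blu-ray preset; the file names are placeholders):

```shell
# Illustrative only: cap the reference frames and declare Level 4.1
# honestly, with VBV limits in the Level 4.1 ballpark.
# movie_in.y4m / movie_out.264 are placeholder names.
x264 --ref 4 --level 4.1 \
     --vbv-maxrate 40000 --vbv-bufsize 30000 \
     --output movie_out.264 movie_in.y4m
```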
The good news is that most modern devices aren't limited to the Blu-ray spec, so slightly exceeding it is rarely a problem.
Last edited by jagabo; 15th Mar 2015 at 06:43.
-
But if I read you right, you want the best of both worlds: 1. to keep the AR (no bars), and 2. to play in full screen (no stretching or cropping). That, put simply, cannot happen.
If your TV is displaying that 1920x816 video at the correct AR with no stretching or bars, then it must be zooming in, so you lose image at the left and right.
The same goes for an encode kept at 1920x1080: unless that is the exact AR of the video, there should be some bars or, again, the image is cropped.
-
Why not just encode everything at the HD standard 1920x1080, knowing that the black bars eat up essentially no bitrate and the video will display properly on all devices?
-
As was said, movies do not all have the same aspect ratio, so the height of those black bars is not the same for every movie.
1. If you do not know what to do, just leave it at 1920x1080; that is the safe bet. Advantage: subtitles mostly sit in the black area, not in the picture (VobSub or PGS, that is; SRT subtitles are rendered at the bottom of the display regardless), and you can play these on a Sony PS4, etc. Apart from that, from my point of view only negatives are left.
2. To get a square-pixel encode without black bars, ready for any device's screen shape, you would need some kind of autocrop function in the encoding software. But that function would have to be reasonably smart, for example not cropping black bars smaller than 10 pixels or so, so that movie resolutions do not get messed up. Does Handbrake have a feature like that? Perhaps not.
3. The advanced approach is cropping manually, using Avisynth for example. Handbrake cannot load an Avisynth script, so you would have to abandon Handbrake for this. In Avisynth you can see exactly what to crop, pixel-accurate, and then crop one pixel more to get an even number of pixels, or a few more using a mod of your choice to get it ready for encoding: mod 4, mod 8, even mod 16 (meaning the height divides by that number with no fraction). With mod 16, though, you could end up cropping quite a lot of live picture.
Or you could still use Handbrake, feeding it the autocrop values you got from the Avisynth script. If that is too advanced (and it quite is), maybe just use your PC screen: its height is 1080, so measure the height of the active picture, and the rest is simple math to get the approximate number of pixels to keep. But that requires sorting out the stretching issue you seem to have; you would need to play your videos proportionally correct first.
Sometimes there can be as little as 2 pixels of black on the sides, and an automated autocrop program or script will cut the sides of the picture too, leaving you with a 1916 horizontal resolution. So side-cropping should be disabled somehow, or a minimum limit in pixels set for cropping; it has to be scripted to deal with stuff like that. At least that is what I had to do to get safe results, using batch scripts with Autocrop.dll and Avisynth when encoding Blu-rays.
Last edited by _Al_; 15th Mar 2015 at 10:05.
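The guard rails described above (ignore bars below a threshold, then round down to a mod) can be sketched like this; it is a hypothetical helper illustrating the rules, not the actual Autocrop.dll API:

```python
# Sanitize raw autocrop values: ignore bars under a threshold (so a
# stray 2-pixel edge does not shave the picture) and round the rest
# DOWN to a multiple of `mod` (never crop into the live picture).
# Hypothetical helper, not any real autocrop tool's API.
def sanitize_crop(left, top, right, bottom, mod=4, min_bar=10):
    def fix(v):
        if v < min_bar:          # too small to be a real black bar
            return 0
        return (v // mod) * mod  # round down to the nearest mod
    return tuple(fix(v) for v in (left, top, right, bottom))

print(sanitize_crop(2, 140, 2, 140))  # -> (0, 140, 0, 140)
print(sanitize_crop(0, 131, 0, 133))  # -> (0, 128, 0, 132)
```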
-
The process is really simple if you want to cut the bars:
1. Convert your source to a non-anamorphic one (i.e., PAR = 1:1, square pixels)
2. Cut!
This always works and keeps things simple!
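Step 1 (de-anamorphing) amounts to resizing the stored frame by its pixel aspect ratio so the pixels become square; a minimal sketch, assuming the common 16:9 PAL DVD PAR of 64:45:

```python
from fractions import Fraction

# De-anamorph: scale the stored width by the pixel aspect ratio so
# pixels become square (PAR 1:1); after that, cropping cannot skew
# anything. 64:45 is the common 16:9 PAL DVD pixel aspect ratio.
def square_pixel_size(width, height, par):
    new_w = round(width * par)        # expand the squeezed pixels...
    return (new_w // 2) * 2, height   # ...keep width even for encoders

print(square_pixel_size(720, 576, Fraction(64, 45)))  # -> (1024, 576)
```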
However, doing the conversion properly is critical.
For anamorphic widescreen, never, ever cut vertical resolution; this is the common mistake that people who don't know what they are doing make all the time.
For other PAR/DAR flavors, always expand the squeezed pixels; never destroy resolution!
Unfortunately, unlike DVDs, Blu-rays do not come in anamorphic widescreen flavors (ask the brilliant engineers why).
Last edited by newpball; 15th Mar 2015 at 10:36.
-
And how do you propose to get graphics on screen, always positioned as intended? If you do not box everything (video, graphics, subtitles), you open a Pandora's box of folks seeing weird stuff on their screens, at the mercy of their player. Things are not that simple. Have you ever done a DVD with subtitles or graphics? That is the whole point, I think: things are seen as intended, no compromises; they have got to make sure everyone sees the same thing.
-
Actually, that whole cropping business should obviously be handled at the codec metadata level. I don't know about others, but I find it idiotic that we have to re-encode a source if we only want to do simple operations like cropping.
A good example of how, in my opinion, these things should be handled is what Adobe Camera Raw and Adobe Lightroom do for images. The source never gets updated; operations on the source are merely recorded, then executed in real time at render time. Cropping, color correction, gamma shifts, rotation, overlays: all of those could be defined and adjusted at render time, with no need for re-encoding.
I know lateral thinking is a danger to red-tape-loving engineers.
Last edited by newpball; 15th Mar 2015 at 14:50.
-
And at what level will subtitles be handled, then? Are you aware that they are part of the picture? They are images; you would need to handle them as well...
Subtitle size perhaps becomes a challenge now. Imagine 4K material watched on a really big screen while sitting really close (that is supposed to be the point of 4K, isn't it?); one would need to make the subtitles a bit smaller. So would they come up with a new standard where subtitles and video are handled independently? Cropping and letterboxing could be on their way out. But I doubt that is going to change...
Anyway, no concept is going to be changed because of a possible ripping problem; it would have to be something else.
-
It's not that much: playing a video is not that compute-intensive, and most of the time is actually spent on decoding. A lot of players can already zoom and change luma and color information in real time, and overlays they obviously already do, with subtitling. Using active metadata, the whole process would simply be formalized.
If you apply a few filters in your favorite NLE and preview the video, you are doing exactly the same thing.
Are you telling me that a modest computer could not handle a luma/chroma and crop filter in real time?
Last edited by newpball; 15th Mar 2015 at 17:07.
-
The MKV spec includes cropping elements.
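For example, those crop elements can be written into an existing Matroska file's track headers without re-encoding (whether a given player honors them is another matter; the file name is a placeholder):

```shell
# Set container-level crop flags on video track 1 of an existing MKV.
# No re-encode happens: players that honor these Matroska elements
# crop at render time. movie.mkv is a placeholder name.
mkvpropedit movie.mkv --edit track:v1 \
    --set pixel-crop-top=140 --set pixel-crop-bottom=140
```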
Now if only you could reduce the file size at render time........
The way you waffle on about engineers in every post really makes it seem like you're not playing with a full deck.
-