I'm with you 100%; I'm already on record as saying I don't believe in re-encoding a BD to a lower bit rate, much less a lower resolution.
There is one caveat: the VP9 codec. It features a technology called "spatial resampling", where the encoder compresses a lower-resolution version of the frame, which the decoder then upscales to the correct presentation resolution; it's designed to improve quality at low bit rates. I have done test encodes with and without it enabled, and it does seem to work, and quite well at that.
Think of it this way: assume you have a 2560x1440 source, i.e. a 2.5K source. You have three choices: you can use 8000 kb/s to create a 2048x1080 master, 8000 kb/s to create a 1920x1080 master, or 8000 kb/s to create a 1440x1080 master. Which one do you choose?
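For what it's worth, the trade-off can be made concrete by looking at the average bits available per pixel in each case. A quick sketch (24 fps is my assumption, not stated above):

```python
# Rough bits-per-pixel comparison for the three candidate masters at 8000 kb/s.
# The 24 fps frame rate is a stand-in value, not from the thread.

def bits_per_pixel(width, height, kbps, fps=24.0):
    """Average bits spent per pixel per frame at a given bitrate."""
    return kbps * 1000 / (width * height * fps)

for w, h in [(2048, 1080), (1920, 1080), (1440, 1080)]:
    print(f"{w}x{h}: {bits_per_pixel(w, h, 8000):.3f} bits/pixel")
```

The smaller the master, the more bits each remaining pixel gets, which is the whole argument for downscaling at a fixed bitrate.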
Results 61 to 79 of 79
You are quite wrong on this point. If you were to use the exact same algorithm to upscale and downscale, then yes, in theory, you should end up with the exact same picture; but if you use a different algorithm to upscale than to downscale, as is often the case, you would suffer multiple generations of quality loss.
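A toy sketch of that generation loss, using a deliberately crude box-filter downscale and nearest-neighbour upscale (illustrative only; real resizers use better kernels). Smooth content can survive the round trip, while fine alternating detail is averaged away for good:

```python
# 1-D illustration of downscale -> upscale generation loss.
# Downscale: average adjacent pairs (box filter). Upscale: duplicate
# each sample (nearest neighbour).

def downscale(sig):
    return [(a + b) / 2 for a, b in zip(sig[::2], sig[1::2])]

def upscale(sig):
    out = []
    for s in sig:
        out += [s, s]
    return out

smooth = [10, 10, 12, 12, 14, 14, 16, 16]   # low detail: survives the round trip
detail = [10, 20, 10, 20, 10, 20, 10, 20]   # alternating detail: destroyed

print(upscale(downscale(smooth)))  # identical to the original
print(upscale(downscale(detail)))  # flattened to the average: [15]*8
```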
I also have no idea what you are trying to convey with "The source video doesn't have each and every pixel encoded. It's compressed." The process of encoding is the process of compression; I don't quite see how arguing semantics helps your case. -
Hey guys,
I encode to lower Bitrate and/or resolution because I want to. Period. End of Story. I do not have to (and will no longer) defend this position.
While I do this, I will continue to look for better, faster, more efficient ways (e.g., posting this thread) to accomplish what I think is the right size/quality ratio for my purposes.
In the future I may or may not continue to do this...free will is a wonderful thing (Maybe I'll re-encode everything to 5760p...who knows?) Either way, I'm pretty sure my kids (the future generation) won't think of me when I'm gone as "that bastard that encoded those movies at a lower resolution and ruined our childhood."
Last edited by natebetween; 6th Feb 2015 at 06:54.
-
Your credibility on the subject of video quality has been downgraded to a zero rating, so it doesn't matter.
https://forum.videohelp.com/threads/369976-Color-correction?p=2372396&viewfull=1#post2372396 -
Well, no kidding. Wasn't it obviously a simple example of how an image can have a much higher resolution than the picture detail it contains, or does it really require explaining?
Come back when you can provide a single example of your past claims that CRF is inferior to "insert your preferred encoder settings here" at the same bitrate, and I'll consider upgrading the status of your opinion to something higher than "not to be taken seriously". It had already been reduced several credibility levels after you claimed AQ mode 3 "makes a significant difference in quality".
I've posted an example of a picture downscaled to 720p and back again as proof of what I'm saying. Sometimes it's possible to downscale to 720p without a noticeable loss of detail (I think maybe in the example I linked to there was a very minor loss of fine detail, which is why I used it at the time, and I'd need to check it again myself to refresh my memory, but it still shows how little difference there can be between 1080p and 720p and that was the point I was making). The only reason I had to keep arguing about it was because newpball goes into ostrich mode and ignores anything that doesn't support his viewpoint. I could show screenshot after screenshot of upscaled video and if it contradicted his claims he'd just pretend it wasn't there.
So you don't believe in re-encoding to a lower bitrate or resizing to a lower resolution, except when you're using the VP9 codec at low bitrates, in which case the "spatial resampling" downscaling and upscaling is fine?
Last edited by hello_hello; 6th Feb 2015 at 15:36.
-
Why? I liked giving that video a bleached, colorful feel. So what? Do pop videos now need to be color accurate? Another SOE standard I must have missed?
Why so sour, or is it bitter?
By the way, I seriously wonder, what monitor or TV do you have and how far on average do you sit from it?
-
No, but they also aren't supposed to look like crap either. The orange people made me wonder, but you topped it with that one.
No, it's just being tired of your selective arguing and ignoring what doesn't suit. That's all.
It's a 12" LCD 20 feet away.
No, it's a 160" Plasma 2 feet from my desk.
I don't understand why you'd question how I view video, when you won't look at posted examples and discuss them. What difference does it make? Is the effort required to ignore them somehow related to my viewing conditions?
Last edited by hello_hello; 6th Feb 2015 at 22:25.
-
Not that I like arguing via a video thread, or the way this thread has gone...but to be fair, no, I did not ask for an opinion on whether I should be encoding at a lower bitrate and/or resolution.
I asked, very simply, for people who DO encode their videos to save space, would they prefer a higher resolution at a lower quality, or a higher quality at a lower resolution for a given file size. That was the opinion I asked for. -
-
Yeah, but I'm still not sold on "noticeable loss of details in dark areas" being a rule as such.
Using my pics back in post #31 as an example, there really doesn't appear to be a loss of detail simply because the picture was dark. It's the first time I've tried something like that and to a certain extent it surprised me a little how much detail was retained, given under normal viewing conditions the picture's too dark to see a lot of it. How would you throw enough bits at your sample to encode it correctly because it's dark, while at the same time not using an excessive bitrate to encode the pics I posted?
Your sample seems to be more bit depth related (if you encode with the 10 bit encoder the posterization artefacts are reduced dramatically), so in that respect it's a similar issue to banding. At least that's the way I'm looking at it. I don't think anyone claims the x264 encoder is terrible at retaining detail in brighter scenes as such, yet it wouldn't be unusual for the sky to have banding in an outdoor scene, or for the picture to look awesome except for that bit of banding on the wall in the background etc, even when it's not particularly dark. The darker it is, though, the more sensitive we might be to it. Anyway....
I guess maybe I'm looking at it from a different perspective but in my experience a "loss of details in dark areas" hasn't been an issue as such, but if it's happened I've looked at it as a "some video is harder to encode" problem and sometimes it's harder to encode when the picture is darker. Tell me if I'm completely wrong.
A question for you.....
I experimented with adding some dither to your sample when encoding.... just gradfun3(), and it made very little visual difference.... but why is it that sometimes, or more than sometimes, gradfun3() reduces the bitrate a little for a given CRF value, even though it also reduces the banding a little? I'd have thought that, by definition, dithering should always increase the bitrate.
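One way to picture what dither does to banding: coarse quantisation collapses a smooth ramp into a few flat bands, while a small alternating offset added before quantising breaks the hard band edges into a fine alternating pattern. This is only a toy sketch of the general principle, not how gradfun3() works internally:

```python
# Quantise a smooth ramp with and without a small alternating offset
# added first, then compare the results. Pure-Python illustration only.

def quantise(x, step=8):
    return round(x / step) * step

def transitions(seq):
    """Count sample-to-sample level changes: flat bands give few,
    dither gives many fine alternations around band boundaries."""
    return sum(a != b for a, b in zip(seq, seq[1:]))

ramp = [i / 4 for i in range(64)]                    # slow gradient, 0.0 .. 15.75
plain = [quantise(v) for v in ramp]                  # collapses into flat bands
dithered = [quantise(v + (2 if i % 2 else -2)) for i, v in enumerate(ramp)]

print(sorted(set(plain)))                  # only a few levels survive
print(transitions(plain), transitions(dithered))
```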
Cheers. -
Jagabo's example shows how large the differences in perception between people are. This is the most common "complaint", by far. Some people are very sensitive to these types of artifacts; others cannot even see them. Part of it is usually related to the display setup (drivers, calibration, monitor, software, renderer etc...). Certainly, default settings are not "good enough" for many people on this sort of material.
Dithering only helps if your encoding settings are "good enough" to retain the dither. This example is 100% caused by the encode and the settings used, not the source. (More commonly, there are source issues as well that require filtering along with adjusting settings.)
If you cannot see it but want to learn more about it, view some of your output encodes using histogram("luma"). Compare it to jagabo's source video with histogram("luma"). Just do it. It's a very instructive exercise. You will clearly see the quantization and mb edges. This is completely related to "banding" in daytime scenes, blue skies as well, shadow details as well. It IS the same artifact. You just might not be able to see it because of your display setup or settings. All encoders suffer from this sort of thing, especially ones that are tuned for PSNR.

As expected, the BD source has a fine-grain dithering pattern. That is held at BD bitrates, no problems. You actually don't need to dither this, just retain the dither that is already there. If you're someone that cares about this sort of thing, you basically need higher AQ settings and lower deadzones if you are using CRF encoding. The bitrate will be much larger before you get what I would call a "near perfect" encode.

Basically, if you cannot see the blocks with histogram("luma"), there is no way on any display setup that you will see those sorts of problems - that's a good guideline to strive for if you can see or care about these things. And if you don't care about these things, then just carry on and ignore. -
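The histogram("luma") check can also be mimicked numerically: a banded (coarsely quantised) gradient concentrates its luma values in a handful of levels, while the original spreads across many. A toy sketch only, not a substitute for the AviSynth exercise:

```python
# Count distinct luma levels in a smooth 8-bit ramp vs a coarsely
# quantised ("banded") copy of it. Banding shows up as a collapse in
# the number of occupied histogram bins.
from collections import Counter

gradient = [16 + (i * 219) // 1023 for i in range(1024)]   # smooth ramp, 16..235
banded = [(v // 16) * 16 for v in gradient]                # coarse quantisation

print(len(Counter(gradient)), "levels in the smooth ramp")
print(len(Counter(banded)), "levels after banding")
```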
I agree with Jagabo in that the scene in question doesn't re-encode easily and the artefacts aren't hard to spot. If I was encoding the entire video and happened to notice it at the time (with CRF18 it's not horribly "offensive" to me, at least not for such a short section) I'd probably be tempted to encode it at a much lower CRF value than the rest of the video in order to fix it.
So we're not in disagreement there, but maybe we have a different perspective when it comes to a "noticeable loss of details in dark areas".
Hopefully hardware capable of decoding 10 bit video will be mainstream sooner rather than later and a lot of that sort of problem will go away, although chances are we'll have to wait for HEVC to become mainstream for that to happen.
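The arithmetic behind that hope is simple: 10 bit gives four times as many code values over the same luma range, so each quantisation step (and therefore any band edge) is a quarter the size.

```python
# 8-bit vs 10-bit: number of code values over the same range.
def code_values(bit_depth):
    return 2 ** bit_depth

print(code_values(8), code_values(10))     # 256 vs 1024
print(code_values(10) // code_values(8))   # each 8-bit step spans 4 ten-bit steps
```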
It's a bit frustrating because my Bluray player will happily output bit depths up to 16 bit and the TV doesn't seem to have a problem with it, but it's still only 8 bit decoding. Which has sometimes made me wonder.... what's the point of having a so called "HDMI deep colour" option when a player can only decode 8 bit video anyway? Marketing? -
Except a lower CRF alone won't "fix" it. Unless you go very low. You can go 4-5x the size and it still won't "fix" unless you adjust the other settings. It's not just x264 - it's the way most codecs are "weighted" and designed.
And not everyone can see or is bothered by these things. Yet for others, it's the end of the world.