VideoHelp Forum
  1. Originally Posted by newpball View Post
    I must say that I am shocked, people talking here about 'improving' the results by playing endlessly with codec options while at the same time they have no issue cutting the resolution of the video in half.

A case of not seeing the forest for the trees!

    Stunning!

    I'm with you 100%, I'm already on record having said that I don't believe in even re-encoding a BD to a lower bit rate much less a lower resolution.

There is one caveat: the VP9 codec. It features a technology called "spatial resampling", where the encoder compresses a lower-resolution version of the frame which is then upscaled at decode time to the correct presentation resolution; it's designed to improve quality at low bit rates. I have done test encodes with and without it enabled, and it does seem to work, and quite well at that.

Think of it this way: assume you have a 2560x1440 source, i.e. a 2.5K source. You have three choices at 8000 kb/s: create a 2048x1080 master, a 1920x1080 master, or a 1440x1080 master. Which one do you choose?
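That trade-off can be made concrete by comparing the average bits available per pixel at the fixed 8000 kb/s budget. A minimal sketch (the 24 fps frame rate is an assumed value for illustration, not stated in the post):

```python
# Average bits available per pixel per frame at a fixed bitrate.
# The 24 fps frame rate is an assumption for illustration.
def bits_per_pixel(bitrate_kbps, width, height, fps=24.0):
    return bitrate_kbps * 1000 / (width * height * fps)

# The lower the resolution, the more bits each remaining pixel gets.
for w, h in [(2048, 1080), (1920, 1080), (1440, 1080)]:
    print(f"{w}x{h}: {bits_per_pixel(8000, w, h):.3f} bits/pixel")
```

The numbers quantify the choice being posed: fewer pixels encoded more accurately versus more pixels encoded more coarsely.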
  2. Originally Posted by hello_hello View Post
    And of course none of those codec options we're playing with encode individual pixels. They encode mathematical approximations of sections of the picture.
    I don't suppose you would like to retract this statement? This way we don't go off on another tangent.
  3. Originally Posted by hello_hello View Post
    If I take a 720p picture and duplicate each pixel horizontally and then again vertically I get a picture with lots more pixels. Quadruple the resolution if I'm not mistaken, but exactly the same amount of picture detail. If I then reduce the resolution to 720p I'm back where I started, but how have I lost picture detail in the process?
I don't know which part of "a 1080p image doesn't necessarily contain 1080p worth of picture detail" you're unable to understand. Sure, maybe each and every time you resize a 1080p video down to 720p "something" is lost, but that doesn't necessarily mean it's picture detail you can see. The source video doesn't have each and every pixel encoded. It's compressed.
You are quite wrong on this point. If you were to use the exact same algorithm to upscale and then downscale then yes, in theory, you should end up with the exact same picture. But if you use a different algorithm to upscale than to downscale, as is often the case, you would suffer multiple generations of quality loss.

I also have no idea what you're trying to convey with "The source video doesn't have each and every pixel encoded. It's compressed." The process of encoding is the process of compression; I don't quite see how arguing semantics helps your case.
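Both sides of this exchange can be made concrete in a few lines: duplicating pixels and point-sampling back round-trips exactly, while any mismatched pair of resamplers would not. A minimal sketch (illustrative only, not either poster's actual workflow):

```python
# Thought experiment from the quote: duplicate each pixel 2x horizontally and
# vertically (nearest-neighbour upscale), then keep every second pixel to
# downscale again. The round trip recovers the original exactly, so the extra
# pixels never carried extra picture detail. Note this only holds for point
# sampling; a bicubic upscale followed by a different downscale would not
# round-trip, which is the "different algorithms" objection.
def upscale_2x(img):
    """img is a list of rows; each row is a list of pixel values."""
    wide = [[p for p in row for _ in range(2)] for row in img]
    return [row for row in wide for _ in range(2)]

def downscale_2x(img):
    """Point-sampled downscale: every second pixel of every second row."""
    return [row[::2] for row in img[::2]]

original = [[10, 20], [30, 40]]
assert downscale_2x(upscale_2x(original)) == original
```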
  4. Hey guys,

    I encode to lower Bitrate and/or resolution because I want to. Period. End of Story. I do not have to (and will no longer) defend this position.

    While I do this, I will continue to look for better, faster, more efficient ways (e.g., posting this thread) to accomplish what I think is the right size/quality ratio for my purposes.

    In the future I may or may not continue to do this...free will is a wonderful thing (Maybe I'll re-encode everything to 5760p...who knows?) Either way, I'm pretty sure my kids (the future generation) won't think of me when I'm gone as "that bastard that encoded those movies at a lower resolution and ruined our childhood."
    Last edited by natebetween; 6th Feb 2015 at 06:54.
5. Originally Posted by natebetween View Post
    I encode to lower Bitrate and/or resolution because I want to. Period. End of Story. I do not have to (and will no longer) defend this position.
    You can do whatever you want. But eh, you asked our opinion didn't you? So we gave it.

    Originally Posted by natebetween View Post
    Either way, I'm pretty sure my kids (the future generation) won't think of me when I'm gone as "that bastard that encoded those movies at a lower resolution and ruined our childhood."
Well, it is to be hoped that you treat your family videos with more care.

  6. Originally Posted by newpball View Post
    Originally Posted by natebetween View Post
    I encode to lower Bitrate and/or resolution because I want to. Period. End of Story. I do not have to (and will no longer) defend this position.
    You can do whatever you want. But eh, you asked our opinion didn't you? So we gave it.
    Your credibility on the subject of video quality has been downgraded to a zero rating, so it doesn't matter.
    https://forum.videohelp.com/threads/369976-Color-correction?p=2372396&viewfull=1#post2372396
  7. Originally Posted by sophisticles View Post
    You are quite wrong on this point, if you were to use the exact same algorithm to upscale and downscale then yes, in theory, you should end up with the exact same picture but if you use a different algorithm to upscale then to downscale, as is often the case, then you would have had multiple generations of quality lose.
Well, no kidding. It was obviously just a simple example of how an image can have a much higher resolution than the picture detail it contains. Or does that really require explaining?

    Originally Posted by sophisticles View Post
    I also have no idea what you are trying to convey with "The source video doesn't have each and every pixel encoded. It's compressed.", the process of encoding is the process of compression, I don't quite get how you think using semantics helps your argument.
Come back when you can provide a single example supporting your past claims about CRF being inferior to "insert your preferred encoder settings here" at the same bitrate, and I'll consider upgrading the status of your opinion to something higher than "not to be taken seriously". It had already been reduced several credibility levels after you claimed AQ mode 3 "makes a significant difference in quality".

I've posted an example of a picture downscaled to 720p and back again as proof of what I'm saying. Sometimes it's possible to downscale to 720p without a noticeable loss of detail. (I think in the example I linked to there was a very minor loss of fine detail, which is why I used it at the time, and I'd need to check it again to refresh my memory, but it still shows how little difference there can be between 1080p and 720p, and that was the point I was making.) The only reason I had to keep arguing about it was that newpball goes into ostrich mode and ignores anything that doesn't support his viewpoint. I could show screenshot after screenshot of upscaled video, and if it contradicted his claims he'd just pretend it wasn't there.

    Originally Posted by sophisticles View Post
    I'm with you 100%, I'm already on record having said that I don't believe in even re-encoding a BD to a lower bit rate much less a lower resolution.

There is one caveat: the VP9 codec. It features a technology called "spatial resampling", where the encoder compresses a lower-resolution version of the frame which is then upscaled at decode time to the correct presentation resolution; it's designed to improve quality at low bit rates. I have done test encodes with and without it enabled, and it does seem to work, and quite well at that.
    So you don't believe in re-encoding to a lower bitrate or resizing to a lower resolution except when you're using the VP9 codec at low bitrates in which case the "spatial resampling" downscaling and upscaling is fine?
    Last edited by hello_hello; 6th Feb 2015 at 15:36.
8. Originally Posted by hello_hello View Post
    Your credibility on the subject of video quality has been downgraded to a zero rating, so it doesn't matter.
    https://forum.videohelp.com/threads/369976-Color-correction?p=2372396&viewfull=1#post2372396
Why? I liked giving that video a bleached, colorful feel. So what? Pop videos now need to be color accurate? Another SOE standard I must have missed?

    Why so sour, or is it bitter?

    By the way, I seriously wonder, what monitor or TV do you have and how far on average do you sit from it?

  9. Originally Posted by newpball View Post
    So what? Pop videos now need to be color accurate? Another SOE standard I must have missed?
No, but they're not supposed to look like crap either. The orange people made me wonder, but you topped it with that one.

    Originally Posted by newpball View Post
    Why so sour, or is it bitter?
No, it's just that I'm tired of your selective arguing and ignoring whatever doesn't suit you. That's all.

    Originally Posted by newpball View Post
    By the way, I seriously wonder, what monitor or TV do you have and how far on average do you sit from it?
    It's a 12" LCD 20 feet away.
    No, it's a 160" Plasma 2 feet from my desk.
    I don't understand why you'd question how I view video, when you won't look at posted examples and discuss them. What difference does it make? Is the effort required to ignore them somehow related to my viewing conditions?
    Last edited by hello_hello; 6th Feb 2015 at 22:25.
10. Originally Posted by hello_hello View Post
No, it's just that I'm tired of your selective arguing and ignoring whatever doesn't suit you. That's all.
    Perhaps, and this is just a friendly suggestion, you should try to take things a little less personally in friendly conversations over the internet.

    Originally Posted by hello_hello View Post
    No, it's a 160" Plasma 2 feet from my desk.
    Well, at least you won't need a heater in the winter!
  11. Originally Posted by newpball View Post
    Originally Posted by hello_hello View Post
No, it's just that I'm tired of your selective arguing and ignoring whatever doesn't suit you. That's all.
    Perhaps, and this is just a friendly suggestion, you should try to take things a little less personally in friendly conversations over the internet.
    Not taking someone seriously and taking something personally are two completely different things. I've no idea how you've managed to confuse the two.
  12. jagabo,
If you're still around, did you see my post here?
    Just wondering what your thoughts might be.
  13. Originally Posted by newpball View Post
    Originally Posted by natebetween View Post
    I encode to lower Bitrate and/or resolution because I want to. Period. End of Story. I do not have to (and will no longer) defend this position.
    You can do whatever you want. But eh, you asked our opinion didn't you? So we gave it.
Not that I like arguing via a video thread, or the way this thread has gone... but to be fair, no, I did not ask for an opinion on whether I should be encoding at a lower bitrate and/or resolution.

I asked, very simply: for people who DO encode their videos to save space, would they prefer a higher resolution at a lower quality, or a higher quality at a lower resolution, for a given file size? That was the opinion I asked for.
  14. Originally Posted by hello_hello View Post
    Originally Posted by jagabo View Post
You need a dark, noisy shot with shallow gradients. It's also easiest to see on standard definition material since it gets enlarged more when you play it full screen. Here's a sample AVI with the UT Video codec (BicubicResize() from a Blu-ray source). Even at the medium preset at CRF 18 you'll see obvious posterization artifacts. The effect of aq-mode=3 on reducing that posterization isn't huge.
I had a play and I see what you mean. However, with respect to "noticeable loss of detail in dark areas".......

If you increase the gamma or brightness and then re-encode it, I'm not sure the result is any different. The posterization artefacts are still there. As the brightness increases they may get harder to see, or be less obvious, but cranking up the brightness doesn't cause the encoder to "retain more detail" as such.
If anything, it indicates to me that there's always going to be some video that's particularly hard to encode, and when it's darker the artefacts might be more noticeable. But it hasn't convinced me to sign up for membership of the "noticeable loss of details in dark areas" club just yet.
    Isn't the fact that it's more noticeable in dark areas the point?
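The posterization being discussed is coarse quantization flattening shallow gradients, and the mechanism is easy to reproduce outside any encoder. A toy sketch (the step size of 8 is an arbitrary stand-in for a coarse quantizer, not an x264 parameter):

```python
# Toy model of posterization: coarse quantization collapses a shallow dark
# gradient's many input levels into a few flat bands.
def quantize(levels, step=8):
    return [(v // step) * step for v in levels]

shallow_dark_gradient = list(range(16, 48))   # 32 distinct dark luma levels
banded = quantize(shallow_dark_gradient)
print(len(set(shallow_dark_gradient)), "levels in ->", len(set(banded)), "bands out")
# Perceptually, flat 8-step bands are far more visible in dark shallow
# gradients than the same step loss would be in a bright, busy region.
```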
  15. Originally Posted by jagabo View Post
    Isn't the fact that it's more noticeable in dark areas the point?
    Yeah, but I'm still not sold on "noticeable loss of details in dark areas" being a rule as such.

Using my pics back in post #31 as an example, there really doesn't appear to be a loss of detail simply because the picture was dark. It's the first time I've tried something like that, and to a certain extent it surprised me how much detail was retained, given that under normal viewing conditions the picture's too dark to see a lot of it. How would you throw enough bits at your sample to encode it correctly because it's dark, while at the same time not using an excessive bitrate to encode the pics I posted?

Your sample seems to be more bit-depth related (if you encode with the 10 bit encoder the posterization artefacts are reduced dramatically), so in that respect it's a similar issue to banding, at least the way I'm looking at it. I don't think anyone claims the x264 encoder is terrible at retaining detail in brighter scenes as such, yet it wouldn't be unusual for the sky to have banding in an outdoor scene, or for the picture to look awesome except for that bit of banding on the wall in the background, even when it's not particularly dark. The darker it is though, the more sensitive we might be to it. Anyway....

I guess maybe I'm looking at it from a different perspective, but in my experience a "loss of details in dark areas" hasn't been an issue as such; when it's happened I've looked at it as a "some video is harder to encode" problem, and sometimes it's harder to encode when the picture is darker. Tell me if I'm completely wrong.

    A question for you.....
I experimented with adding some dither to your sample when encoding.... just gradfun3(), and it made very little visual difference.... but why is it that sometimes, or more than sometimes, gradfun3() reduces the bitrate a little for a given CRF value even though it also reduces the banding a little? I'd have thought that, by definition, dithering should always increase the bitrate.

    Cheers.
  16. Gradfun3() can use a dither pattern that doesn't change from frame to frame.
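That spatially fixed pattern is the key point: identical frames dither to identical output, so temporal prediction stays cheap, whereas per-frame random noise must be coded again in every frame. A toy illustration (the 2x2 offset matrix is made up for the example, not gradfun3()'s actual pattern):

```python
import random

# A dither pattern fixed per pixel position: identical input frames produce
# identical dithered frames, so frame-to-frame prediction costs almost nothing.
PATTERN = [[0, 2], [3, 1]]  # illustrative 2x2 ordered-dither offsets

def dither_fixed(frame):
    return [[p + PATTERN[y % 2][x % 2] for x, p in enumerate(row)]
            for y, row in enumerate(frame)]

def dither_random(frame, rng):
    # Per-frame random dither: successive frames almost surely differ,
    # so the encoder spends bits coding the changing noise.
    return [[p + rng.randint(0, 3) for p in row] for row in frame]

frame = [[100] * 4 for _ in range(4)]
assert dither_fixed(frame) == dither_fixed(frame)  # static across frames
```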
Jagabo's example shows how large the differences in perception between people can be. This is the most common "complaint", by far. Some people are very sensitive to these types of artifacts; others cannot even see them. Part of it is usually related to the display setup (drivers, calibration, monitor, software, renderer, etc.). Certainly, default settings are not "good enough" for many people on this sort of material.

Dithering only helps if your encoding settings are "good enough" to retain the dither. This example is 100% caused by the encode and the settings used, not the source. (More commonly, there are source issues as well that require filtering along with adjusting settings.)

If you cannot see it but want to learn more about it, view some of your output encodes using histogram("luma") and compare them to jagabo's source video with histogram("luma"). Just do it; it's a very instructive exercise. You will clearly see the quantization and macroblock edges. This is completely related to "banding" in daytime scenes and blue skies, as well as shadow detail. It IS the same artifact; you just might not be able to see it because of your display setup or settings. All encoders suffer from this sort of thing, especially ones that are tuned for PSNR.

As expected, the BD source has a fine-grain dithering pattern. That is held at BD bitrates, no problem. You actually don't need to dither this, just retain the dither that is already there. If you're someone who cares about this sort of thing, you basically need higher AQ settings and lower deadzones if you are using CRF encoding. The bitrate will be much larger before you get what I would call a "near perfect" encode.

Basically, if you cannot see the blocks with histogram("luma"), there is no way on any display setup that you will see those sorts of problems - that's a good guideline to strive for if you can see or care about these things. And if you don't care about these things, then just carry on and ignore it.
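A rough numeric stand-in for eyeballing histogram("luma") is to count how many distinct luma codes a shadow region actually uses: a smooth or dithered source occupies many adjacent levels, while a posterized encode collapses to a few spikes. (The luma range and step size below are illustrative, not taken from any real encode.)

```python
from collections import Counter

# Count the distinct luma codes used inside a shadow range [lo, hi).
def occupied_levels(luma_values, lo=16, hi=64):
    hist = Counter(v for v in luma_values if lo <= v < hi)
    return len(hist)

smooth = list(range(16, 64))                  # smooth/dithered shadow gradient
posterized = [(v // 8) * 8 for v in smooth]   # same region, coarsely quantized
print(occupied_levels(smooth), "levels vs", occupied_levels(posterized), "spikes")
```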
  18. I agree with Jagabo in that the scene in question doesn't re-encode easily and the artefacts aren't hard to spot. If I was encoding the entire video and happened to notice it at the time (with CRF18 it's not horribly "offensive" to me, at least not for such a short section) I'd probably be tempted to encode it at a much lower CRF value than the rest of the video in order to fix it.

    So we're not in disagreement there, but maybe we have a different perspective when it comes to a "noticeable loss of details in dark areas".

    Hopefully hardware capable of decoding 10 bit video will be mainstream sooner rather than later and a lot of that sort of problem will go away, although chances are we'll have to wait for HEVC to become mainstream for that to happen.
    It's a bit frustrating because my Bluray player will happily output bit depths up to 16 bit and the TV doesn't seem to have a problem with it, but it's still only 8 bit decoding. Which has sometimes made me wonder.... what's the point of having a so called "HDMI deep colour" option when a player can only decode 8 bit video anyway? Marketing?
  19. Originally Posted by hello_hello View Post
    I agree with Jagabo in that the scene in question doesn't re-encode easily and the artefacts aren't hard to spot. If I was encoding the entire video and happened to notice it at the time (with CRF18 it's not horribly "offensive" to me, at least not for such a short section) I'd probably be tempted to encode it at a much lower CRF value than the rest of the video in order to fix it.
Except a lower CRF alone won't "fix" it, unless you go very low. You can go 4-5x the size and it still won't be "fixed" unless you adjust the other settings. It's not just x264; it's the way most codecs are "weighted" and designed.

    And not everyone can see or is bothered by these things. Yet for others, it's the end of the world.

    Which has sometimes made me wonder.... what's the point of having a so called "HDMI deep colour" option when a player can only decode 8 bit video anyway? Marketing?
Yes, marketing. More tick boxes and feature points.