VideoHelp Forum

  1. Originally Posted by jagabo View Post
    I made a table a few years ago (using mediainfo to extract the list of settings)
    Cheers!
    Although technically if it's a list of defaults I guess CRF should be 23.

I spent a bit of time looking through my previous test encodes, to the point where I've no idea what I'm looking at any more. I think before I run any more tests I'll need to find a video to encode that exhibits a noticeable loss of detail in dark areas, because at the moment I'm struggling to find an example.

    Given I spent a bit of time encoding and comparing I thought I'd post an example of the result. Maybe I'm missing something and someone will be able to tell me what it is. Below are some screenshots of a dark scene. Each encode screenshot was saved as a bitmap, then I boosted the gamma for a good look at what was encoded and what wasn't. In the interest of full disclosure the "original" is itself an encode, although a fairly high quality one. And there's a slight gamma/brightness difference between the original and the encodes. That seems to be because I saved the original by opening it in a script and for some reason MPC-HC was displaying it slightly differently. I didn't bother investigating. They're all saved as jpeg (100% quality) as for the purpose of this exercise I thought lossless was probably overkill.

    Original vs CRF18 --aq-mode 1 vs CRF23, all three aq modes.

    Original
[Attachment: original.jpg]

    Original with boosted gamma
[Attachment: original2.jpg]

    CRF 18 --aq-mode 1
[Attachment: 18aq1.jpg]

    CRF 23 --aq-mode 1
[Attachment: 23aq1.jpg]

    CRF 23 --aq-mode 2
[Attachment: 23aq2.jpg]

    CRF 23 --aq-mode 3
[Attachment: 23aq3.jpg]

Most of the time I thought the various AQ modes just caused the video to be encoded slightly differently, but not necessarily better or worse. Sometimes for dark scenes though, as per the example above (tell me if I'm wrong), --aq-mode 3 may have a slight edge over --aq-mode 1 (in some areas it's better, in others it's worse) and --aq-mode 2 comes last, but that's reflected in the bitrates too. And I'm not convinced --aq-mode 1 doesn't look better for brighter scenes.

    And the differences above are going to be hard to spot when watching a movie. I'm still trying to find an example of a dark scene which is noticeably lower quality under normal viewing conditions. Or maybe my TV is way off being correctly calibrated......

    I probably should get motivated to repeat the above at a fixed bitrate but given --aq-mode 3 hasn't excited me all that much yet and I generally use CRF encoding.....
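For anyone wanting to reproduce this kind of comparison, the encodes above would correspond to command lines along these lines. This is only a sketch: the file names are placeholders, everything not shown is left at the medium preset's defaults, and feeding an AviSynth script directly to x264 requires a build compiled with AviSynth input support.

```shell
# CRF 18 and 23 encodes differing only in adaptive quantization mode
x264 --crf 18 --aq-mode 1 -o 18aq1.mkv source.avs
x264 --crf 23 --aq-mode 1 -o 23aq1.mkv source.avs
x264 --crf 23 --aq-mode 2 -o 23aq2.mkv source.avs
x264 --crf 23 --aq-mode 3 -o 23aq3.mkv source.avs
```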
    Last edited by hello_hello; 4th Feb 2015 at 20:15.
  2. Originally Posted by sophisticles View Post
Based on your tests with different CRF values and the resulting bit rate differences, a prudent person would conclude that aq-mode 3 takes the old "no replacement for displacement" approach to increasing quality in dark areas, by specifically using more bit rate in those areas.
    True, but if mode 2 improves on mode 1 by distributing the bits better for the same perceived quality at a lower bitrate, and mode 3 does the same, only not to the detriment of dark areas, I'd expect the bitrate resulting from mode 3 still wouldn't exceed that of mode 1 for a given CRF value. Or at least not by much on average.
    Whether that's logical or not I don't know, but so far I'm not seeing any major quality improvement due to --aq-mode 3 despite the fairly large increase in bitrate (at least at low CRF values).

    Originally Posted by sophisticles View Post
    It's as I've always said: the 2 most important factors in the quality of an encode is the quality of the source and how much bit rate is used, everything else is just details.
    I'd disagree because I think the ultimate definition of a quality encode is one that looks identical to the source, no matter what the quality of the source might be. In other words, how accurate it is. I could argue though, that for a very high quality source even minor encoding artefacts are going to be easily spotted, whereas for a lower quality, noisier source, it'd be less critical. Therefore the higher the quality the source the harder it might be to obtain a high quality encode.

But yes, I'd tend to agree, the quality of the encode is fairly bitrate dependent and how you get there may not necessarily be critical..... whether you tweak, nudge setting A, disable setting B, or lower the CRF value etc. Maybe "tweaking" is more critical in a fixed bitrate environment. I remember saying something similar in another thread recently.

    I just noticed, looking at the other thread, Mephesto has been banned. Anyone know the gossip there?
    Last edited by hello_hello; 4th Feb 2015 at 19:55.
  3. Originally Posted by hello_hello View Post
    I just noticed, looking at the other thread, Mephesto has been banned. Anyone know the gossip there?
    Maybe because every other word was the "F" bomb or "S" bomb? Don't know, just guessing, maybe because he pretty much admitted he belongs to a "scene" group that encodes and releases pirated versions of movies?

    True, but if mode 2 improves on mode 1 by distributing the bits better for the same perceived quality at a lower bitrate, and mode 3 does the same, only not to the detriment of dark areas, I'd expect the bitrate resulting from mode 3 still wouldn't exceed that of mode 1. Or at least not by much on average.
Whether that's logical or not I don't know, but so far I'm not seeing any major quality improvement due to --aq-mode 3 despite the fairly large increase in bitrate (at least at low CRF values).
    I'm not sure that mode 2 does in fact improve on mode 1; I recall reading posts by the 2 main developers a while back where it was stated that it was still up for debate as to whether or not aq mode 2 was better than aq mode 1; personally I'm not even convinced that trellis 2 is better than trellis 1.

But since aq mode 3 is supposed to address the problems x264 has with blacks and dark scenes, I'm assuming it simply bumps the bit rate in those areas, almost like a very targeted "zones" patch.

    I'd disagree because I think the ultimate definition of a quality encode is one that looks identical to the source, no matter what the quality of the source might be. In other words, how accurate it is. I could argue though, that for a very high quality source even minor encoding artefacts are going to be easily spotted, whereas for a lower quality, noisier source, it'd be less critical. Therefore the higher the quality the source the harder it might be to obtain a high quality encode.
I take the exact opposite view: the thing with very high quality sources is that minor encoding artifacts never appear in the first place, because there's so much detail that some can be thrown away sans any artifacting, especially at a decent bit rate.

    But I recently did a slew of encoding tests with very high quality y4m sources and noticed a number of interesting things:

If you want the absolute highest quality encode with x264, go with the "placebo" preset; you shouldn't even bother trying any other preset or custom settings. As measured by both PSNR and SSIM, placebo results in values that reach the "mastering" quality level.

There are some sources where it's very easy to achieve a high quality encode, even with relatively little bit rate, and then there are sources that, despite being similar content-wise, are very hard to impossible to encode to high quality without sending the bit rate skyrocketing. Two examples: the Sintel trailer is an extremely easy source, hitting a PSNR of 50 dB and an SSIM of 0.995 at a low bit rate, but Elephant's Dream, especially the sequence that starts at about the 2 minute mark, is impossible to encode at those values unless you use more bit rate than even I would think is too much for that resolution.

x265 walks all over x264 once you start cranking up the settings. Interesting side note with x265: if you crank up all the quality settings, such as setting sub pixel to 7 and me to star, but leave RDO at 0, and compare that to cranking RDO up to 5 but lowering all the other settings as low as possible, you get nearly the exact same quality as measured by PSNR, SSIM and your eyes. In fact, the single setting in x265 that most affects encoding speed is RDO, and it also seems to have the biggest impact on quality. If you max out the other settings and then start incrementally increasing RDO, you find that the quality jumps noticeably but encoding speed nosedives.
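The two x265 configurations being compared would look roughly like this. A sketch only: the file names are placeholders, the flag names are from x265's CLI, and note that x265's --rd scale actually starts at 1, so "RDO at 0" effectively means its minimum.

```shell
# 1) Motion estimation and subpel maxed, RD analysis at its minimum:
x265 --crf 18 --subme 7 --me star --rd 1 -o high_me.hevc input.y4m
# 2) RD analysis high, the other settings dropped as low as they go:
x265 --crf 18 --subme 0 --me dia --rd 5 -o high_rd.hevc input.y4m
```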

VP9 is awesome; it's too bad it's so slow. If you have the patience, try a few test encodes with VP9 and the quality settings cranked up; you won't believe your eyes. I remember hearing rumors a while back about an xvp9 encoder. If someone created a VP9 encoder with the x264 treatment, i.e. lots of assembler optimizations, I would use it almost exclusively.

As for the CRF encodes you had wanted, I did do some, but they proved inconclusive. Some encodes did in fact come out slightly better with CRF, but that's because the bit rate was all over the place. I much prefer controlling the bit rate and file size to letting some program decide for me how much bit rate it thinks it should use.
You need a dark, noisy shot with shallow gradients. It's also easiest to see on standard definition material since it gets enlarged more when you play it full screen. Here's a sample AVI with the UT Video codec (BicubicResize() from a Blu-ray source). Even at the medium preset at CRF 18 you'll see obvious posterization artifacts. The effect of aq-mode=3 on reducing that posterization isn't huge.
    Image Attached Files
  5. Originally Posted by sophisticles View Post
    I take the exact opposite view, the thing with very high quality sources is that minor encoding artifacts never appear in the first place, because there's so much detail that some can be thrown away sans any artifacting, especially at decent bit rate.
    To a certain extent I've given up trying to predict it. Some fairly high quality sources seem prone to banding, while others not so much. I don't think there's a hard and fast rule.

    Originally Posted by sophisticles View Post
    There are some sources where it's very easy to achieve a high quality encode, even with relatively little bit rate and then there are sources that despite being similar content wise are very hard to impossible to get a high quality encode without sending the bit rate skyrocketing.
    That's the sort of thing I mean.

    Originally Posted by sophisticles View Post
    As for the CRF encodes you had wanted, I did do some but they proved to be inconclusive, some encodes did in fact come out slightly better with CRF but that's because the bit rate was all over the place. I much prefer controlling the bit rate and file size than letting some program decide for me how much bit rate it thinks it should use.
    Well "controlling the bitrate yourself" is somewhat of a contradiction of your statement above if the object of the exercise is a particular quality relative to the source.

    I really don't know how the CRF encodes could prove inconclusive. The debate was whether x264's CRF is better or worse than "insert your preferred encoder here" at a given bitrate. The rest of us are of the opinion CRF and x264's 2 pass produce the same quality at the same bitrate, so it wouldn't matter how you did it. CRF encode first, vs 2 pass encode using "insert your preferred encoder here" at the same bitrate, or x264 2 pass encoding vs "insert your preferred encoder here" 2 pass encoding at the same bitrate.
    The "at the same bitrate" part is what we were debating and that's easily done.
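In x264 terms the comparison is trivial to set up. A sketch (file names are placeholders; the bitrate for the 2-pass run is simply whatever the CRF encode happened to produce, read from the encode log or MediaInfo):

```shell
# CRF encode first
x264 --crf 18 -o crf.mkv source.avs
# suppose the CRF encode came out at 4500 kbps; then 2-pass at the same bitrate:
x264 --pass 1 --bitrate 4500 -o /dev/null source.avs
x264 --pass 2 --bitrate 4500 -o 2pass.mkv source.avs
```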
    Last edited by hello_hello; 4th Feb 2015 at 21:05.
  6. Originally Posted by jagabo View Post
    Here's a sample AVI with UT video codec (BicubicResize() from a Blu-ray source). Even at the medium preset at CRF 18 you'll see obvious posterization artifacts. The effect of aq-mode=3 on reducing that posterization isn't huge.
    Cheers. I'll give it a spin later. The real world beckons at the moment....
hello_hello, jagabo, and sophisticles....thanks for all of the great information. You've taken what I had just started dabbling in and you've blown my mind. I see I have more testing to do, and more tweaking to play with.

I'm thinking Gravity might be a good option to play with...as there is a lot of black. I'm also going to be playing around with The Desolation of Smaug. I'm still playing around with the RF 19, 18, and even 17 values, at 720p. Today marks the first day I've played with the x264 presets, other than Medium. Going to mess around with slow and slower to see what the differences are. Hoping that the chart provided by jagabo is still correct, since I'd like to know what I'm applying. At this rate, it sounds like an exercise in frustration to start messing with the aq-mode...since there is no clear great increase in quality of dark scenes.
  8. I've got 2 quick questions for you.

    Is there a difference from encoding from a full BD rip with all of the files and BD structure, vs from the MKV you make with MakeMKV? (Do I automatically lose some quality when I use MakeMKV?)

    Also, do you find it worthwhile to use the x264 tune options for film and animation?

    Ok...that's 3 questions. Thanks.
Banned · Join Date: Oct 2014 · Location: Northern California
    I must say that I am shocked, people talking here about 'improving' the results by playing endlessly with codec options while at the same time they have no issue cutting the resolution of the video in half.

A case of not seeing the forest for the trees!

    Stunning!

  10. Originally Posted by newpball View Post
    I must say that I am shocked, people talking here about 'improving' the results by playing endlessly with codec options while at the same time they have no issue cutting the resolution of the video in half.

A case of not seeing the forest for the trees!

    Stunning!

Actually...It's more the result of finding an acceptable "quality per GB" ratio. Cutting the resolution in half, as has been shown in this thread and others, often results in a picture that is indistinguishable from the original to all but crazy videophiles. More importantly, the storage cost is cut dramatically.

    It's about finding the sweet spot, which is clearly going to be very personal. Which preset I ultimately choose is going to be a choice of quality vs file size. I don't need to keep my files at 1080p...especially when it's not a specifically visually stunning movie. Right now, I'm 98% happy with the settings I'm using, and I'm not eating up my storage space like it's free.

    Specifically what we've been talking about recently is when you reach a quality and file size combo that you're happy with, how do you then target the trouble areas (e.g., low color gradient scenes), when you are already happy with everything else.

Look....when I encode using HandBrake, I'm basically saying that I'm ok with a degree of loss in visual quality. How much is up to me. BUT....for a 90% decrease in file size (40GB to 4GB), I am definitely NOT decreasing the visual quality by 90%. The picture is still great! It's a tradeoff.

    If I wanted all of my files at their original size and resolution, I wouldn't even be using HandBrake.
    Last edited by natebetween; 5th Feb 2015 at 10:28.
  11. Originally Posted by natebetween View Post
    Is there a difference from encoding from a full BD rip with all of the files and BD structure, vs from the MKV you make with MakeMKV? (Do I automatically lose some quality when I use MakeMKV?)
No. MakeMKV only remuxes the data on the disc to an MKV file. It's like taking a VHS tape out of one box and putting it in another: there is no change to the tape. So unless your conversion software has a problem with the different container, there is no difference.
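A comparable remux can be demonstrated with ffmpeg (the input file name here is a placeholder): -c copy copies the compressed streams bit for bit into the new container, so nothing is re-encoded and no quality is lost.

```shell
# Repackage a Blu-ray stream file into MKV without touching the video/audio data
ffmpeg -i 00001.m2ts -map 0 -c copy movie.mkv
```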
  12. Originally Posted by natebetween View Post
    It's about finding the sweet spot, which is clearly going to be very personal. Which preset I ultimately choose is going to be a choice of quality vs file size. I don't need to keep my files at 1080p...especially when it's not a specifically visually stunning movie. Right now, I'm 98% happy with the settings I'm using, and I'm not eating up my storage space like it's free.
Remember, those guys here might be able to juice up everything x264 has got. Some ideas are valid, and some of them test things to the point of exhaustion to show us what those settings do, but that is always for a concrete piece of video, scene, or type of movie. If you don't want to test everything you encode, there are no miracle settings. You might find something that looks good, only to realize the bitrate went up, the encoding time became much, much longer, or the specs make hardware players unhappy. You set something like this,
    --crf 18 --preset slow --ref 4 --tune film --vbv-bufsize 30000 --vbv-maxrate 30000
and then you need a concrete scene to start trying variations on, but are you going to do that testing for every movie?

About downscaling to 720p and then being on a mission to make it the best it can be, that is true; it is like trying to calculate a number to ten decimal places starting from an already rounded number. I never really understood it either. If you're resizing to 720p, just use something like the above and be done with it.
    Originally Posted by natebetween View Post
    More importantly, the storage cost is cut dramatically.
You must be kidding yourself; storage nowadays costs about $0.04/GB.

    Originally Posted by _Al_ View Post
About downscaling to 720p and then being on a mission to make it the best it can be, that is true; it is like trying to calculate a number to ten decimal places starting from an already rounded number. I never really understood it either.
    I must say I am glad I am not the only one here!

    You watch, now that the resolution is cut in half there will be a new topic on how to sharpen it!

With some bad Blu-rays maybe 720p is a good idea, but anyway, what I was getting at was spending an enormous amount of time encoding those 720p things; it seems quite counterproductive to give it a lot of time, that's all.
  16. Originally Posted by _Al_ View Post
With some bad Blu-rays maybe 720p is a good idea, but anyway, what I was getting at was spending an enormous amount of time encoding those 720p things; it seems quite counterproductive to give it a lot of time, that's all.
    Well, it does take time, but it happens when I'm at work in my real life...and when I get home, it's magically done. I'm only doing a movie or 2 at a time. My collection is currently < 100 movies...but it is growing.

    Were I to keep all of my movies at 1080p, I would still encode to get a better file size...of course at the expense of a little bit of quality.

    So, my questions are aimed at those who encode to get better file sizes for small decrease in quality. If you keep your 30-50GB BD rips, more power to you. That is not me, even though I have 9TB at my disposal via my NAS (I do KEEP my original rips, at least so far. I just serve up my smaller encoded files to my movie jukebox).
I encode 720p also, and have that sitting ready for watching on any device, anywhere, where the copy threat is only for the chosen ones, like Star Trek and many more. I understand it gets out of hand to back up originals at 30GB or so.
  18. Originally Posted by newpball View Post
    Originally Posted by natebetween View Post
    More importantly, the storage cost is cut dramatically.
You must be kidding yourself; storage nowadays costs about $0.04/GB.

    Well..."cost" in my original quote was used to refer to the GB cost...not the $$ cost.

    But, since you brought it up....a 5 GB file costs me 20 cents...but a 50GB file costs me $2. Multiplied by 100 movies....$20 vs $200 (and that's only for my SMALL movie collection).

    I don't know about you, but I have good uses for $180 other than digital storage! Like a date weekend with the wife. Or more movies!
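The arithmetic above checks out. A quick sanity check, assuming the $0.04/GB figure quoted upthread:

```shell
# Storage cost per movie and per 100-movie collection at $0.04/GB
awk 'BEGIN {
  per_gb = 0.04
  printf "one movie:  $%.2f (5GB) vs $%.2f (50GB)\n", 5*per_gb, 50*per_gb
  printf "100 movies: $%.0f vs $%.0f, a difference of $%.0f\n",
         100*5*per_gb, 100*50*per_gb, 100*(50-5)*per_gb
}'
# one movie:  $0.20 (5GB) vs $2.00 (50GB)
# 100 movies: $20 vs $200, a difference of $180
```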
  19. Originally Posted by newpball View Post
    I must say that I am shocked, people talking here about 'improving' the results by playing endlessly with codec options while at the same time they have no issue cutting the resolution of the video in half.

A case of not seeing the forest for the trees!

    Stunning!
    I don't think anyone who's encoded more than a couple of Bluray videos would argue they all have 1080p worth of picture detail. I posted a link to a "downscale upscale" test earlier that shows just because a video is 1080p, it doesn't mean it has 1080p worth of detail. And of course none of those codec options we're playing with encode individual pixels. They encode mathematical approximations of sections of the picture.
    It makes sense to me you should be able to reduce the resolution, and if it doesn't reduce the picture detail (or only reduces it by a tiny amount) what's left should be encoded more accurately for a given bitrate.

I don't just downsize to 720p willy-nilly. I test different resolutions before I encode. However, I've encoded enough video to know that assuming 1080p always looks better than 720p would be pretty silly. And even if it does, the difference between 1080p and 720p is usually fairly small, and for the average person with an average-size TV at an average viewing distance there'd probably be none at all.

    Maybe you should try encoding a few videos yourself and return when you know what you're talking about.
    Last edited by hello_hello; 5th Feb 2015 at 13:07.
    Originally Posted by hello_hello View Post
    I don't think anyone who's encoded more than a couple of Bluray videos would argue they all have 1080p worth of picture detail.
Well yeah, don't get me started on the brilliant ability of engineers to produce botched telecines. But reducing the resolution of such botched telecines by 50% is hardly the solution.

    Originally Posted by hello_hello View Post
    And even if it does, the difference between 1080p and 720p is fairly small,....
    1080p has double the amount of information over 720p!

    Fairly small?
    That's like saying the difference between $200 and $100 is fairly small.

    Originally Posted by hello_hello View Post
    and for the average person using an average size TV back at average viewing distance there'd probably be none at all.
    Don't underestimate the average person, they may sit closer to 'the box' than you might think!



    Last edited by newpball; 5th Feb 2015 at 13:17.
  21. Originally Posted by natebetween View Post
    Also, do you find it worthwhile to use the x264 tune options for film and animation?

    Ok...that's 3 questions. Thanks.
Tune film tends to retain a little more fine detail. I use it pretty much all the time. As with any option that increases the amount of detail encoded, the bitrate will also increase a little as a result. Tune film changes deblock from --deblock 0:0 to --deblock -1:-1 and psy from --psy-rd 1.0:0.0 to --psy-rd 1.0:0.15 (it enables psy trellis).

Like most tweaks, the lower the CRF value the less visual difference there'll probably be.

Tune animation does the opposite to a certain extent. It's for encoding animation with large flat areas of colour and not much fine detail. It does seem to increase compression, but you'd only use it on Simpsons-type animation. It increases the number of B and reference frames, changes deblock to --deblock 1:1, and reduces psy RD strength to --psy-rd 0.40:0.00. I think that's it.

For a list of what those settings do, see: x264 settings
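Put another way, the two tunes can be thought of as shorthand for the option changes described above. A sketch (file names are placeholders; the expansions in the comments are as described in this post, so double-check them against `x264 --fullhelp` for your build, since tune internals can shift between versions):

```shell
x264 --crf 18 --tune film      -o film.mkv source.avs  # roughly: --deblock -1:-1 --psy-rd 1.0:0.15
x264 --crf 18 --tune animation -o anim.mkv source.avs  # roughly: more B/ref frames, --deblock 1:1, --psy-rd 0.4:0
```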
  22. Originally Posted by newpball View Post
    Originally Posted by hello_hello View Post
    I don't think anyone who's encoded more than a couple of Bluray videos would argue they all have 1080p worth of picture detail.
Well yeah, don't get me started on the brilliant ability of engineers to produce botched telecines. But reducing the resolution of such botched telecines by 50% is hardly the solution.
    I'd much prefer not to get you started on anything.

    Originally Posted by newpball View Post
    Originally Posted by hello_hello View Post
    And even if it does, the difference between 1080p and 720p is fairly small,....
    1080p has double the amount of information over 720p!

    Fairly small?
    That's like saying the difference between $200 and $100 is fairly small.
    Excellent, so when I take a 720p encode and upscale it to 1080p on playback I'm doubling the information and I'm back where I started.
    Obviously the idea that just because something has a 1080p resolution it doesn't necessarily mean it contains 1080p worth of picture detail is too much for you.
    Obviously the concept of encoders not encoding individual pixels and throwing detail away in the process doesn't fit into the "1080p must be better" reality you've created for yourself.

    Originally Posted by newpball View Post
    Originally Posted by hello_hello View Post
    and for the average person using an average size TV back at average viewing distance there'd probably be none at all.
    Don't underestimate the average person, they may sit closer to 'the box' than you might think!
    Oh god..... another idiotic picture followed by a smiley in lieu of a valid argument. Instead of looking at my "downscale upscale" example and explaining why it's wrong you reach for the collection of silly pics you post in forums as your argument self-combusts. There's a surprise.
  23. Originally Posted by jagabo View Post
    You need a dark noisy shot with shallow gradients. It's also easiest to see on standard definition material since it gets enlarge more when you play it full screen. Here's a sample AVI with UT video codec (BicubicResize() from a Blu-ray source). Even at the medium preset at CRF 18 you'll see obvious posterization artifacts. The effect of aq-mode=3 on reducing that posterization isn't huge.
    I had a play and I see what you mean, however in respect to "noticeable loss of detail in dark areas".......

    If you increase the gamma or brightness and then re-encode it, I'm not sure the result is any different. The posterization artefacts are still there. As the brightness increases they may get harder to see, or they're less obvious, but cranking up the brightness doesn't cause the encoder to "retain more detail" as such.
    If anything it indicates to me there's always going to be some video that's particularly hard to encode and when it's darker the artefacts might be more noticeable, but it hasn't convinced me to sign up for a membership of the "noticeable loss of details in dark areas" club just yet.
    Originally Posted by hello_hello View Post
    Obviously the idea that just because something has a 1080p resolution it doesn't necessarily mean it contains 1080p worth of picture detail is too much for you.
    It's not, but I wonder, did you think through the consequences?

For instance, if we suppose that 50% of a 1080p picture is garbage, then reducing the resolution by 50% still makes it twice as bad; the scaling algorithm cannot distinguish.

    Frankly your argument makes no sense at all. By reducing the resolution by 50% you always make things worse.

  25. Originally Posted by newpball View Post
    Originally Posted by hello_hello View Post
    Obviously the idea that just because something has a 1080p resolution it doesn't necessarily mean it contains 1080p worth of picture detail is too much for you.
    It's not, but I wonder, did you think through the consequences?

For instance, if we suppose that 50% of a 1080p picture is garbage, then reducing the resolution by 50% still makes it twice as bad; the scaling algorithm cannot distinguish.

    Frankly your argument makes no sense at all. By reducing the resolution by 50% you always make things worse.
    So taking a 720 picture with 50% garbage and upscaling it to 1080p must therefore make it half as bad, using whatever illogical logic you're offering.
If my argument made no sense at all, you'd have no trouble countering any of the points I made rather than offering a meaningless generalisation. Or maybe you'd have looked at the example I posted, which you keep ignoring, presumably because it contradicts your theory, and explained why downscaling and upscaling made things worse.

    If I take a 720p picture and duplicate each pixel horizontally and then again vertically I get a picture with lots more pixels. Quadruple the resolution if I'm not mistaken, but exactly the same amount of picture detail. If I then reduce the resolution to 720p I'm back where I started, but how have I lost picture detail in the process?
    I don't know which part of a 1080p image not necessarily containing 1080p worth of picture detail you're unable to understand. Sure, maybe each and every time you resize a 1080p video down to 720p "something" is lost, but that doesn't necessarily mean it's picture detail you can see. The source video doesn't have each and every pixel encoded. It's compressed.

I don't need to argue about it. I've compared the two countless times. I've posted a screenshot as an example. I know sometimes it's possible to downscale 1080p to 720p and not lose anything in respect to visible detail. I know sometimes you can't. I know even if there's a difference it's generally not all that huge and I know at the same bitrate you can encode 720p at a higher quality than 1080p. I'd rather watch high quality 720p than low quality 1080p because any loss in picture detail due to resizing down is usually far less noticeable than a loss of detail due to compression or an increase in compression artefacts. It's not all about the resolution.
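For what it's worth, the raw pixel counts behind this argument: 1080p has 2.25x the pixels of 720p, not exactly double as claimed upthread, and neither number says how much of it is real picture detail.

```shell
# Pixel counts for 1920x1080 vs 1280x720 and their ratio
awk 'BEGIN {
  hd = 1920*1080; sd = 1280*720
  printf "1080p: %d pixels, 720p: %d pixels, ratio %.2f\n", hd, sd, hd/sd
}'
# 1080p: 2073600 pixels, 720p: 921600 pixels, ratio 2.25
```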
    Last edited by hello_hello; 5th Feb 2015 at 15:40.
    Originally Posted by hello_hello View Post
    ...an increase in compression artefacts. It's not all about the resolution.
    I presumed that we would agree that a good compressed video should contain no visible compression artifacts.

    Perhaps I was wrong!

    If your standards are such that visible artifacts are acceptable then obviously the resolution argument goes straight down the drain.

    I think nothing will be more annoying for the next generation than this generation's idiotic acceptance of totally unnecessary compression artifacts in videos. They will understand it for prior generations, when there were technological limits, but for this generation? A generation where we can buy a 3TB drive for under $120 and 64GB sticks are commonplace?
  27. Originally Posted by newpball View Post
    Originally Posted by hello_hello View Post
    ...an increase in compression artefacts. It's not all about the resolution.
    I presumed that we would agree that a good compressed video should contain no visible compression artifacts.

    Perhaps I was wrong!
    FFS! I said you can encode 720p at a higher quality than 1080p at the same bitrate. If at a given bitrate the 1080p version exhibited compression artefacts I'd rather watch the 720p version. How on earth does that translate into my saying good compression would include compression artefacts?

    Originally Posted by newpball View Post
    If your standards are such that visible artifacts are acceptable then obviously the resolution argument goes straight down the drain.
    I give up.
    Not to mention your resolution argument obtained zero credibility the first time you ignored the example I posted and has maintained zero credibility ever since.

    Originally Posted by newpball View Post
    I think nothing will be more annoying for the next generation than this generation's idiotic acceptance of totally unnecessary compression artifacts in videos. They will understand it for prior generations, when there were technological limits, but for this generation? A generation where we can buy a 3TB drive for under $120 and 64GB sticks are commonplace?
    Oh..... that's why there's so many threads such as these, because we're all happy to accept totally unnecessary compression artefacts and enjoy discussing our acceptance of them. Is that what we're doing here or are you talking complete nonsense? The next generation is going to be furious because I re-encoded my Bluray at 720p? Really? Will they be buying overly compressed video as a result?

    It doesn't matter how many times you repeat the cost of hard drives, there are always reasons for wanting to reduce the size and/or compress more efficiently. If that wasn't the case we'd all be encoding with mpeg2 and there'd be no need for h265.
  28.
    Originally Posted by hello_hello View Post
    Oh..... that's why there's so many threads such as these, because we're all happy to accept totally unnecessary compression artefacts and enjoy discussing our acceptance of them.
    Well I do not want to put things in too black and white terms, but when a person comes along and wants "high quality" and to "compress the life out of it", would not a brutally honest straightforward response be better than "try to tinker with option XYZ"?

    Originally Posted by hello_hello View Post
    It doesn't matter how many times you repeat the cost of hard drives there's always reasons for wanting to reduce the size, and/or compressing more efficiently etc. If that wasn't the case we'd all be encoding with mpeg2 and there'd be no need for h265.
    Aiming to reduce the size by getting more efficient codecs is a good thing, aiming to reduce the size by sacrificing quality merely to reduce storage space is absurd.

    Unless there are valid reasons, and yes valid reasons do exist, for instance when video is streamed with limited bandwidth or when the bitrate is too high for the storage media to sustain real-time playback, it is ridiculous to sacrifice quality.

  29. Originally Posted by newpball View Post
    Well I do not want to put things too black a white but when a person comes along and wants "high quality" and "compress the life out of it" would not a brutally honest straightforward response be better than "try to tinker with option XYZ"?
    Maybe if threads such as these weren't interrupted with irrelevant nonsense it'd be easier to keep track of what was said.

    Originally Posted by hello_hello View Post
    I don't play around with x264's settings much at all myself. At least not for my usual encoding. About the only time I experiment is when a thread like this prompts me to do so, but I invariably go back to picking a tuning and a CRF value most of the time.
    When a person comes along and wants high quality and to compress the life out of it I'm not sure how a statement of the obvious or a report on the current state of hard drive prices contributes to the discussion.
    You can buy more hard drives and choose not to re-compress? Really? Well I never. Who'd have guessed.....
    And I certainly don't see how advising to resize to 720p isn't consistent with a "high quality and compress the life out of it" goal, so I don't really know what point you're trying to make. Would the quality be higher at the same bitrate for 1080p?

    Originally Posted by newpball View Post
    Unless there are valid reasons, and yes valid reasons do exist, for instance when video is streamed with limited bandwidth or when the bitrate is too high for the storage media to provide real time fps, it is ridiculous to sacrifice quality.
    Well I'm pretty much done bothering with another round of your selective arguing. I've posted an example of downscaling and claimed it shows you're not always sacrificing quality by doing so, and you've ignored it again and again. I've also said I don't just downscale to 720p willy-nilly, but only when doing so doesn't result in a noticeable loss of picture detail.
    You're the one insisting that resizing down to 720p always sacrifices quality, yet you've done nothing to prove that's the case and made no attempt to show why my example is wrong, therefore I reject your "sacrificing quality" premise due to its irrelevance and lack of credibility.
    Last edited by hello_hello; 5th Feb 2015 at 16:58.
  30. newpball,
    How can you stubbornly keep telling people what they should do with their video?

    What is the purpose of encoding? To make the video smaller or to make it ready for a device. What's wrong with that?