I just ran into an unfortunate, but interesting, problem. I have encoded tens and tens of files with basically the same settings but for some reason only NOW did I notice this particular problem.
I have been encoding to x265 from a full HD h264 stream (at ca. 8000 kbps), but because I know for certain the actual source material is from (old) analog tape, I have simply resized it to 960x540, which also conveniently eliminates interlacing from the equation. I have also set no-deblock=1; more on this below.
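(To be precise about the syntax, since I wrote "nodeblock" above: the actual x265 option name is no-deblock, and I pass it through ffmpeg roughly like this. Filenames are placeholders and the rest of my options are omitted; this is just a sketch of the relevant part.)

```shell
# Sketch: resize to 960x540 and disable x265's in-loop deblocking filter.
# "input.mkv"/"output.mkv" are placeholder names, not my real files.
ffmpeg -i input.mkv \
    -vf "scale=960:540" \
    -c:v libx265 \
    -x265-params "no-deblock=1" \
    output.mkv
```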
And the problem is that I now noticed that while there is absolutely no visible blocking in the h264 source... there IS in the resulting encode. There are MASSIVE block artefacts at some points in at least one of the resulting encodes.
I had originally set no-deblock=1 because it was my understanding that, given the very high quality of the source (excessive in terms of bitrate vs. actual resolution), the video would not need to be "deblocked" because there is nothing TO deblock, and also that it had already BEEN deblocked when it was encoded to h264 (correct?), so in transcoding there would be no need to do that again. I was also concerned that "deblocking" would in practice mean "blur", and I purposely did not want to lose detail unnecessarily.
Also, I had done somewhat extensive, subjective tests much before this, with the same exact type of source, with the deblocking set off, and I did NOT see the artefacts I'm seeing now. There's a reason I used basically the same settings for tens and tens of videos before this -- there WERE NO issues like this, before this.
But why does this blocking occur? I originally saw it on what was ACTUAL Full HD content, and eventually I gave up. It seemed like no matter what, if SAO was turned on, then deblock had to be turned on as well, or otherwise the result was these inexplicable massive blocks of which there was no trace in the source. The blocking appears more in scenes with a lot of motion, but it does occur even in almost completely static scenes.
What does the deblock filter do, and why is it sometimes somehow involved in these catastrophic consequences - and other times it, or the lack of it, is just fine?
Using x265 encoder (through ffmpeg) version 3.3 on Windows.
Something that just crossed my mind (although the question is still valid)...
Is it at all possible that some kind of processing fault could cause this sort of specific problem? I realised that at the moment my CPU and memory are overclocked; or, more accurately, they're slightly more overclocked than they were previously. However, with the current settings I got several valid Pi computation results, and kept the component stress tester of y-cruncher running for a full hour. My system uptime right now is almost 4 days.
I say specific problem because during this time I have encoded many, many other videos (some of them several times), with the files of course also including audio. To check that those files turned out OK, I've skimmed through them and have not noticed ANY obvious errors, glitches in the audio, or any other indication of a problem, which is why I didn't think of this at all until now.
Is it even POSSIBLE that the x265 encoder would "successfully" process a 3-hour video file, ffmpeg would successfully produce a valid MKV file with perfect audio etc., BUT there would be these specific, weird errors only in the HEVC-encoded video? Does x265 not perform ANY sort of error checking while it is running, or does it assume absolutely all computations made are 100% reliable as they are? (As a sidenote, if something like that is completely missing, it's a bit... odd. It would mean people quite often produce encodes that have blatant errors in them and are never aware of it.)
I've been running the y-cruncher stress tester all the while here while typing (it also saturates all CPU cores 100% at the same time, something that no real-life application ever does) and still no problem after over 30 minutes. For those who are unaware, y-cruncher is a program that also would immediately STOP executing if it detected that something in the computation went wrong. At least two of the tests are specifically for AVX2, which I understand is something x265 uses heavily or almost exclusively.
I would think this being caused by overclocking is highly, highly unlikely... but I'm asking whether it could be at all possible since, regardless of other things, I am NOT an expert when it comes to x265 or video encoding in general. In other cases that I know of, overclocking would cause BSODs, it would cause individual applications to crash, it would cause immediately apparent visual errors everywhere, all the time. But it would NOT generally cause things to appear in every way to be perfectly fine, except that one particular application would produce incorrect results. That's why I feel this is unlikely: I feel there would be OTHER, more easily detected errors that would NOT have gone undetected. Your thoughts?
Having said all that, the problem applies regardless, particularly what I said concerning deblock in relation to SAO, towards the end of my previous post. That was something that I noticed a much longer time ago, regardless of CPUs or their clocks, possibly as long as 4 years ago, and I noticed it consistently in my sporadic video encoding endeavours. SAO in general could maybe warrant a thread of its own. But I'm mentioning it here as it seems to be very much linked specifically to deblock. Mostly: if deblock is off, SAO very often messes things up. If BOTH are on (or off, I suppose), the result is always fine.
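To spell out the combinations I mean, in x265-params syntax (these are fragments, not complete commands, and the strength values are just illustrative):

```shell
# The combinations as I've observed them; only "deblock off + SAO on" breaks.
-x265-params "no-deblock=1:no-sao=1"   # both off: fine
-x265-params "no-deblock=1:sao=1"      # deblock off, SAO on: massive blocks
-x265-params "deblock=0,0:sao=1"       # both on (default strength): fine
```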
One more thing... I did now notice this type of blocking had occurred in previous encodes but it was (for whatever reason) much more subtle, to the point that just skimming through, watching 30 secs here, 30 secs there I did not realise it was there. In contrast, in two of the later encodes the quality is simply so horrible that it's difficult NOT to see it.
The small changes I've started including over time are the options constrained-intra and no-strong-intra-smoothing. I had read they should be beneficial for what I'm trying to do (transcoding with minimal subjective quality loss; in other words, simply staying close to the original), and I used them first on Full HD resolution film encodes with good results.
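Concretely, these are the two options in question as I add them to -x265-params (a fragment, not my whole command line):

```shell
# Restrict intra prediction to intra blocks, and disable the 32x32
# intra smoothing filter.
-x265-params "constrained-intra=1:no-strong-intra-smoothing=1"
```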
But here's the catch. I've done at least 30 encodes of those HD resolution films WITH those settings and NONE of them exhibit the blocking effect that's going on here. None of them exhibit any kind of problem that I can see, visually they're comparable to any other encode. Like I said, I had simply read those options theoretically would be beneficial in my case, I saw no problem so I kept on using them. This then floated over to all other encodes eventually, as I was largely on the same command console and just copy-pasted different filenames and did minor tweaks like mapping some additional stream etc. These cases where I've quartered the resolution are no exception in other ways, as they also are transcodes etc.
Could, and if so WHY, would those two options cause what is in some instances an almost unbelievable amount of quality deterioration... but ONLY in the encodes where the video was resized before encoding? I'm about to test without those options... but unfortunately for some of these files it doesn't matter, as the original is now lost. The encodes still appear to be somewhat watchable, but markedly poorer in quality than the previous ones. Same bitrate, same QP range, average QP, all that... just with much more blocking.
I have no clue as to why either of those options would cause this. If it indeed was them, how? They're not supposed to have anything to do with resolution!
Miles of text and yet no screenshots of problem in question, no mention of settings and bitrate used, no samples of source and problematic video. Don't mind me, carry on with your blog post.
I tried encoding one clip again without constrained-intra but otherwise the same settings. The result still has the horrible blocking. So that's not it either.
P. S. This forum is called "Newbie / General discussions". What can you expect?
A sample of your source, that sample re-encoded showing the blocky artifacts, the exact encoding software and settings you used.
Try somewhere around preset slow, tune grain, crf 18; compare with/without deblocking. Still "horrible" blocking?
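For example (filenames are placeholders; a sketch, not a tuned command):

```shell
# Same settings twice, toggling only the deblocking filter, to isolate it.
ffmpeg -i source.mkv -c:v libx265 -preset slow -tune grain -crf 18 \
    -x265-params "deblock=0,0" with_deblock.mkv
ffmpeg -i source.mkv -c:v libx265 -preset slow -tune grain -crf 18 \
    -x265-params "no-deblock=1" without_deblock.mkv
```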
Last edited by jagabo; 19th Jan 2021 at 14:40.
I realise a couple of screenshots would obviously help in seeing what kind of artifacts they are. What I meant to say to badyu17 was that there was no need for a passive-aggressive jab like that; if you want me to give you screenshots or some more details... just ask.
I'm not sure using crf 18 with the slow preset would, in this case, actually help in finding why the blocking happened... because crf 18 would probably just cause the bitrate to skyrocket, and with that many more bits... probably simply more quality.
The latest encode I did was without no-strong-intra-smoothing, and the blocking doesn't seem to be there.
(I also built ffmpeg again between those encodes, so I'm not on 3.3 anymore)
x265 [info]: HEVC encoder version 3.4+30-23583c4cb
-pix_fmt yuv420p10 -max_muxing_queue_size 9999 -vf "phase, scale=960x540:flags=none" -c:v libx265 -preset medium -x265-params "min-keyint=22:keyint=230:ctu=32:merange=24:aq-mode=3:deblock=-2,-1:pools=8" -b:v 700k -bufsize 3600k -maxrate 900k
I found the tip to not completely eliminate deblock but to reduce its strength on some other forum. I'm not completely sure yet whether that's the best setting in this scenario; I'll have to actually watch the whole thing. If you're wondering about the bitrate, I estimated it based on the info that the original source must have been on a Betacam or similar tape, so the actual "resolution" is something like less than 720x480 (and was it around 300-by-something for chroma?). With x265 ABR around 600-700, I'm getting average QPs of around 25, which has somewhat become my goal for "very watchable" in objective terms. But in this case, objective and subjective did not agree...
Anyway, you use the medium preset, average bitrate as rate control, and even constrain the rate control with those low VBV settings? What do you expect? Try using CRF without the -bufsize and -maxrate options.
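Something like this, i.e. let CRF pick the bitrate (filenames are placeholders and the CRF value is just an example):

```shell
# Constant-quality encode: no -b:v, no -bufsize, no -maxrate.
ffmpeg -i input.mkv -c:v libx265 -preset medium -crf 22 output.mkv
```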
... There's no reason to be upset. Not because of what I said here anyway. Seriously.
However, there IS a reason I use ABR here; that's a conscious decision I've had to make. Concerning "what I expect" in terms of quality (?), something fair at the least. The quality should be pretty good with the later and current versions of x265. Is there a particular reason you think 600-900 kbps would not be enough for video that's essentially less than 720x480 in resolution? It's the equivalent of using up to 3600 kbps for 1440x1080 HD content, and that to me is very close to, if not completely, transparent.
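The equivalence I mean is just scaling the bitrate linearly by pixel count (my own back-of-envelope reasoning, nothing more):

```shell
# 1440x1080 has (1440*1080)/(720*480) = 4.5x the pixels of 720x480,
# so 800 kbps at ~720x480 corresponds to 800 * 4.5 = 3600 kbps.
echo $(( 1440 * 1080 * 800 / (720 * 480) ))   # prints 3600
```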
Besides, there was nothing to show the blocking has anything specifically to do with the bitrate (there was only an assumption... MY assumption, so you might not trust it). I should maybe repeat from my previous post: "The latest encode I did was without no-strong-intra-smoothing, and the blocking doesn't seem to be there." And also, all the previous encodes without that particular setting likewise do not show similar blocking. There might have been "miles of text", absolutely, but I thought what I actually said was at least clear.
The reason I don't use CRF is a whole separate problem; I could explain why in detail, but it should probably be in its own thread. Put VERY shortly: it's unreliable.
Last edited by non-vol; 24th Jan 2021 at 11:00.