I reassembled everything and re-encoded; the results were the same. The file sizes differ, but the picture is the same. By the way, that is another strange thing: the video is the same, yet the file size can differ by a few bytes after each encode.
-
This is not supposed to happen. The encoder is deterministic - same results every time; the output should be bit-for-bit identical.
You have to enable non-determinism on purpose for results to vary. The switch is "off" by default:
Code:
--non-deterministic     Slightly improve quality of SMP, at the cost of repeatability
On the script side, there are some filters that add "randomness" on purpose - some types of noise generation and dithering filters, for example. They can take a "random" seed, so encoding the same script may produce very slightly different results.
Side note: the container might not be deterministic (IIRC, at least mkvtoolnix adds a random number).
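If you want to rule out the container, a quick check is to encode the same script twice to a raw elementary stream (no container) and hash the outputs - a minimal sketch, assuming an x264 CLI build with AviSynth input support and md5sum on the path; file names are placeholders:
Code:
# encode the identical script twice with identical settings, raw .264 output
x264 --qp 26 -o run1.264 input.avs
x264 --qp 26 -o run2.264 input.avs
# matching hashes = deterministic encoder; differing hashes = something else is interfering
md5sum run1.264 run2.264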
-
I was already asked on the first page about overclocking and overheating of the computer. No, nowhere near overheating, and the processor is not overclocked. Perhaps something is wrong with the motherboard; it is no longer new. Could the motherboard be at fault? The container and codec are the standard ones from the FFmpeg 6.0 package. Clarification: in the video file properties the file size differs, but the size on disk is the same.
System parameters:
Intel Xeon E3-1230 V2 CPU
Gigabyte H77-DS3H motherboard
8 GB DDR3 RAM
Windows 10 Home 22H2 -
The container and codec are standard from the FFMPEG 6.0 package. Clarification: in the video file properties, the file size is different, but the disk space is the same.
How much of a difference are we talking about? What container is used?
Like I wrote before: depending on the settings, some containers will have some random numbers in them, so even just remuxing a file will not give the exact same output.
(the extracted streams would still be identical)
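To check that in practice - a hedged sketch, assuming ffmpeg is available and the two outputs are named out1.mp4 and out2.mp4 - strip the container and compare only the video elementary streams:
Code:
# copy the video stream out of each container without re-encoding
ffmpeg -i out1.mp4 -an -c:v copy out1.h264
ffmpeg -i out2.mp4 -an -c:v copy out2.h264
# matching hashes mean only the container differs (e.g. a random header value)
md5sum out1.h264 out2.h264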
Since this whole thread does not contain full details about the calls, tools, input and output used, I wish everyone much fun playing the 'wild guessing' game.
Cu Selur
Ps.: if you use GPU filters, also make sure the GPU isn't overclocked. I once spent a few weeks searching with a Hybrid user for the cause of his random artifacts, until we realized his GPU was factory overclocked; after downclocking to the manufacturer's stock values, the issues disappeared.
-
Yes, this is true. I remembered that I have a video card that is overclocked at the factory - the so-called Black Edition. Full name:
XFX AMD RX570 4 GB Black Edition.
KNLMeansCL uses the GPU.
I have MSI Afterburner installed, which can reduce the frequency. I need to understand which parameters to reduce and what value to set.
-
I'm using an MP4 container.
The codec is libx264. I tried the x264 mod by Patman in the Staxrip application; the results were the same. I can also use libx265. -
-
Try whether lowering the core clock speed from 1286 to 1150 MHz changes anything.
-
There is no change. I'm not even sure there is a reduction in encoding speed. But the frequency is exactly 1150 MHz; this can be seen in GPU-Z.
-
Then your problem might not be due to an overclocked GPU.
-
Another attempt to show that there is no relationship between sharpening and the presence of obvious stripes.
Original. Pay attention to the chin, neck, wrist.
chin - defects are barely noticeable
neck - no defects
wrist - defects are barely noticeable
Settings: FineSharp(mode=1, sstr=2.0, cstr=0.2, xstr=0, lstr=1.8, pstr=7, ldmp=0.1). I will increase "sstr" from 2.0 to 2.5.
Defects per sstr value (chin / neck / wrist):
sstr=2.0 - barely noticeable / none / none
sstr=2.1 - none / none / barely noticeable
sstr=2.2 - visible / visible / visible
sstr=2.3 - visible / none / barely noticeable
sstr=2.4 - barely noticeable / none / barely noticeable
sstr=2.5 - none / none / visible
-
You mean after QP 26 encoding. There is some expected "randomness" as a result of low-bitrate lossy encoding; this was discussed earlier - see post #2 if you've forgotten. Check other frames within the same encode, and between the sstr encodes, and you will see deviations as well. The higher the bitrate, the higher the quality of the encoding settings, and the closer the IPB ratios, the more similar the result is to the source and the more predictable it is. I again get slightly different results than you at QP 26, but with the amount of "randomness" and inconsistency you would expect at QP 26. E.g. QP 10 is less "random"; QP 0 is not "random" at all - all the stripes and defects remain.
Increasing sstr increases the severity of the sharpening artifacts that you send to the encoder - and that relationship is very predictable. Try QP 0; it's 100% definitive proof that increasing sstr from 0 to 1, to 2, to 2.2, to 2.5, to 3, etc. increases the artifacts. QP 0 eliminates all the "randomness" of low-bitrate lossy encoding: you get exactly what you send.
This is the script output with increasing sstr, cropped to the region of interest with nearest-neighbor enlargement x4. As you can see, you're sending significant oversharpening artifacts ("stripes") and "hoping" that the encoder deals with them appropriately. It's just common sense - stronger sharpening leads to stronger oversharpening artifacts. Not a conventional compression strategy... Most people wouldn't send oversharpened "garbage" to the encoder in the first place - i.e. fix your script. See post #58 again.
What you're calling "defects" or "stripes" are actually "details" to the encoder. When there are no "stripes" in the lossy result, that is actually a compression artifact, because the output differs from the input: there is not enough bitrate to preserve all the stripes. Why send "stripes" and garbage in the first place? Just don't sharpen so much.
You have an additional problem: since you get "random" results each time in the first place - i.e. non-deterministic output (different filesizes with the same script settings and the same encode settings) - how do you know it's the sstr, and not the "randomness" of your computer, that produced the results you've been getting? Another way to put it: encode the same script with the same encode settings 10 times. If you get different results every time, you cannot draw any solid conclusions, because some other factor(s) are contaminating your results.
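A minimal sketch of that repeatability test, assuming an x264 CLI build with AviSynth input support; the script name and QP are placeholders - substitute your own:
Code:
# encode the identical script with identical settings 10 times
for i in 1 2 3 4 5 6 7 8 9 10; do
    x264 --qp 26 -o test_$i.264 yourscript.avs
done
# all 10 hashes should match; if they don't, some other factor is contaminating the results
md5sum test_*.264
-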
Another way to put it: encode the same script with the same encode settings 10 times
I, of course, used the preview in the AvsPmod and Staxrip applications. The results after encoding are not consistent with the preview.
As you can see, you're sending significant oversharpening artifacts ("stripes") and "hoping" that the encoder deals with them appropriately. It's just common sense - stronger sharpening leads to stronger oversharpening artifacts.
-
Not after lossy encoding - that's what "lossy" implies: quality loss. If the results don't look 100% identical to the preview after QP 0 encoding, then something is wrong. The lower the bitrate and the lower the quality of the encoding settings, the more deviation from the preview.
Instead of preserving the stripes (because that's what your script is sending), lossy encoding only preserves some of the "stripes", on some frames, on some parts of frames. Removing the "stripes" is, in your case, actually an encoding artifact.
I expect KNLMeansCL to remove these stripes.
Code:
LSmashVideoSource("HHH.mp4")
Spline36ResizeMT(480, 270)
KNLMeansCL(D=1, A=1, h=2, device_type="auto")
FineSharp(mode=1, sstr=x, cstr=0.2, xstr=0, lstr=1.8, pstr=7, ldmp=0.1)
-
Removing the "stripes" in your case is actually an encoding artifact
At sstr=2.2, stripes appeared on the neck. They are not in the original, and they are not present at other sstr values. Why was this artifact kept? And why did all the other videos show artifact blur instead?
"sstr" values of 2.0-2.5 are not oversharpening - that is a middle range. Moreover, a denoiser is used before the sharpener.
-
Use a higher bitrate and the stripes will appear all over - similar to the APNG.
At sstr=2.2, stripes appeared on the neck. They are not in the original, and they are not present at other sstr values. Why was this artifact kept? And why did all the other videos show artifact blur instead?
But if you check other frames, there are other "stripes" in other parts of other frames. The "2.2" does not matter - they occur at 2, 2.2, 2.5, etc. If not in the current frame, then in other frames; that's part of the "randomness". The bitrate used is not sufficient, so the encoder cannot preserve all the "stripes" (to the encoder, those sharp edges are "details", not artifacts). Only some are preserved; others are blurred. If you used adequate bitrate, there would be stripes everywhere - similar to the input source (the script). Encode it at QP 20, 15, 10, 5, etc. and you can see the progression of stripes increasing, because the encoder preserves more detail.
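For example - a sketch, assuming the x264 CLI reads your script directly; the QP ladder is just the progression suggested above:
Code:
# sweep the quantizer down and watch the "stripes" get preserved more completely
for q in 26 20 15 10 5 0; do
    x264 --qp $q -o sweep_qp$q.264 yourscript.avs
done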
It's analogous to encoding a grainy source (forget sharpening for a second). At low bitrates, the encoder cannot preserve all the grain, so you get a splotchy, almost "random", sparse grain pattern - parts look blurred, but parts look like grain. The higher the bitrate you use, the more the grain returns to looking like the source: more even, instead of splotchy or random.
"sstr" values 2.0 - 2.5 are not oversharp. This is the average. Moreover, before this, a denoiser is used.
That's just the chin in that single frame; almost every other frame has other types of oversharpening artifacts as well, in direct correlation with sstr. I can highlight some problems if you like - you can see the problems definitely worsen at sstr 2. -
So can we conclude that randomness still happens in the encoder's work? Is there any way to avoid it? I don't think that 800 kbps is too little for a 480x270 video.
Can weightp, weightb and mixed references affect this? I don't really understand them yet.
-
It's not really "random" per se - the encoder has fixed logic - but it may appear "random" in a generic sense. This was discussed earlier; if you've forgotten, look back a few pages. It's the nature of lossy encoding.
=> Remember, you encoded at QP 0 and it was completely consistent, but it didn't "look as good". It doesn't "look as good" because what you're sending is not good in the first place - it's full of "stripes" and other artifacts from oversharpening.
See post #22 about QP 0: "But there is no unpredictability, the changes have become logical, yes." The higher the bitrate and the better the encoding settings, the more predictable the result. At QP 0 it is 100% predictable - you get the same as the input, no variation at all: input = output. QP 1 would yield more than 99.99% predictable results, more predictable than, say, QP 10. But you would expect significantly more quality fluctuation and unpredictability at QP 26 - that is the nature of lossy encoding.
Is there any way to avoid it? I don't think that 800 kbps is too little for a 480x270 video.
"enough bitrate" is a relative statement and depends on situation, source, context.
The lower the bitrate: the more inconsistencies, the more fluctuations, and more quality issues
Can weightp, weightb and mixed references affect this? I don't really understand them yet.
There are other types of artifacts that are enhanced by the script and sent to the encoder, such as typical ringing and edge artifacts.
There are several other types, and examples I can post too - they are all enhanced by sstr.
The point is you are sending oversharpening artifacts to the encoder. What do you expect it to do? Again, sharpening is counterproductive for low-bitrate encoding - it requires more bitrate (larger filesizes) and is prone to generating artifacts. If you didn't sharpen so much, the "stripes" and other artifacts would not look as bad.
An encoder's primary job is to attempt to reproduce the input, not to remove "stripes" or other artifacts. When you send "stripes" (or any "artifact") as the input, it tries to reproduce them. At low bitrates it succeeds in some areas, on some frames (you see stripes), and fails on others (blurred). In this case blurred might be preferred, but it's a deviation from the input - so "blurred" is actually an artifact for your script. Encoders are currently not "smart" enough to tell what is a "detail" and what is an "artifact"; the encoder thinks the "stripes" you send are "details". At QP 0, it reproduces 100% of the artifacts and details you send. This is a big reason why people denoise/degrain for low-bitrate encoding - to make it "easier" for the encoder. But when you sharpen, you make it more difficult for the encoder - sharpening is counterproductive if your goal was to reduce filesize.
If you send a red frame and get a purple frame as the result, that purple frame is an "artifact". Even if you "like" the purple frame better, it's a deviation from the input red frame; a perfect result would be the same red frame - PSNR infinity. You could argue that if you wanted a purple frame, why not send a purple frame in the first place? It's the same thing here: if you didn't want "stripes", why are you sending "stripes" and other artifacts to the encoder??? The encoder is going to try to reproduce the stripes (and it does, successfully, on some parts of some frames). The insufficient bitrate results in the "randomness" of the reproduction. If you test higher bitrates as I suggested - QP 10, 5, 1, etc. - the result gets closer to the input and much more predictable: many more "stripes" and artifacts are reproduced, because that's what you sent to the encoder.
Not sure, because you had non-deterministic results before and could not track down the problem. If you encode 10 times and get the same results 10 times, then there might be a real difference with the faster preset. But maybe you encode twice on medium and the 2nd result is already different (it shouldn't be, but you have some other issue on your system).
You can also track down the x264 core version and revision and look at the changelog; there might have been material changes that resulted in preset differences between versions. -
if you didn't want "stripes", then why are you sending "stripes" and other artifacts to the encoder ???
This is the main point of my problem. -
There is another problem that I mentioned in passing earlier. I can't find any examples yet. In short, the problem looks like this.
original video
frame 101 - worst image
frame 102 - improvement
frame 103 - even greater improvement
frame 104 - good
one way of configuring the filters
frame 101 - worst
frame 102 - worst
frame 103 - worst
frame 104 - worst
a second way
frame 101 - worst image
frame 102 - improvement
frame 103 - even greater improvement
frame 104 - good
everything is almost the same as in the original.
This probably depends on the arrangement of the I/B/P frame types. How do I control this? I have only found a dependency on "keyint" and "keyint_min"; increasing them improves the image. I use keyint=10/keyint_min=1, or 20/1, or 30/1, or 30/5. The problems (all the ones I talked about in this thread) are most visible at 30/5. -
This was already answered
Your main problem with consistency is inadequate bitrate.
Oversharpening requires even more bitrate; otherwise it results in even more inconsistencies and other problems.
See the previous post; it explains why you are getting those results. Did you try it? QP 0 = 100% consistent from the encoder; QP 26 = inconsistent; in between is in-between consistent.
And your other problem is your script and source. There are unfiltered inconsistencies sent to the encoder as well - for example, the cheek keyframe pop I mentioned earlier, things like that. There is frame quality fluctuation in the source from its low bitrate.
This was already addressed earlier
Everything relates to inadequate bitrate - in your source and/or your encode.
You're not going to be able to fix this at lower bitrate ranges. This is the nature of temporal lossy encoding.
So when the source already has fluctuations, you have to filter it if you want more consistent quality (the problem was a low bitrate for your source in the first place).
But even if you fix it with proper filters (pretend you have a perfect source now, or another perfect source), if you encode at low bitrates you will generate new consistency problems - just like in the source you have now. Do you see the problem? They all come from inadequate bitrate.
Some things you can experiment with, but none will help very much, unless you have adequate bitrate:
For example, you can use all I-frames - no P or B. This should give more evenness to the quality, but the compression ratio will be poor; you'd probably need 2-3x the bitrate for a similar level of quality.
It was mentioned earlier: you can override the "auto" control using a --qpfile to manually specify parameters per frame for full control - not only the frame type, but the quantizer too. Another method of control is --zones: you can specify different settings for ranges of frames, or a bitrate multiplier for example. This gives you more manual control if you don't like what the encoder is doing (see the sketch after this list).
A larger keyint means a longer GOP before a new keyframe is "forced" to start a new GOP. At low bitrates, a longer GOP can help efficiency (i.e. higher quality at a given bitrate), because you have more frequent, lower-cost P- and B-frames instead of "expensive" I-frames. The default keyint is 250. For low-bitrate encoding, you generally want lots of B-frames and longer keyframe intervals, because it helps with efficiency. There are diminishing returns for keyint length.
For higher-bitrate encoding, you generally want a shorter keyframe interval. E.g. film BDs use 24 (a 1-second interval), and they have adequate bitrate.
As mentioned earlier, you can experiment with adjusting the IP and PB ratios for a more even distribution of bitrate between frame types (in general, you want them closer to 1 at higher bitrates, because you can afford it; for example, many BDs are done with 1.1 and 1.1 instead of 1.4 and 1.3 - but you generally can't afford that at low bitrates, where you want lower-cost B-frames), or adjust the maximum number of consecutive B-frames (0 to 16; in general, you want more at lower bitrates).
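A hedged sketch of those overrides combined, for the x264 CLI - the frame numbers, QP values and ratios below are made up purely for illustration:
Code:
# pins.txt - one "<frame> <type> [<qp>]" line per frame you want to force, e.g.:
#   0 I 18
#   30 P 22
x264 --qpfile pins.txt \
     --zones 100,199,q=18/200,299,b=1.5 \
     --ipratio 1.1 --pbratio 1.1 \
     --bframes 8 --keyint 250 --min-keyint 23 \
     -o out.264 yourscript.avs
# the all-I-frame variant mentioned above would be --keyint 1 (even quality, poor compression)
-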
As a side note: sometimes it helps to look at a source using Retinex to see where the 'hidden' artifacts/details are that might 'pop up' through filtering (or should be dealt with through filtering).
-
Oversharpening requires even more bitrate; otherwise it results in even more inconsistencies and other problems
I would accept a moderate presence of these artifacts. The main thing is to understand what the encoder will give me at the output. Right now I slightly change one setting in the filter, and the output is very different, and not at all logical. I increase the sharpness, but the stripes, on the contrary, get blurred. Or vice versa: I reduce the sharpness, and the stripes become sharper. I reduce it even more, and the stripes disappear again.
-
For example, the post where I showed pictures of the changes in sharpness settings. If we accept that blur is an artifact, then the tally is:
artifacts - 2
middle artifacts - 2
no artifacts - 2
Out of 6 attempts, only 2 files were encoded without artifacts. If this behavior is normal for x264, then it is not a program, it is junk! -
Yes - it's been mentioned several times already.
It's the same behaviour for all encoders. It's the nature of lossy low bitrate encoding.
Adjusting the script and encoding settings can help a bit, but at the end of the day, bitrate trumps everything. High bitrate = more consistency, for all encoders.
You can use something better, as suggested, like HEVC (x265) or AV1 (AOM-AV1 or SVT-AV1). They are definitely better - especially at lower bitrates - but not "magical". If your script is not good, and/or you use low bitrates, you will still get problems.
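For example, via ffmpeg - a sketch, assuming a build with AviSynth, libx265 and SVT-AV1 support; the CRF values are just illustrative starting points:
Code:
# HEVC
ffmpeg -i yourscript.avs -c:v libx265 -crf 24 out_hevc.mp4
# AV1 via SVT-AV1
ffmpeg -i yourscript.avs -c:v libsvtav1 -crf 35 out_av1.mkv
-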
After editing the posts, the messages are on the next page...