You are changing my test. Again, when I get the two files from my brother, I KNOW that one is the original and one has been degraded in quality - but I don't know which is which. If I look at the ffmpeg console output of both files after the conversion, I see that one has 3fps and the other has 23fps. I didn't know before the conversion that A was 23fps and that B was going to be 3fps or vice versa. All I know is that my brother changed ONE thing. I look at the two console outputs and the only difference I see is 23fps vs 3fps. Can't I say that the 3fps file has lesser quality?
In all I mentioned I only would change the frame rate or bit rate and keep all other settings the same.
Why do you think that doing your "a" and "b" is doing ONLY ONE THING?!?!?! Again, look at my two commands where I use "-b:v" and "-r"
-------------------------
Can you agree to these statements:
1. Reencoding a video (with lossy compression, as ffmpeg uses by default) and increasing the bitrate will still lower the viewing quality?
2. Reencoding a video and changing the frame rate will lower viewing quality?
Can't I say that the 3fps file has lesser quality?
Your #1 seems to be doing TWO things: compressing AND changing the bitrate. But, show me how you would accomplish that in ONE ffmpeg statement using ONLY ONE switch other than "-i".
Your #2 seems to be doing TWO things: reencoding AND reducing the frame rate. But, show me how you would accomplish that in ONE ffmpeg statement using ONLY ONE switch other than "-i".
OK, one more try asking a different way. Consider this statement:
Code:
ffmpeg -i A.mp4 -b:v 20k Z.mp4
Knowing NOTHING else beyond that ONE ffmpeg statement, would you say that:
A) A.mp4 is better quality
B) Z.mp4 is better quality
C) A.mp4 and Z.mp4 are the same quality
D) there is nothing in the statement that can tell you any difference in quality in A.mp4 and Z.mp4 (and tell me why)
If you were to know NOTHING else beyond that ONE ffmpeg statement where ONE switch was used (but you don't know which was input and which was output) and you were to ONLY view the ffmpeg console output from A.mp4 and Z.mp4, would you find sufficient information in the console outputs to say that:
A) A.mp4 is better quality
B) Z.mp4 is better quality
C) A.mp4 and Z.mp4 are the same quality
D) there is nothing in ONLY the console outputs that can tell you any difference in quality in A.mp4 and Z.mp4 (and tell me why)
Next, consider this statement:
Code:
ffmpeg -i A.mp4 -r 3 Z.mp4
Knowing NOTHING else beyond that ONE ffmpeg statement, would you say that:
A) A.mp4 is better quality
B) Z.mp4 is better quality
C) A.mp4 and Z.mp4 are the same quality
D) there is nothing in the statement that can tell you any difference in quality in A.mp4 and Z.mp4 (and tell me why)
If you were to know NOTHING else beyond that ONE ffmpeg statement where ONLY ONE switch was used beyond "-i" (but you don't know which was input and which was output) and you were to ONLY view the ffmpeg console output from A.mp4 and Z.mp4, would you find sufficient information in the console outputs to say that:
A) A.mp4 is better quality
B) Z.mp4 is better quality
C) A.mp4 and Z.mp4 are the same quality
D) there is nothing in ONLY the console outputs that can tell you any difference in quality in A.mp4 and Z.mp4 (and tell me why)
Last edited by the_steve_randolph; 5th Jun 2021 at 03:09.
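For what it's worth, the only thing the single "-b:v 20k" switch above pins down is a target bitrate, and bitrate times duration gives a rough stream size. A minimal sketch of that arithmetic, assuming a made-up 600-second clip (ffmpeg is not invoked, and audio/container overhead is ignored):

```shell
# Back-of-envelope only: estimate the video stream size implied by "-b:v 20k".
# The 600-second duration is a hypothetical example value.
bitrate_kbps=20
duration_s=600
# kilobits/s * seconds / 8 = kilobytes
size_kb=$(( bitrate_kbps * duration_s / 8 ))
echo "${size_kb} kB"
```

So a 10-minute clip at 20 kb/s would be roughly 1500 kB of video data - which is why such a low target bitrate is a strong hint about quality, even before viewing either file.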
Your #1 seems to be doing TWO things: compressing AND changing the bitrate. But, show me how you would accomplish that in ONE ffmpeg statement using ONLY ONE switch other than "-i".
Your #2 seems to be doing TWO things: reencoding AND reducing the frame rate. But, show me how you would accomplish that in ONE ffmpeg statement using ONLY ONE switch other than "-i".
Code:
ffmpeg -i A.mp4 -r 3 Z.mp4
Do you consider that statement to be changing: A) one thing B) more than one thing C) nothing
a. recompress with the default encoder (currently x264 for mp4 container in ffmpeg iirc)
b. change the bit rate during that recompression to 20k
Knowing NOTHING else beyond that ONE ffmpeg statement, would you say that:
A) A.mp4 is better quality
B) Z.mp4 is better quality
C) A.mp4 and Z.mp4 are the same quality
D) there is nothing in the console outputs that can tell you any difference in quality in A.mp4 and Z.mp4 (and tell me why)
If you were to know NOTHING else beyond that ONE ffmpeg statement where ONE switch was used (but you don't know which was input and which was output) and you were to ONLY view the ffmpeg console output from A.mp4 and Z.mp4, would you find sufficient information in the console outputs to say that:
A) A.mp4 is better quality
B) Z.mp4 is better quality
C) A.mp4 and Z.mp4 are the same quality
D) there is nothing in the console outputs that can tell you any difference in quality in A.mp4 and Z.mp4 (and tell me why)
Cu Selur
If you are looking at the console output and the ONLY difference you see is the fps, what else could have been changed when the "-r" is the ONLY switch used?
When you do this statement:
Code:
ffmpeg -i A.mp4 -b:v 20k -c:v copy Z.mp4
When you do this statement:
Code:
ffmpeg -i A.mp4 -r 3 -c:v copy Z.mp4
You are right with the '-b:v 20k -c copy', sorry my mistake. (b:v is a video stream parameter)
You are wrong with '-r 3 -c copy': this works fine, since '-r 3' is a container parameter.
So, is my statement changing ONLY ONE THING and you can tell the difference in the console outputs?
You are wrong with '-r 3 -c copy': this works fine, since '-r 3' is a container parameter.
So, is my statement changing ONLY ONE THING and you can tell the difference in the console outputs?
You are right, I confused '-r' with '-framerate', so you get frames added or removed.
Still, without knowing which is the original, you have two options (assuming A has 23fps and B has 3fps), either:
1. A was the source and B was produced by reencoding and dropping frames, so B would provide a lower viewing quality.
or
2. B was the source and A was produced by reencoding and adding duplicate frames, so A would provide the lower viewing quality.
Since you can't decide whether the case is 1. or 2., you can't tell which file offers a better viewing quality.
Once again, I AM NOT TRYING TO DETERMINE WHICH WAS THE ORIGINAL!!!!!!!!!!!!!
I simply want to know if I can look at the ffmpeg console output and, if I ONLY see a difference in video bitrate and I assume that nothing else has EVER been changed in the two files, is "video bitrate" a measurement of video quality where I can state that the video with the higher bitrate is better quality?
Concerning your statement #2, why are you saying that "A would provide lower viewing quality"? As stated above, B (converted from A at 23fps) at 3fps is jumpy. If B at 3fps is the "original" and it was converted to A by changing the fps to 23, then, my understanding is that extra frames are created, but that they are duplicates of one of the frames next to them. That is, in one second, B frames 1 & 2 & 3 would be converted in A to 1 & 1 & 1 & 1 & 1 & 2 & 2 & 2 & 2 & 2 & 2 & 3 & 3 & 3 etc. until there are 23 frames in that one second. In that case, I would think that those 23 frames in that one second of A would really only be 3 DIFFERENT frames and would have the same viewing quality as the original 3 frames in B. But, that is not what I am asking/talking about
Now, if you were to have video X that was 23fps and it was converted to video B with 3fps and then B at 3fps was converted to A at 23fps, then, yes, A would look worse than X. But, that is NOT what I have been asking/talking about in this thread.
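The "1 & 1 & 1 ... 2 & 2 ..." duplication described above can be sketched as a simple index mapping. This assumes plain nearest-frame duplication; the exact pattern ffmpeg picks for -r upsampling may distribute the duplicates slightly differently, so treat it only as an illustration:

```shell
# Which of B's 3 source frames each of A's 23 output frames would repeat,
# assuming simple duplicate-frame upsampling from 3fps to 23fps.
pattern=""
i=0
while [ "$i" -lt 23 ]; do
  src=$(( i * 3 / 23 + 1 ))   # map output frame i (0..22) to source frame 1..3
  pattern="${pattern}${src} "
  i=$(( i + 1 ))
done
echo "$pattern"
```

Under this mapping you get eight copies of frame 1, eight of frame 2, and seven of frame 3 - 23 frames per second, but still only 3 different pictures, which matches the point being argued above.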
Once again, I AM NOT TRYING TO DETERMINE WHICH WAS THE ORIGINAL!!!!!!!!!!!!!
I ONLY see a difference in video bitrate and I assume that nothing else has EVER been changed in the two files, is "video bitrate" a measurement of video quality where I can state that the video with the higher bitrate is better quality?
I know you will complain about me getting too complicated for you, but try to bear with it:
If you have an original O.
And do a reencode A with a bitrate of 20k and a reencode B with a bitrate of 2000k
and you know that:
a. both are reencode of the same source
b. both use the same reencoding configuration aside from the bitrate
then you can take the bitrate as a quality measurement.
But you need both of these (a., b.) to be true.
If you are not sure of a. then one could have first converted O to another file with a bitrate of 10k and then to B using a bitrate of 2000k. In this case A with a bitrate of 20k would have better quality than B with a bitrate of 2000k.
If you are not sure of b. the problem is that one of the encoding parameters might negate the bitrate difference.
-> were you able to understand that?
Cu Selur
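Condition a. above can be sketched with commands that are only built and echoed, never executed: a hidden intermediate encode is what breaks the "higher bitrate = better quality" inference. All file names (O.mp4, temp.mp4, A.mp4, B.mp4) are hypothetical:

```shell
# Hypothetical history for B: a low-bitrate intermediate, then a high-bitrate
# reencode. The commands are only printed here for illustration, not run.
step1="ffmpeg -i O.mp4 -b:v 10k temp.mp4"     # detail is destroyed at 10k...
step2="ffmpeg -i temp.mp4 -b:v 2000k B.mp4"   # ...B's 2000k cannot bring it back
direct="ffmpeg -i O.mp4 -b:v 20k A.mp4"       # A: a single direct 20k encode of O
printf '%s\n%s\n%s\n' "$step1" "$step2" "$direct"
```

If B had that history, its console output would still report ~2000k, yet A at 20k could look better - which is exactly why the bitrate comparison only works when both files are one-step encodes of the same source.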
Now, if you were to have video X that was 23fps and it was converted to video B with 3fps and then B at 3fps was converted to A at 23fps, then, yes, A would look worse than X. But, that is NOT what I have been asking/talking about in this thread.
Because you are continuing to misunderstand what I am asking!
I ONLY see a difference in video bitrate and I assume that nothing else has EVER been changed in the two files, is "video bitrate" a measurement of video quality where I can state that the video with the higher bitrate is better quality?
I know you will complain about me getting too complicated for you, but try to bear with it:
If you have an original O.
THERE IS NO "O"!
In my examples throughout this entire thread, A is converted to B. There is no "original", there is no "O". I don't care where A came from. I don't care if A is the original or if some other "O" is "the original". I don't want to think about, know about, or ask anything here about any file other than A and B. I START with A. I convert it to B by ONLY reducing the video bitrate. How do A and B compare?
And do a reencode A with a bitrate of 20k and a reencode B with a bitrate of 2000k
and you know that:
a. both are reencode of the same source
b. both use the same reencoding configuration aside from the bitrate
then you can take the bitrate as a quality measurement.
But you need both of these (a., b.) to be true.
If you are not sure of a. then one could have first converted O to another file with a bitrate of 10k and then to B using a bitrate of 2000k. In this case A with a bitrate of 20k would have better quality than B with a bitrate of 2000k.
If you are not sure of b. the problem is that one of the encoding parameters might negate the bitrate difference.
-> were you able to understand that?
Cu Selur
In my examples throughout this entire thread, A is converted to B. There is no "original", there is no "O". I don't care where A came from.
I am ONLY asking about converting A to B!
OK, fine, that is understandable. But, beyond A and B, I don't care about any other files.
I am ONLY asking about converting A to B!
Sure, if the conversion was to INCREASE the fps from 3 to 23 (your example), the playback viewing quality would not be different. But, that increase would cause the converted file to have a larger O/S size. Why would somebody want to do that? My guess is that people are reencoding a video to decrease the size of the file (for a multitude of reasons), not increase the size without increasing quality (which is your example). Since doing this does not make any sense, why can't we assume that it doesn't happen for my supposedly simple/basic question here?
I am not looking for all of the possible combinations or individual things that some idiot could do to a video. As stated in the very first sentence of my OP, I am new to this and trying to learn about what I would consider to be very basic information that I can start with when I want to compare two files. That is, from what I have learned from this thread and subsequent research, if the ONLY difference between two files after a conversion between those two files is bitrate, then, I should (in the vast majority of "reasonable" cases) be able to tell from ffmpeg console output that the file with the higher bitrate is better quality. And, fps is another basic quality measurement. If we can agree on those two measurements given the limited scenario that I am presenting in this thread, are there any other basic quality measurements that can be isolated in a conversion and where I am able to see that measurement difference in ffmpeg console output?
In general, yes, where you don't know anything about the origin of the two files. But, again, that is not what I am asking about. I am trying to implement "the scientific method" or WTH is it called where you compare two things where you KNOW that they did not start as two different things and there is only one thing that is different about the two things. For this test/question/thread, the assumption is that A was converted to B. Nothing else.
**** EDIT: OK, let me try that again. I am trying to create a scenario where everything not stated in the test/experiment is ignored and/or assumed to be something reasonable. The universe for this scenario is that there are these three facts: 1) there are two video files, 2) one was converted from the other with a single ffmpeg command using a single switch (other than "-i"), and 3) this conversion was done to reduce the quality. The test subject (me) does not know anything about the files except for those three facts only. What are the possible quality measurement(s) visible in ffmpeg console output that I can use to determine which of the two files has the better quality?
Last edited by the_steve_randolph; 5th Jun 2021 at 05:03. Reason: clarifications
If my brother converts A to B by REDUCING the bitrate or the fps, changes the names of the files to X and Y without telling me which is the "original", I can look at the console output and see that one of X or Y has a higher bitrate or fps. That is a fact.
One will have 23 fps and the other will have 3fps.
And, knowing that INCREASING the bitrate or fps does not actually increase video quality (and discarding that possibility in my examples/questions), and knowing that bitrate or fps is the ONLY difference I see in the console outputs, why can't it be stated as fact that, of X and Y, the file with the higher bitrate or fps has better quality with everything else being equal?
By adding/dropping frames.
By the needed recompression.
Sure, if the conversion was to INCREASE the fps from 3 to 23 (your example), the playback viewing quality would not be different.
But, that increase would cause the converted file to have a larger O/S size.
Why would somebody want to do that?
-> do you want to add as additional knowledge that the person who did the conversion only did objectively meaningful things?
My guess is that people are reencoding a video to decrease the size of the file (for a multitude of reasons), not increase the size without increasing quality (which is your example).
Since doing this does not make any sense, why can't we assume that it doesn't happen for my supposedly simple/basic question here?
If you assume that then the only reason to change the fps should be to be compatible with some playback device.
So adding duplicate frames might be needed to achieve a specific fps, since the player wouldn't play back the content otherwise, and any fps different from the supported fps of the player would indicate a low quality version. (not caring about the image quality)
Same with the bitrate: if you assume the content of the video is the same (color, resolution, etc.; both have no artifacts which would require a higher bitrate or lower the quality) and whoever reencoded used the optimal average bitrate, then bitrate would be a proper metric.
I am not looking for all of the possible combinations or individual things that some idiot could do to a video
If we can agree on those two measurements given the limited scenario that I am presenting in this thread, are there any other basic quality measurements that can be isolated in a conversion and where I am able to see that measurement difference in ffmpeg console output?
Then you can say:
a. higher bitrate = better quality
b. higher color sampling = better quality
c. higher frame rate = better quality
d. vfr = higher quality
basically you can always say 'higher value' = 'higher quality'.
Cu Selur
I am trying to implement "the scientific method" or WTH is it called where you compare two things where you KNOW that they did not start as two different things and there is only one thing that is different about the two things. For this test/question/thread, the assumption is that A was converted to B. Nothing else.
I am not overlooking these possibilities. I am trying to tell you that I want to reduce the variables in my questions/examples to drill down to some basic information.
Sure, if the conversion was to INCREASE the fps from 3 to 23 (your example), the playback viewing quality would not be different.
But, that increase would cause the converted file to have a larger O/S size.
Why would somebody want to do that?
-> do you want to add as additional knowledge that the person who did the conversion only did objectively meaningful things?
My guess is that people are reencoding a video to decrease the size of the file (for a multitude of reasons), not increase the size without increasing quality (which is your example).
Since doing this does not make any sense, why can't we assume that it doesn't happen for my supposedly simple/basic question here?
If you assume that then the only reason to change the fps should be to be compatible with some playback device.
So adding duplicate frames might be needed to achieve a specific fps, since the player wouldn't play back the content otherwise, and any fps different from the supported fps of the player would indicate a low quality version. (not caring about the image quality)
Same with the bitrate: if you assume the content of the video is the same (color, resolution, etc.; both have no artifacts which would require a higher bitrate or lower the quality) and whoever reencoded used the optimal average bitrate, then bitrate would be a proper metric.
I am not looking for all of the possible combinations or individual things that some idiot could do to a video
If we can agree on those two measurements given the limited scenario that I am presenting in this thread, are there any other basic quality measurements that can be isolated in a conversion and where I am able to see that measurement difference in ffmpeg console output?
Then you can say:
a. higher bitrate = better quality
b. higher color sampling = better quality
c. higher frame rate = better quality
d. vfr = higher quality
basically you can always say 'higher value' = 'higher quality'.
Cu Selur
GREAT! That's what I was trying to ask for. Now, I will go visit Google and ask things like "ffmpeg change bitrate" and "ffmpeg change 'color sampling'" or "ffmpeg change 'frame rate'" or "ffmpeg change vfr". Prior to my OP I didn't know what terms and/or attributes (i.e. bitrate, fps, vfr, etc.) to use to get anything out of Google. Therefore, I started here with my (destined to be) imperfect OP, thinking that asking video "experts" on "videohelp.com" was a good place to start instead of hoping that Google knew what I was asking (which we know that many times it does not). I actually did start before the OP with Googling things like "ffmpeg video quality" and the results were using terminology that I didn't know the definition of (like "color sampling"), and, there were tons of results and I had no clue which results to read and try to understand.
"Assumption" is the wrong word for me to use in that statement. As stated recently, that is a basic "fact" for this thread/example question. I thought I had made that clear throughout the thread, but, I guess not.
**** EDIT: In my defense, go back and reread post #11 in this thread. I thought, and still think, that was a very basic/simple scenario. But, the responses I got kept making the scenario more complicated and using terminology that I didn't know. Again, I was trying to crawl and y'all thought I was trying to run.
Last edited by the_steve_randolph; 5th Jun 2021 at 06:00.
How would the playback viewing quality be different?
To some, the content seems not as smooth; this usually happens if the player/display would otherwise interpolate frames for smoother playback or prefers specific frame rates.
Then, we don't agree. Isn't converting from 3fps to 23fps creating duplicate frames? Aren't these duplicated frames using up more O/S file space?
How do you "start" when you are new at something?
If you understand how compression works, you should understand what:
a. different bit rates and rate control means
b. what different color samplings/spaces mean
Atm. you are not starting with the basics.
Cu Selur
OK, now that we seem to have gotten the first part of this thread cleared up, I can move to the second part.
This "project" is to figure out the best way "for me" to determine acceptable quality for me to copy DVDs and computer videos to disc for future watching. Ultimately, I will come to a conclusion as to whether or not doing this is a better longterm solution compared to streaming. A component of this is acknowledging that we simply don't know what "streaming" will be like in 10-20 years. But, right now, I am in possession of DVDs and computer videos. To watch them, I don't have to buy a ticket at the theater or have internet service and/or have a streaming service. They are in my grubby hands, and, I have a real good handle on how reliable computer HDDs and disc drives are. So, if I want to burn/rip/whatever my videos to disc, I want to try to move them to the discs in a way that does not lose "obvious" viewing quality. Right now, I can test on a computer HDD, internal disc drive, monitor, TV, and, external disc player connected directly to the TV. I know that I can tweak the quality via numerous measures (i.e. video bitrate, fps, etc.) to reduce the size of the files residing on discs. So, I am trying to find out what kind of trade-offs can be made to put them on discs, trying to not lose too much quality, and rely on the discs to be a better solution than whatever streaming might be like in 10-20 years.
Yes, I cannot predict the future, but, I can find out what I can do to be more efficient on discs but not see the difference in the degraded quality when I view it on my monitor and on my 55" TV. Yes, this is subjective in what is acceptable to me compared to others. But, I want to limit the universe to what I have right now and do some testing to determine what is acceptable to me. So, I will now try to ask some narrow questions to help move me along and help keep me away from wild goose chases (which I was doing a week ago on a different issue).
If I were to take a single quality attribute like bitrate or fps or frame rate, and convert the video by reducing only this quality, what are some reasonable (i.e. "rule of thumb") increments that will make testing better and easier? For example, if I am starting with a video bitrate that is 2000 kb/s, should I reduce that by 200 kb/s increments? Or, would it be better to go in increments of 300 or 400? What I will do is play the original next to the converted file on my 20" monitor and see if I can tell the difference. Like, would the average person be able to see a difference once it reaches 1200 kb/s? 900 kb/s? Like, if I probably wouldn't see a difference until it gets down to 500 kb/s, then, me testing in increments of 100 kb/s would be wasting a lot of time as I go from 2000 to 1900 to 1800 to 1700 to 1600 and so on. Then, I would do the same testing on my 55" TV. (For this discussion, the quality of the cable to my monitor is the same quality as to my TV, and, both the monitor and TV are 1080p). An assumption is that I would notice a difference on my TV at a higher bit rate than I saw on my monitor. That is, for example, I might end up determining that when I convert to 1000 kb/s that is OK for watching on a 20" monitor, but, for me, I can't go below 1500 kb/s when viewing on my 55" TV.
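One way to organize a bitrate test ladder like the one described above, sketched with commands that are only printed for review (original.mp4 and the test_*.mp4 names are placeholders; nothing is actually encoded, and 400 kb/s steps are just one reasonable choice):

```shell
# Print (do not execute) ffmpeg commands stepping -b:v down from 2000 kb/s
# in 400 kb/s increments. File names are hypothetical placeholders.
bitrate_ladder() {
  for br in 2000 1600 1200 800 400; do
    echo "ffmpeg -i original.mp4 -b:v ${br}k test_${br}k.mp4"
  done
}
bitrate_ladder
```

Coarse steps like these quickly bracket the point where the degradation becomes visible; you could then repeat with 100 kb/s steps only inside that narrower range.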
So, I am not asking for exact numbers and/or view settings because I know that the end result is going to be subjective. I am asking to get pointed in the right direction and for some terminology/settings to look at first. Again, I can only test on the equipment that I have at this moment. I understand that there can be device compatibility issues in the future, but, that is something I didn't know to look for when I started this thread. I am not really interested in editing the quality of the video/audio or "remastering" or fixing any quality attribute (beyond perhaps only a screwed up aspect ratio). At this time, I am not really interested in discussing/debating the comparison of various codecs, for example - unless there is a major difference, like, subtitles can be text-based or image-based. I am not interested in converting surround sound to stereo or more complicated/personal editing like that. As stated above, I entered this thread crawling. I might be doing a bit of walking right now, but, I am not ready for running yet. Just get me pointed in the right directions. Thanks.
Since you do not want to restore/enhance content:
I would advise keeping the originals: since displays keep getting higher resolutions and better color precision, anything that might look okay on your TV now may look bad on a new one.
(depending on electricity and hardware costs, usually buying new hdds is cheaper than spending the time and energy on converting stuff)
If you want to get rid of your DVDs, copy images of them to your hdd; if you do not care about the menus on DVDs, use for example MakeMKV to repackage them to mkv files.
If you want to convert:
a. do not mess with the frame rate unless you deinterlace or ivtc and you know what you are doing. (which at this point you do not)
b. do not lower the resolution (anything that is lost, is lost)
c. do not create vfr content
What I usually would recommend:
a. use a tool like StaxRip, MeGui, Handbrake
b. use x264 or x265 as encoder and use the 'slow'-preset; depending on your hardware you can also try hardware encoders, which usually help with the encoding speed but provide a worse compression ratio (bitrate vs. quality)
c. use crf / 'constant quality' encoding; create a bunch of encodes with crf values between 15 and 23 (do not use 1pass vbr, abr, cbr encoding; use 2pass encoding if you need to achieve a specific file size)
-> look at the results and stick with the crf value that gives you a good balance between file size and quality.
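The crf sweep suggested above might be scripted along these lines; input.mkv and the out_*.mkv names are hypothetical, and the commands are only printed for review here, not executed:

```shell
# Print (do not execute) the suggested crf sweep: libx264, 'slow' preset,
# crf values from 15 to 23. File names are hypothetical placeholders.
crf_sweep() {
  for crf in 15 17 19 21 23; do
    echo "ffmpeg -i input.mkv -c:v libx264 -preset slow -crf ${crf} out_crf${crf}.mkv"
  done
}
crf_sweep
```

Lower crf means higher quality and bigger files, so comparing these encodes side by side is a direct way to find your personal size-vs-quality balance point.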
Once you get a general 'feel' for your 'size vs. quality' perception, learn how to filter content and tweak settings.
Cu Selur
You left out a bit of my quote and again added more variables to the testing. If 3fps is converted to 23fps and EVERYTHING else is the same, how would somebody see a difference in quality? Are you saying that, if the only difference in two computer videos is that one has duplicated frames, there are specific devices that would not play the duplicated frames the same way as the non-duplicated frames?
Then, we don't agree. Isn't converting from 3fps to 23fps creating duplicate frames? Aren't these duplicated frames using up more O/S file space?
How do you "start" when you are new at something?
If you understand how compression works, you should understand what:
a. different bit rates and rate control means
b. what different color samplings/spaces mean