VideoHelp Forum
  1. I am pretty new to converting/modifying computer video files and I am researching all sorts of aspects of the files. Here I just want to ask a few overall questions concerning the quality of a computer (non-DVD) video file. For example, as many of you know, there are many movies that are in the public domain because their copyrights have expired. And, there are various web sites that have these movies for free viewing and/or downloading. Also, some are sold on DVD for $1 at stores like Big Lots and Dollar General. Occasionally one would run across more than one copy of the same movie. There are various reasons for that, but, that is beyond the scope of my post.

    So, looking at the Windows properties, these copies of the same movie could have the same frame dimensions (i.e. 640x480) and the same running time, but the O/S sizes of the files could differ. I understand that this size difference could be because one has more audio track(s) or subtitle track(s). But the most obvious reason could be varying video quality. And I mean different video quality as a computer file - not the video quality of the source (an old film print, Ted Turner's movie colorizing binge, etc.) of the different copies. Also, some copies could be remastered, where the audio and/or video quality was modified during the conversion process.

    So my first question has to do with: how can I compare two computer video files and tell which one has better audio and/or video quality?

    In passing, somewhere on the interwebs, I have run across some information where the idea was to look at the "bitrate". I think that both audio and video have a bitrate. So, what app(s) can tell me whatever this appropriate "quality" information is, and what do I look for in each of those apps?
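    For example, I gather (from my googling) that running a file through ffmpeg with no output file prints the stream info, and that ffprobe (bundled with ffmpeg) can print just the duration and bitrates - I have also seen MediaInfo recommended for this. A hypothetical example (file name made up):
    Code:
    ffmpeg -i movie.mp4
    ffprobe -v error -show_entries format=duration,bit_rate:stream=codec_name,bit_rate -of default=noprint_wrappers=1 movie.mp4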

    Also, and this may be subjective, but is there a formula or accepted rule of thumb for what counts as a discernible difference in quality? For example, if video #1 was 25% better quality (by whatever objective measurement) than video #2, would I be able to tell the difference playing the two videos on a 20" monitor with VLC? What would that percentage difference be on a 20" monitor compared to playing the same videos burned to a DVD on a 55" TV?

    That last issue is a concern if the two files don't necessarily have the same content. That is, one copy may have a scene deleted, or contain "lost footage" that is missing from another copy. I might end up with a situation where I want to keep a longer unedited version, but it might have lesser audio and/or video quality than another copy that has been edited (and by "edited", I mean edited at the original source of the video, not by somebody like me doing a conversion). There is also the issue of the two copies being different when it comes to chapters (i.e. existing or not) and subtitle track(s).

    Lastly, I understand that some aspects of converting a computer video file involve re-encoding parts of the file. And I know that re-encoding can reduce the quality of the audio and/or video tracks. I also understand that formats like MKV, MP4, and AVI are "containers" and the real quality is related to the codecs used inside the containers. So, related to comparing the quality of two files, are there specific containers and/or codecs that inherently have better or worse quality?

    Ultimately I see myself burning these files onto discs to save HDD space. And I would like the ability to play them on a computer or a DVD/Blu-ray player connected to a TV. I understand that not everyone wants to do things that way, but I would prefer that we keep the discussion to trying to achieve what I have stated here instead of trying to convince me to access these movies in a different form (i.e. streaming). BTW, Google is my friend and I have already used it to research this information. But I am not opposed to a reply/answer being a link that provides the answers to my questions.

    Thanks.
  2. I guess another way to approach this question is: when an app compresses a video file, what is it changing?
  3. krykmoon
    Originally Posted by the_steve_randolph View Post
    I guess another way to approach this question is: when an app compresses a video file, what is it changing?
    The problem is not about compression. Compression is simply like asking: "how much am I ready to lose?"

    Ascertaining video quality (in an absolute way) is a different kind of thing... and no algorithm can help you with that... I suppose.
  4. Originally Posted by krykmoon View Post
    The problem is not about compression. Compression is simply like asking: "how much am I ready to lose?"

    Ascertaining video quality (in an absolute way) is a different kind of thing... and no algorithm can help you with that... I suppose.
    I understand that it MIGHT not be about compression. But my OP asked how to compare two videos to determine which would/might have lesser quality. So, doesn't compression reduce the quality? Otherwise, there would be no downside to compression (which I doubt). And my OP was not asking "how much am I ready to lose?". The secondary question WAS asking how much of a reduction in quality will be noticeable on a 20" monitor and a 55" TV. Again, if you go back to the OP, I am not trying to change the quality. I am trying to find out how to find the quality information.

    I did a little research on compression and there seem to be two main types. One reduces the number of bits, which would seem to reduce the video bitrate. The second reduces the number of frames, which would be, for example, the idea of deleting every other frame and reducing 30fps to 15fps. In my opinion, a smaller bitrate and/or fewer frames per second would be a reduction in quality - basically what I asked in the OP. And that doesn't seem to involve an "algorithm".
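    For illustration (hypothetical file names; I believe these are the relevant ffmpeg options for each type):
    Code:
    # type 1: reduce the average video bitrate
    ffmpeg -i in.mp4 -b:v 500k out_lower_bitrate.mp4
    # type 2: reduce the frame rate from 30fps to 15fps
    ffmpeg -i in.mp4 -r 15 out_lower_fps.mp4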
    Last edited by the_steve_randolph; 2nd Jun 2021 at 01:58.
  5. krykmoon
    Originally Posted by the_steve_randolph View Post
    I understand that it MIGHT not be about compression. But my OP asked how to compare two videos to determine which would/might have lesser quality. So, doesn't compression reduce the quality?
    It depends entirely on the source material and the parameters you choose when encoding. Source material with little detail, or very blurred material, can be compressed a lot without any visible quality loss, while sharper video with a lot of detail can't, unless you go up to the highest quality settings (btw: Blu-rays use compression too).

    I did a little research on compression and there seem to be two main types. One reduces the number of bits, which would seem to reduce the video bitrate. The second reduces the number of frames, which would be, for example, the idea of deleting every other frame and reducing 30fps to 15fps. In my opinion, a smaller bitrate and/or fewer frames per second would be a reduction in quality - basically what I asked in the OP. And that doesn't seem to involve an "algorithm".
    While reducing the frame rate that way will simply DESTROY the source material, the bitrate may be helpful in evaluating the quality of a video... but it may also be a deceiver.
    A very noisy video may have a CRAZY HUGE bitrate (due to the noise itself) but look terrible and have much lower quality than a clean video with half or a quarter of the bitrate of the noisy one.

    That's what I was trying to explain: there's no single indicator that can assure you that one video is better than another. Software and tools can help evaluate some aspects (colors and luminosity, for example... they're both extremely important parameters for good video quality) but there's no "automatic" easy way to determine it based on the mere evaluation of a parameter by itself (see the example above with the bitrate).
    Last edited by krykmoon; 2nd Jun 2021 at 09:45.
  6. Originally Posted by krykmoon View Post
    While reducing the frame rate that way will simply DESTROY the source material, the bitrate may be helpful in evaluating the quality of a video... but it may also be a deceiver.
    A very noisy video may have a CRAZY HUGE bitrate (due to the noise itself) but look terrible and have much lower quality than a clean video with half or a quarter of the bitrate of the noisy one.

    That's what I was trying to explain: there's no single indicator that can assure you that one video is better than another. Software and tools can help evaluate some aspects (colors and luminosity, for example... they're both extremely important parameters for good video quality) but there's no "automatic" easy way to determine it based on the mere evaluation of a parameter by itself (see the example above with the bitrate).
    I think you have misunderstood my premise. And, since you are the only one responding to my OP, I don't see any point in my continuing here.
  7. Originally Posted by the_steve_randolph View Post

    I think you have misunderstood my premise. And, since you are the only one responding to my OP, I don't see any point in my continuing here.
    You're not getting a lot of participation because the answer you're getting is correct. It appears you are looking for a quantitative answer to a judgement call. Yes, generally more is better, but as has been explained that is not the whole story. The way to judge "quality" is to look and listen.
  8. johns0
    The best way to compare video/audio quality between 2 videos is to watch one after the other; nothing else can tell you the better quality but your own view.
  9. This has been discussed on this and other forums many times, ad nauseam.

    How do you define "quality"?

    Is it which looks "better"?

    If so, how do you define "better"?

    Is it which one is closer to the source?

    If so, do you care about being mathematically more similar or do you care, as some people argue, about which is more perceptually similar?

    Just go read any of the numerous threads about this topic.
    Last edited by sophisticles; 4th Jun 2021 at 20:07.
  10. Originally Posted by krykmoon View Post
    there's no single indicator that can assure you that one video is better than another.
    That says it all. You have to compare them with your own eyes/ears. Except for some rare circumstances where you know exactly how they were produced -- in which case you probably already know which is better.
  11. OK, it seems that more of you are misunderstanding the premise and trying to make it WWWAAAYYYYY too complicated. So, let me start over and simplify:

    I start with an MP4 (let's call it "A.mp4")
    I tell my brother to convert "A.mp4" to another MP4 (let's call it "B.mp4"):
    - using ONLY ffmpeg
    - and changing ONLY one "quality" attribute
    - that "quality" attribute can be seen in "ffmpeg -i input.mp4" console output
    - he cannot do things like add/remove video or audio tracks, chapters, or subtitles
    He renames the two files X.mp4 and Y.mp4, not telling me which file is which
    I look at the two files in Windows Explorer, but am NOT allowed to look at the O/S size (in MBs) of the two files:
    - X.mp4 and Y.mp4 have the same dimensions (let's say 640x480)
    - X.mp4 and Y.mp4 have the same running time (let's say 1:23:45)
    I am allowed to use ONLY the commands "ffmpeg -i X.mp4" and "ffmpeg -i Y.mp4" to compare the quality differences in X.mp4 and Y.mp4
    So, I do those ffmpeg commands and view the console outputs

    Now, what information in the two ffmpeg console outputs will tell me which one has the better quality?

    Once I see the answer(s) to this question, I will re-ask the "view the videos" question so that it will make more sense.
    Last edited by the_steve_randolph; 5th Jun 2021 at 01:00.
  12. johns0
    You are not listening to anyone here; nothing in any console output will tell you the quality difference. It's up to the user to see which video looks better.
  13. Now, what information in the two ffmpeg console outputs will tell me which one has the better quality?
    None.
    "ffmpeg -i" the output has no information about quality.
    It just shows some general information from the headers of the files.
    This is like wanting to know how generous two persons are just based on their name and knowing that they have the same height, eye color, hair color and that they both hate cats.

    There is no known method to objectively measure the 'quality' of something.
    Even if you compare something directly to a known source, just from the header data you can't say anything reliable about quality.
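    (Side note: ffprobe, which is part of ffmpeg, can dump that header info in a cleaner form - a sketch, file name from your example - but it is still only header info, not a measurement of quality:)
    Code:
    ffprobe -v error -show_entries stream=codec_name,width,height,r_frame_rate,bit_rate -of default=noprint_wrappers=1 X.mp4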
  14. Originally Posted by johns0 View Post
    You are not listening to anyone here; nothing in any console output will tell you the quality difference. It's up to the user to see which video looks better.
    So, for example, changing the video bitrate from 2000 kb/s to 200 kb/s would not necessarily reduce the video/viewing quality? That is visible in the ffmpeg console output.
    Last edited by the_steve_randolph; 5th Jun 2021 at 02:11.
  15. Originally Posted by Selur View Post
    Now, what information in the two ffmpeg console outputs will tell me which one has the better quality?
    None.
    "ffmpeg -i" the output has no information about quality.
    It just shows some general information from the headers of the files.
    This is like wanting to know how generous two persons are just based on their name and knowing that they have the same height, eye color, hair color and that they both hate cats.
    So, changing the video bitrate from 2000 kb/s to 200 kb/s -or- changing the fps from 30 to 3 -or- changing the audio bitrate from 320 kb/s to 32 kb/s is not changing the viewing quality? Those things are visible in the ffmpeg console output.
    Last edited by the_steve_randolph; 5th Jun 2021 at 02:11.
  16. So, for example, changing the bitrate from 2000 kb/s to 500 kb/s would not necessarily reduce the video/viewing quality?
    You are using prior knowledge here. The knowledge that one video was encoded from the other.

    If I start with a video that has a bitrate of 500 kb/s and convert it to one with a bitrate of 50,000 kb/s, the 50,000 kb/s video will not look better.
    Bitrate and all that information will not tell you anything about quality.

    Cu Selur

    Ps.: Also, taking an uncompressed source and compressing it with a lossless compression method will reduce the bitrate, but since the compression is lossless there is no quality loss.
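    A sketch of what I mean (made-up file names; FFV1 and FLAC are lossless codecs that ffmpeg ships with):
    Code:
    ffmpeg -i uncompressed.avi -c:v ffv1 -c:a flac compressed_lossless.mkv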
  17. Originally Posted by Selur View Post
    So, for example, changing the bitrate from 2000 kb/s to 500 kb/s would not necessarily reduce the video/viewing quality?
    You are using prior knowledge here. The knowledge that one video was encoded from the other.
    No, I am using that as an example of a specific attribute that I can see in ffmpeg console output. In my simplification, my brother has that prior knowledge - not me. So, if he reduced the bitrate from 2000 kb/s to 200 kb/s, isn't he changing the "quality" of the video? And, one would assume that a large number of people should be able to visually/subjectively tell that the 200 kb/s version is lesser quality, without knowing what was changed and by how much, right? If this assumption is not correct, why are DVDs and Blu-rays at a much higher bitrate than 200 kb/s? Why wouldn't they manufacture those discs at 200 kb/s and use a 700MB CD disc (less expensive) instead of a 50GB disc (more expensive)?

    Originally Posted by Selur View Post
    If I start with a video that has a bitrate of 500 kb/s and convert it to one with a bitrate of 50,000 kb/s, the 50,000 kb/s video will not look better.
    Bitrate and all that information will not tell you anything about quality.
    I understand that a conversion cannot improve the video/viewing quality. And that is not something that I am trying to do. Simply, if I come across two versions of the same movie where one copy has degraded quality (i.e. bitrate) compared to the original, and that is the ONLY change, can I tell the difference with the ffmpeg console output?

    Originally Posted by Selur View Post
    Ps.: Also, taking an uncompressed source and compressing it with a lossless compression method will reduce the bitrate, but since the compression is lossless there is no quality loss.
    Can ffmpeg do compression? If so, how? And, what will be different in the console outputs of the before and after files?
    Last edited by the_steve_randolph; 5th Jun 2021 at 02:15.
  18. In my simplification, my brother has that prior knowledge - not me.
    Wonderful. Keep that in mind.
    You get a video A with a bitrate of 2000 and a video B with a bitrate of 500.
    So how will you differentiate whether:
    a. video B was created by encoding video A with a bitrate of 500
    or
    b. video A was created by encoding video B with a bitrate of 2000
    -> you can't with the information you have

    I understand that a conversion cannot improve the video/viewing quality.
    That depends on your perspective of quality. If everybody thought that way, video restoration would be a waste of time.

    Simply, if I come across two versions of the same movie where one copy has degraded quality (i.e. bitrate) compared to the original, and that is the ONLY change, can I tell the difference with the ffmpeg console output?
    Only if you are sure that the quality measurement parameter:
    a. is really a quality measurement parameter (This is what it's all about, and where everybody here is saying that none of the information you get from 'ffmpeg -i' is.)
    b. is shown in the 'ffmpeg -i' output

    You are making this way more complicated than what I am asking about.
    You are trying to simplify a complex problem while not really having understood the underlying methods and meanings.

    Cu Selur
  19. Originally Posted by Selur View Post
    In my simplification, my brother has that prior knowledge - not me.
    Wonderful. Keep that in mind.
    You get a video A with a bitrate of 2000 and a video B with a bitrate of 500.
    So how will you differentiate whether:
    a. video B was created by encoding video A with a bitrate of 500
    or
    b. video A was created by encoding video B with a bitrate of 2000
    -> you can't with the information you have
    No, you are making my example more complicated than it is. For a video "encoded" by reducing the bitrate from 2000 to 200, by itself, without knowing anything else about which file is which, and knowing that nothing else was changed: is the 2000 bitrate version better quality than the 200 bitrate version?

    That depends on your perspective of quality. If everybody thought that way, video restoration would be a waste of time.
    Again, you are making this more complicated than what I am asking. I am NOT talking about restoration, remastering, colorizing, or, ANYTHING else that you can do with an app more sophisticated than ffmpeg.


    Only if you are sure that the quality measurement parameter:
    a. is really a quality measurement parameter (This is what it's all about, and where everybody here is saying that none of the information you get from 'ffmpeg -i' is.)
    b. is shown in the 'ffmpeg -i' output
    BINGO!!!!!!!! THIS IS WHAT I AM ASKING FOR!!!!!! If video bitrate is the ONLY, and, I repeat for the MILLIONTH time, the ONLY thing changed, is that a measure of quality? If not, what is it?
  20. This is an example of the type of comparison similar to what I am asking: https://jnduquesne.medium.com/video-bench-how-measure-your-video-quality-easily-85a0feb8f6e2

    ...and, bitrate is mentioned as a measure of quality
    No, you are making my example more complicated than it is. For a video "encoded" by reducing the bitrate from 2000 to 200, by itself, without knowing anything else about which file is which, and knowing that nothing else was changed: is the 2000 bitrate version better quality than the 200 bitrate version?
    Yes, reencoding a video to a lower bitrate will potentially lower the quality of the video.
    But the other way round is not a given.
    - It does not mean that the 2000 bitrate version has better quality.
    - It also does not mean that reencoding a 2000 kb/s video to a 4000 kb/s video will increase the quality.
    - It also does not mean that reencoding a 2000 kb/s video while not changing the bitrate will not lower the quality.

    BINGO!!!!!!!! THIS IS WHAT I AM ASKING FOR!!!!!! If video bitrate is the ONLY, and, I repeat for the MILLIONTH time, the ONLY thing changed, is that a measure of quality?
    No, but the explanation was dismissed as too complicated.
    So, yes, if you define quality as measured by bitrate, then sure. But as stated before, that is not the case; bitrate is not an objective measurement of quality.
    If not, what is it?
    Nothing that "ffmpeg -i" shows as output.


    Cu Selur
  22. This is an example of the type of comparison similar to what I am asking:

    https://jnduquesne.medium.com/video-bench-how-measure-your-video-quality-easily-85a0feb8f6e2
    There, methods like PSNR etc. are used to measure quality between an original and a reencode, but 'ffmpeg -i' does not show any of those metrics.
    All those metrics compare a source to a reencode; they do not measure absolute quality, but the quality loss between a source and a reencode.
    So it requires knowing whether video A or B is the source, and like you stated: "my brother has that prior knowledge - not me."
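    If one does have the known source, I believe the usual invocation looks something like this (made-up file names; the two files must have the same resolution and frame rate for the filters to work):
    Code:
    ffmpeg -i reencode.mp4 -i source.mp4 -lavfi psnr -f null -
    ffmpeg -i reencode.mp4 -i source.mp4 -lavfi ssim -f null -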

    Cu Selur
  23. Alwyn
    If you have a 2000kbps video and re-encode it to 200kbps, the quality will be reduced. And you'll notice it!

    Or, if you have a master video, the 2000kbps version will look better than a 200kbps version.

    Assuming same codec, no other processing, of course.

    If you're looking at different codecs, a 2000kbps h264 will look about the same as a 1000kbps h265, roughly speaking.
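    Roughly what I mean, with made-up file names:
    Code:
    ffmpeg -i master.mkv -c:v libx264 -b:v 2000k out_h264.mp4
    ffmpeg -i master.mkv -c:v libx265 -b:v 1000k out_h265.mp4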

  24. Originally Posted by Selur View Post
    No, you are making my example more complicated than it is. For a video "encoded" by reducing the bitrate from 2000 to 200, by itself, without knowing anything else about which file is which, and knowing that nothing else was changed: is the 2000 bitrate version better quality than the 200 bitrate version?
    Yes, reencoding a video to a lower bitrate will potentially lower the quality of the video.
    But the other way round is not a given.
    - It does not mean that the 2000 bitrate version has better quality.
    - It also does not mean that reencoding a 2000 kb/s video to a 4000 kb/s video will increase the quality.
    - It also does not mean that reencoding a 2000 kb/s video while not changing the bitrate will not lower the quality.
    Why do you continue to bring up the idea of trying to INCREASE the quality? AGAIN, that is not what I am trying to do.

    No, but the explanation was dismissed as too complicated.
    So, yes, if you define quality as measured by bitrate, then sure. But as stated before, that is not the case; bitrate is not an objective measurement of quality.
    If not, what is it?
    Nothing that "ffmpeg -i" shows as output.


    Cu Selur
    OK, consider a file "A.mp4" where ffmpeg console output shows that the video bitrate is "1078 kb/s". Then, I run this command:

    Code:
    ffmpeg -i A.mp4 -b:v 20k Z.mp4
    When I look at the ffmpeg console output of Z.mp4, I can see that the video bitrate is "20 kb/s". Then I view Z.mp4 with VLC. The video appears more "blocky" than A.mp4. Why is this not an obvious reduction in quality? If it is a reduction in quality, I can see IN THE FFMPEG CONSOLE OUTPUT that the video bitrate of A.mp4 is 1078 kb/s and the video bitrate of Z.mp4 is 20 kb/s.
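    BTW, to eyeball A.mp4 and Z.mp4 side by side, I gather that something like this stacks them next to each other (both are the same height, so I believe hstack works):
    Code:
    ffmpeg -i A.mp4 -i Z.mp4 -filter_complex hstack side_by_side.mp4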
  25. Originally Posted by Alwyn View Post
    If you have a 2000kbps video and re-encode it to 200kbps, the quality will be reduced. And you'll notice it!

    Or, if you have a master video, the 2000kbps version will look better than a 200kbps version.

    Assuming same codec, no other processing, of course.
    Why are you the only (other) one here who believes that?
  26. Why do you continue to bring up the idea of trying to INCREASE the quality? AGAIN, that is not what I am trying to do.
    If you want a criterion to say "B is better than A", it also needs to be able to conclude "A is worse than B"; otherwise it won't work as a discerning criterion.

    Why are you the only (other) one here who believes that?
    Most folks here even believe that if you use lossy compression you will always lose information and thus quality, and thus when comparing two videos where one is a reencode of the other, it's only a question of figuring out which is the original...

    Cu Selur
  27. Consider this command:
    Code:
    ffmpeg -i A.mp4 -r 3 Z.mp4
    When I view Z.mp4 in VLC, the video is jumpy, kinda like stop-motion photography. When I look at the ffmpeg console outputs, I see that the fps for A.mp4 is 23.98 and the fps for Z.mp4 is 3. Why is this not a measurement of video quality that I can see in the ffmpeg console output?
  28. Originally Posted by Selur View Post
    Why do you continue to bring up the idea of trying to INCREASE the quality? AGAIN, that is not what I am trying to do.
    If you want a criterion to say "B is better than A", it also needs to be able to conclude "A is worse than B"; otherwise it won't work as a discerning criterion.
    <sigh> The premise is that I run across two computer video files where one of the files is the result of REDUCING the quality of the other. So, yes, you can say "B is better than A" and "A is worse than B", but in my premise here, the difference is because of REDUCING the video quality.

    Why are you the only (other) one here who believes that?
    Most folks here even believe that if you use lossy compression you will always lose information and thus quality, and thus when comparing two videos where one is a reencode of the other, it's only a question of figuring out which is the original...
    Again, you are introducing more than one factor in the ffmpeg command. In my premise (and examples), I am only changing ONE thing. Look at my examples of using "-b:v" and "-r".
  29. Why is this not a measurement of video quality that I can see in ffmpeg console output?
    Because you make the assumption that the original video isn't already 3fps.
    If A already has an fps of 3, the above command would not change the frame rate, and thus no additional choppiness would be added.
    You are using information which, in your example, only your brother has.

    If A has a frame rate of 1 fps, the above call could speed up the playback by a factor of 3, which would probably count as decreasing the viewing quality, wouldn't it?
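    You can of course read each file's current frame rate (a sketch with ffprobe, using your file names), but that still tells you nothing about what the true source's frame rate was:
    Code:
    ffprobe -v error -select_streams v:0 -show_entries stream=r_frame_rate -of default=noprint_wrappers=1:nokey=1 A.mp4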
    The premise is that I run across two computer video files where one of the files is the result of REDUCING the quality of the other. So, yes, you can say "B is better than A" and "A is worse than B", but in my premise here, the difference is because of REDUCING the video quality.
    Yes, but I can reduce the playback quality of a video by:
    a. reencoding it while increasing the bitrate (simply because the compression isn't lossless)
    b. increasing the frame rate

    Again, you are introducing more than one factor in the ffmpeg command
    No, I'm not. No clue why you think that.


