Hi, all. I have a question about Bitrate Viewer and how its Q-level relates to visual quality. I encoded a short clip, frameserved from AviSynth into TMPGEnc using MPEG-1 compression, and compared the output of different filters side-by-side in VirtualDub to see if I could spot any difference. I know that a lower Q-level in Bitrate Viewer is supposed to mean better quality, but when I compared the clips, I couldn't tell them apart. My question is: how accurate is the Q-level figure? Is it a better way of measuring quality, or are my trusty old eyes a better measure? Any explanation of how the Q-level is derived would be appreciated.

Thanks in advance.