Sitting here trying to figure out what bitrate would be good to use when encoding my VHS captures to MPEG-2 with TMPGEnc. I don't want to lose too much quality, but since I don't know what bitrate works well with VHS captures, I'm not sure what to do. I'm using PAL DVD resolution for the files, hoping to fit at least 2 hours of film on a disc (though that's not really necessary).
Anyone familiar with bitrates out there?
Have tried the bitrate calculator too.
The bitrate calculator will tell you what bitrate fits in what space. It can't tell you about quality. Quality is really down to you. Simplest solution: calculate the bitrate for a selection of running times (e.g. 1 hour, 1.5 hours and 2 hours), encode a 5-minute sample at each bitrate, then author and watch them back. See what works for you.
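The arithmetic a bitrate calculator performs can be sketched like this (the 4.38 GiB usable capacity, 224 kbps audio and ~2% muxing overhead figures are assumptions for illustration; adjust them for your own project):

```python
# Rough sketch of what a bitrate calculator does. Assumed figures
# (illustrative only): 4.38 GiB usable DVD-R capacity, 224 kbps audio,
# and ~2% muxing overhead.
def video_bitrate_kbps(minutes, disc_gib=4.38, audio_kbps=224, overhead=0.02):
    usable_bits = disc_gib * 1024**3 * 8 * (1 - overhead)
    total_kbps = usable_bits / (minutes * 60) / 1000  # total mux rate
    return total_kbps - audio_kbps                    # what's left for video

for mins in (60, 90, 120):
    print(f"{mins} min -> ~{video_bitrate_kbps(mins):.0f} kbps video")
```

Note that the 60-minute result comes out above the DVD-Video mux ceiling (~9800 kbps total), so in practice you'd cap it there; at 120 minutes it lands near the 5000 kbps figure mentioned later in the thread.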
Remember that with a VHS source you can use Half D1 resolution (352x576) without losing much detail, and the lower resolution will allow a lower bitrate at similar quality.
As ronnylov said, you can get very acceptable quality at the Half D1 frame size. A bitrate of 2 Mb/s is good enough - I do it all the time.
In any case, VHS is roughly equivalent to 352x288, so even capturing in MPEG-1 at CIF size (352x288) at 2 Mb/s is quite okay.
With this bitrate you would be able to fit almost 4.5 hours onto a 4.38 GB DVD-R.
Increasing the bitrate further would not improve the picture since your source is VHS - as I have found in practice.
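The running-time claim above checks out as rough arithmetic (a sketch; the 4.38 GiB capacity and 224 kbps audio stream are assumptions):

```python
# Sanity check on the "almost 4.5 hours" claim: 2 Mb/s video plus an
# assumed 224 kbps audio stream on a 4.38 GiB DVD-R (overhead ignored).
disc_bits = 4.38 * 1024**3 * 8
rate_bps = 2_000_000 + 224_000
hours = disc_bits / rate_bps / 3600
print(f"~{hours:.1f} hours")
```

Raw arithmetic gives roughly 4.7 hours; real-world muxing overhead pulls that down toward the 4.5 hours quoted.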
Originally Posted by ark
That bitrate for MPEG-1 would be too high for a compliant DVD -- you would have to drop it down to 1856 kbps.
The DVD resolution known as Half D1 (352x576 for PAL or 352x480 for NTSC) is a great choice for a source like VHS, but the lesser resolution mentioned above (352x288 for PAL or 352x240 for NTSC) is very low for VHS, and quality will definitely suffer.
The "standard" DVD resolution is called Full D1 and is 720x576 for PAL or 720x480 for NTSC. The width is normally 720 but sometimes 704 is used and that is OK although it is more standard to use 720 for the width.
The Full D1 resolution can be good for a VHS source and, generally speaking, can look "sharper" than Half D1, but sometimes Full D1 will show more "compression artifacts", especially if the running time is "long" or there is a lot of constant motion throughout the video.
The choice of which way to go can be tricky, but it ties into the bitrate issue big time, which is why people are bringing it up.
The best quality-wise would be Full D1 resolution with a CBR (Constant Bit Rate) of 8000 kbps for the video. This would allow approximately 60 minutes per DVD recordable.
If you use Half D1 resolution, the BEST bitrate would be a CBR (Constant Bit Rate) of 5000 kbps for the video. This would allow almost 120 minutes per DVD recordable.
Those are the BEST bitrates, but of course you can go lower and still get good results.
It's all trial and error.
- John "FulciLives" Coleman
You will need to use a good bitrate calculator. One of the best is here ---> CLICK HERE
Originally Posted by ark
I have tested this myself by capturing from miniDV and comparing it to a capture of VHS recorded from the miniDV camera. The VHS capture had full vertical resolution compared with the original.
And if you think about it, VHS can be interlaced so you will need
the full vertical resolution to keep the interlacing.
But if you are going to use 2 mbit/s then 352x288 may look better because this bitrate may not be enough with 352x576 resolution from a VHS source.
I was not aware that VHS is 352 x 576 for PAL.
What I read a few years ago, when I started with digital video, is that the VCD standard (352 x 288 for PAL, 352 x 240 for NTSC) was designed to give slightly better picture quality than VHS.
SVCD was a later improvement using MPEG-2 + VBR to give better picture quality with a larger frame (480 x 576 for PAL and 480 x 480 for NTSC).
These figures were from the consortium that established the VCD and SVCD standards (which includes big names like Philips, Sony, Matsushita, etc.).
Did I get the wrong idea all those years ago??
There is a VERY important distinction between the final resolution the video is displayed at and the resolution it is CAPTURED at. I recommend capturing at full resolution and resizing in software, IF that is what you end up doing. Most capture cards will seriously degrade the image if it is captured at the lower resolutions.
Try both and test, I have, and there is no question whatsoever which is better. Many others have repeated these tests with the same results.
Then there are those who will simply quote formulas, without any reference to what actually happens in the real world when you attempt to achieve these results. The card does a poor quality resize from a FIXED, built-in, unchangeable, hardware-dictated capture resolution. The idea is to capture as close to this resolution as possible to avoid the resize, and then to perform the resize later in software using better-quality tools.
As for bitrate, use a calc and fill the disk. You must visually evaluate and see if the quality is acceptable. There is no standard, correct answer.
Very old subject...
In short: we capture at the highest resolution we can, then filter, resize, frameserve and encode.
The best "overall" solution is to encode a well-restored (filtered...) VHS tape to 1/2 D1 (352 x 576/480) with an average bitrate of 3100 kb/s.
PAL users with excellent DVD players (very rare....) can also encode a well-restored VHS tape to interlaced 352 x 288. This is possible with PAL but impossible with NTSC (the interlacing barrier is around 280 lines). With an average bitrate of 1700 kb/s you get an excellent picture that way. The problem is that only a few (very few...) standalones support this frame size correctly. Most display that "resolution" very badly...
For the same reason (bad standalone players), 704 x 576/480 displays better than 352 x 576/480. It is overkill, it is a waste, it is slow to encode, but.... for most DVD standalones, this is the only way a VHS tape is sure to look the same on DVD.
The bottom line is: get a good DVD standalone with excellent built-in MPEG-2 decoders and use 352 x 576/480 with an average bitrate of 3100 kb/s, OR encode to 704 x 576/480 with an average bitrate of 7000 kb/s to be sure that your DVD looks like the VHS source on any standalone DVD player.
For what it's worth... I captured about 30 Beta tapes, plus some VHS tapes, using 352x480 at a bitrate of 3500 and they look as good as the original tapes. I basically followed lordsmurf's suggestions, as mentioned above. Do that and you really can't go wrong.
Yes, you sure as hell can, as many of his guides were apparently written before he became completely aware of just exactly how an ATI card, and almost all other capture cards, actually work.
I have detailed these errors in a previous post, and he now is apparently aware of the fixed capture resolution issue. But he hasn't changed the capture instructions, SFAIK.
Cap at maximum bitrate and resolution, then re-encode and resize in software. OR, cap at Full D1 and set the bitrate to approximate the file size. A bitrate calc is only marginally useful, as ATI and most other cap cards do not strictly adhere to the specified bitrate.
Or, if you like a very low-quality resize performed on your clips, which WILL BE CAPTURED at approximately 672x448 (this varies by card, but not by much), AND if, on some cards, you prefer a low-quality smoothing filter to be applied, then set your capture to 1/2 D1.
Anything can look OK or even good until you see what can be achieved in terms of better quality results. I thought SVCD was great till I saw DVD, then thought that was the best till I used Component output, then a true HD tv, then real HD source. There is always something better feasible, the goal should be the best that it can possibly be, within reason.
Don't forget to adjust your color, brightness, and tint controls by testing output ON THE FINAL OUTPUT DEVICE; adjusting the Gamma control as suggested by Smurf is completely worthless and a total waste of time.
Originally Posted by ark
VHS is not equivalent to this low resolution. VHS is way above 352x288 resolution. It is a tad lower than 352x576, so that is the resolution you want to go for.
Most people used to say 352x240 because VHS is "240 lines". But the problem was that people were BOTH on the wrong axis (x240 instead of 352x) AND mixing analog and digital terms together. Resolution of video is defined along the first axis, 352x (or 240 "lines" for VHS in analog terms). And all traditional video is ~x480 on the last axis, including VHS and even DV. But to get your VHS resolution of 250-300x480 (or x576 PAL), you have to use the nearest equivalent, which is 352x480 (352x576 PAL).
Let's keep the math correct, and let that old false information die away into obscurity.
Originally Posted by Nelson37
If anybody here is wrong, it's you. ATI cards capture video very close to 704x480; it's not a cheap piece of junk BT or Conexant chip that does fluky resolutions and plays around with garbage resizers. Cards vary greatly in how they capture. This is why ATI cards perform so well at 352x480: the resize is only about ~50% along a single axis. The Half D1 is kept amazingly sharp, unlike on most cards (even the much-loved Hauppauge PVR cards, which are severely soft-focus by comparison).
The gamma control issue is related to certain systems, using certain combinations of CATALYST with certain combinations of ATI MMC. To date, there has not been enough information to find a definite cause. The advice of "change CATALYST, change ATI MMC" is what usually works, sometimes even reinstalling Windows or dual-booting into another version of Windows. Newer CATALYST and newer ATI MMC tend to be the problem. It's on one of my systems too now, from an update I had to make when I started using Windows XP Pro (the issue did not exist under Windows ME or Windows 2000). It might even be related to DirectX. So many variables, so little data.
An external proc amp works better than virtual controls anyway, so I've not cared to solve my problem.
"Cap at maximum bitrate and resolution, re-encode and resize in software" is NEWBIE ADVICE for people who do not understand the concepts of video, nor the advanced aspects of the equipment in their hands. This is more bad advice that needs to simply die and fade into obscurity.
Satstorm said something similar, but his advice is fine, as he discusses filtering. For restoration (not mere conversion), capturing higher may or may not help; it's simply a preference. A software resize may help remove noise; that is one side effect of a software downsize.
This hobby is far too complex for broad, sweeping statements about what "all devices" can or cannot do. Each device must be taken separately, mixed with knowledge of the task to be performed, and then combined with an understanding of analog video concepts. Then, and only then, can you arrive at solutions on what to do. And you'll continue to learn along the way.
If you are going to quote me, do it correctly. I have no idea what you are talking about with the "Gamma issue"; I was referring to the Proc Amp controls built into MMC, and to your specific reply to a post I made containing identical advice to what I just suggested. The Gamma control affects PC output only; the color and brightness adjustments need to be made to match the captured output to the source, for the final playback device. ATI captures are typically too dark, with faded colors and a very slight greenish tint. Many folks do not care to spend a couple hundred dollars on hardware to do this.
As to the resize, a hardware resize is still a hardware resize, if you want it resized, do it in software where you have control. The point is the capture resolution is FIXED, there is no advantage to capping at a lower resolution if maximum quality is your goal. Do both, examine the frames carefully. Lower res caps are significantly degraded.
IF you are going to re-encode, filters are of course optional, but start with the very best you can get, which would be max bitrate and res, or AVI if you can do it with no frame drops.
As for sweeping statements, which part about "varies by card" and "on some cards" did you fail to understand? If you know of ANY card which does not use a fixed capture resolution, precisely which one would that be? It sure as hell isn't ATI. And how many years did it take to get you to understand this issue, instead of just blindly quoting VHS specs which really do not matter, because of the fixed capture resolution?
Your information on the ATI MMC proc amp is flawed. It works quite well, though not as well as a hardware proc amp (like the broadcast-grade Elite Video BVP-4 Plus or SignVideo PA-100). The gamma control in MMC is supposed to affect video quality like all the other sliders on the virtual proc amp, but sometimes it does not, and that is the fluke I refer to. It is random at the moment, and hard to pinpoint why it happens.
Your assumption that a hardware resize is inferior to a software resize is fundamentally flawed. Such broad statements cannot be made. Each resizer must stand on its own merits, in terms of output quality. For example, BT and Conexant resizing is terrible, especially with certain drivers. ATI Theatre chips, on the other hand, do quite well at normal DVD-spec resolutions.
If you think the specs of the source (like VHS tapes) do not matter, you are a complete video novice and are destined to make many mistakes. Understanding your source is probably one of the most valuable bits of knowledge you can have when converting analog to digital. This "capture highest" nonsense is an amateur mistake. It is very similar to people who think more megapixels (MP) means a better camera.
Anyway, it is a good idea to fine-tune contrast, brightness and color saturation, because they vary between different tapes and different equipment. I adjust the settings in the driver for my TV card and it really can make a difference. A too-dark or too-light capture does not look good, and it is often useful to decrease the colour saturation a little when capturing VHS.
The histogram function in VirtualVCR or VirtualDub (or in any video editing software) can be useful for fine-tuning the settings. Adjust, record and save the settings (in VirtualVCR the settings can be saved to a file). Analyze the results and fine-tune again until it is OK. This is a good thing about capturing from tape: you can do it again and again until you are happy with the results. And don't forget that it should look OK on your final target. I mean, if you are going to make a DVD to watch on a standalone player on a TV, then check your final results on the TV.
Ronny, you are absolutely correct on this point. The expert advice of, and I quote, "learn to use your gamma control", is the worthless advice I was referring to. Brightness, color saturation, and tint need to be adjusted to match final output device. Gamma adjustments affect PC display only, not the capture. This is the way it is supposed to work, and always has, through 3 cards, 5 or more MMC versions and driver versions, no fluke I know of.
Is it possible for a hardware resize to be superior, or equal, to a software resize? Absolutely. Is it likely? Not in my experience, which runs over 5 years now and several hundred separate captures. How many users here have tried multiple resize methods and settings, with different preferences for different situations? How many would accept having ABSOLUTELY NO CONTROL over the resize method and setting used? How much money would you like to put on the line that I, or anyone else, can find a software method and setting that will beat what the ATI will do?
How many "experts" were telling people to cap VHS at 352x480 because that matches the res of VHS tape? How many times was oversampling, Nyquist theorem, and other BS discussed, and used as "proof" that capping VHS at 1/2 D1 was best? For how many YEARS was the FACT ignored that this info, while basically correct, DOES NOT MATTER because you CAN NOT cap at 352x480 with an ATI, or most other cards, because of the fixed resolution? You cap at a higher resolution AND THEN RESIZE, whether you want to or not, because that is how the card works.
You do not get to choose the capture resolution. All you get to choose is the resolution you resize to. Next choice is do you let the hardware resize, or do it in software where you have control over methods and settings. Most with any experience are aware that different resizing methods can have advantages in different situations, there is NO "one size fits all". Yet if you cap at a specified 352x480, with an ATI or most any other card, that is exactly what you get. One method. No choice. No control.
Many people have posted that the full-res cap just looks better. I have tested this, on several tapes, old and new, frame by frame, blown up 2-4x, on the PC and on the TV. I have annoyed the hell out of the wife, kids, and several others doing multiple A-B tests and comparisons, "which looks better" and "why". The difference is clear and obvious. The full-res cap looks better. It is less distorted, with more detail and sharper edges. The degradation increases significantly when you drop from 480x480 to 352x480. Some cards actually activate a smoothing filter, which turns on somewhere inside that range and for everything lower. SFAIK, ATI does not add a filter, but it definitely resizes, and poorly at that. I couldn't care less who disagrees with this, or for what reason. What does annoy me is when people are given completely invalid reasons why it "theoretically" should be capped at 1/2 D1.
I cannot understand why "cap high and resize later, with the best available method" is called bad advice, when "cap high and let a non-controllable hardware resize take place" is considered a better way.
What I do is cap high and leave it that way, no resize at all. With proper settings and good source, real-time MPG is hard to beat.
Originally Posted by Nelson37
This comes into play especially when somebody's "tests" contradict commonly accepted knowledge.
LS, what a lovely way of saying "there's a reason that double blind A/B tests are the standard".
I like how (as usual) some people are saying their way is the best way. As ever, it depends on the user's needs. Speed or ease of capturing? Restoration/filtering ability? Highest quality regardless of size? "Best" quality/size compromise?
..Or even the dreaded "most time per DVD regardless of quality"...
I am well versed in how to do a proper double-blind test. Note "which looks better", not "does this one look better than that". I do not do things like have the observer watch me insert a DVD disk and then ask "does this look like a DVD". Observers had no knowledge of creation methods, they were simply another pair of eyes. Disks were actually labeled A and B, after shuffling, so I did not know which was which. In some cases they observed no difference until image was zoomed. All who observed any difference, which were approximately 4 out of 5, selected the hi-res capture.
I note the complete lack of response to the factual and technical reasons for the hi-res capture. Name-calling is the last resort of those with no valid argument, particularly when they no longer have the ability to lock threads where someone disagrees with them.
I am offering not just my opinion, but also real, factual, technical reasons to explain that the reasons given for using a 1/2 D1 cap of VHS tape are simply not valid. If you think it looks better, or is easier, or just prefer doing it that way, fine. If you have been told it is better because the cap resolution matches the source resolution, you have been given inaccurate information.
I am not interested in anyone's opinion of my attitude; I am not here to win a popularity contest. Nobody here puts food on my table. I am interested in improving my output in this hobby, and in helping others to do the same, as I was helped in the past.
As I have repeatedly stated, try it yourself and decide which is visually better. The only "commonly accepted knowledge" on this issue derives from theoretical information which is not valid due to the way the hardware actually works.
Fundamental sampling theory identifies 352 to be very near the absolute minimum number of samples per line required to preserve the detail in a 240 line source (VHS quality), and that assumes ideal theoretical filtering. Practical implementations of anti-aliasing and reconstruction filters (analog or digital) easily push this requirement out beyond 352, and there are many on this forum who have done A/B tests and notice the difference between 720 and 352 sampling. As always, much of this is source/setup quality dependent and YMMV.
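davideck's sampling argument can be sketched numerically (assuming the commonly quoted ~240 TVL figure for VHS luma, and the standard 4:3 conversion from lines-per-picture-height to lines-per-width):

```python
# Back-of-the-envelope version of the sampling argument above.
# VHS luma resolution is often quoted as ~240 TVL (lines per picture
# height); horizontal "lines" scale by the 4:3 aspect ratio to get
# lines across the full active width.
tvl = 240
lines_across_width = tvl * 4 / 3          # 320 alternating light/dark lines
cycles = lines_across_width / 2           # 160 full cycles per active line
nyquist_min_samples = int(2 * cycles)     # 2 samples per cycle
print(nyquist_min_samples)                # 320, just under 352
```

This is the idealized minimum; as the post notes, practical anti-aliasing and reconstruction filters push the real requirement past 352.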
Others may have valid reasons for sampling at 352, but guaranteeing the preservation of source detail is not one of them.
davideck has a point here; no equipment or method is absolutely perfect. So even if 352 is OK in theory, the capturing hardware may not be able to get the full VHS resolution because of internal anti-aliasing filters and such. The same thing can be said about software resizing: you will lose some quality when downsizing from higher resolutions.
Then we also have the aspect of MPEG-2 compression. High frequencies (small details) may be lost by the compression method. Sometimes this is good (some unwanted noise is removed) and sometimes it is bad (details are lost). The same thing can be said about resizing. You lose some detail by downscaling to a lower resolution, but you also lose some noise.
With my capture card (Terratec Cinergy 400) I actually get just as good quality capturing directly at 352x576 as capturing at 704x576 and resizing to 352x576 afterwards. I guess my capture card has a good internal resampler (it uses a Philips SAA7134 chip).
If I am allowed to link to another web site, I want you to take a look at the following page:
In section 5.3.1, different capture card scalers are compared. But read the whole guide; it is really interesting.
Encoding to 352x288 is *incredibly* bad advice. VHS does not 'sometimes' have interlacing. It is always interlaced, and the interlacing is completely lost by encoding at 352x288. That will give a totally different look to any content that is video/studio-based rather than film-based.
Personal preference for me is:
For speed: a good VCR (JVC), a TBC and a good DVD recorder (Pioneer 531H), which yields reasonable results for me after adjusting the video in the 531H to my satisfaction.
For bad source that needs TLC and lots of fixing up: good VCR, TBC, ADVC-100 to the computer as AVI at 13 GB an hour, and go from there.
Lots of conflicts in this thread. So is it agreed that VHS at 352x480 on DVD, with a generous bitrate in the 3000-4000 kbps range, will result in minimal to no quality loss from the original?
My question is, how compatible is Half D1? Someone was saying very few players will play it well. Is this what they were talking about? Is there really any reason why I wouldn't want to use this resolution?
NTSC is the United States TV standard.
1/2 D1, 352 by 480 (NTSC), is a standard spec that any DVD player should be able to play, or it shouldn't be marked as a DVD player. It should also be able to play 352 by 240 (NTSC, for example) with 48 kHz audio, as that is also standard.
The above applies to players sold in the USA.
I don't think that was the outcome. Using the analog-digital passthrough feature of my Canon DV camcorder for capture, I personally encoded all of my VHS tapes at 720x480 in Vegas after comparing 720x480 and 352x480 outputs. Far, far superior. I would never use 352x480.
Just my personal experience!
- Mark
I guess I'll just have to experiment and see how it looks between 720 and 352. There seem to be a lot of differing opinions between the two. I hope it ends up being that 352 looks best for me, because the VHS tapes I have to convert are recorded in SLP mode. So, using 720x480, that's at least 3 discs per tape.
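The "3 discs per tape" figure follows from rough arithmetic (assumptions: ~6 hours on a T-120 tape in SLP/EP mode, 4.38 GiB per disc, and an illustrative 4500 kbps video + 224 kbps audio rate for a Full D1 encode):

```python
import math

# Hedged estimate of discs per SLP-mode tape. All figures here are
# assumptions for illustration, not measured values.
tape_hours = 6                       # T-120 tape in SLP/EP mode
disc_bits = 4.38 * 1024**3 * 8       # usable DVD-R capacity
mux_bps = (4500 + 224) * 1000        # assumed video + audio rate
hours_per_disc = disc_bits / mux_bps / 3600
discs = math.ceil(tape_hours / hours_per_disc)
print(discs)
```

At roughly 2.2 hours per disc, a 6-hour tape does indeed need 3 discs at Full D1 quality.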
Also, about the whole 480 vs 240 thing: 480 is interlaced and 240 is progressive, right? So they're the same thing, except 240 is worse quality because it has to be scaled up to 480 lines for display? (only applying when it's played back interlaced, like on a TV)