I just got a new ADVC-55 and started playing around with my old VHS tapes. And noticed a couple things right away.
1. The hue in preview or captured video is shifted towards the green. This is very noticeable, especially when compared to the raw video playback into the same monitor that is used during capture. Live video looks fine. Captured or previewed video is greenish. I can correct it in Vegas and I have a proc amp on the way that can also take care of it but I am puzzled why it is happening in the first place.
2. The audio warbles at about a 10Hz rate during capture (sounds like mostly amplitude but possibly some pitch instability too) but the captured audio is fine as far as I can tell. I have also noticed pitch shifts. Not sure if they are confined to preview only or the pitch is shifted on the capture.... I haven't looked into it yet. Any chance this is a FireWire thing? I'm using a PCI card with two FW400 ports on it. No other FW devices are plugged in.
I'm using the capture application in Vegas Pro. I've tried VirtualDub and the video is green shifted there as well.
Last edited by pgoelz; 10th Mar 2010 at 19:07.
I have the dip switches in the default position.... NTSC and black = 7.5 IRE. Tried other settings and there was no change.
It isn't a hue issue so much as a color balance issue. I can correct it by reducing green in Vegas but not by rotating the hue.
Note that I have never seen this color shift on any monitor I have ever used including the one that is displaying the raw video. It is a combination video monitor / TV and PC monitor. The raw video out of the AG-1970 looks perfect. It is green shifted when passed through the ADVC-55 using both composite and S-Video inputs and displayed through Vegas or the Vegas capture app on the same monitor.
I just had a careful listen and the warble is definitely NOT present in the captured audio track.... it is only there when monitoring during capture.
Sure, I'll do that this evening. I'll also post a couple frames of the corrected version. It is as if the green channel is maybe 10-12 percent higher than the others. It is possible that is the way it looks as recorded, but it has never looked that way on any TV or monitor I have used so far when simply played back.
I had a look at a capture with the video scopes in Vegas Pro and it does look somewhat skewed to the green. It also looked like the overall level was not quite right. Again, it looks fine on a monitor straight out of the VCR, but is it possible that the VCR is not adjusted quite right? Also, this was recorded indoors with a 1984 vintage consumer grade camera and I don't recall how (or if) it did white balance. Maybe it really does look like that?
When I get the chance I'll have a look at the raw video out on an oscilloscope and see what the levels are. My proc amp will also be here today.... we'll see how that looks.
I'm curious to look at the DV-AVI frames on the Vegas Scopes. It may give some clues.
If the analog output from the VCR looks OK on the TV, it isn't clear where the green skew is coming from.
I use Vegas Pro and have looked at the video on the vectorscope. It is difficult to determine on regular video but in the camera shot is a fair amount of background that I believe is basically white. On the vectorscope it appears as though the bulk of the center of the display is shifted towards green. Shifting it back towards center with one of the color balance tools in Vegas mostly resolves the green shift.
I'll post a frame of video this evening and you can have a look at it.
I am also attempting to capture some old tapes that are dubs of 3/4" U-matic recordings. In low light shots they are also green shifted when captured (they are fine on the monitor) but there are some daylight shots that look fine. It is as though the green channel response was non-linear.....
Sorry, no AVI frame tonight. I got to playing around with my setup with and without my newly acquired Elite Pro proc amp and comparing the ADVC-55 captures to some I did of the same source with my Dazzle DVC-150. And you know what..... the Dazzle looks better. No proc amp, just straight from the AG-1970 to the Dazzle.
Captures with the ADVC-55 have the red channel shifted up a bit, like an old color TV with a convergence problem. This doesn't show with the Dazzle. And colors seem truer with the Dazzle.... there is still that green tint but in all honesty, it is there on a monitor when I look closely.... it's just that the ADVC seems to make it worse.
This surprised me. I hate the Dazzle (it's XP only, for one) and Pinnacle in general, but dang it, captures look pretty good and are WAY smaller.
There are also sync issues with the ADVC-55..... the video wobbles back and forth if I use the Panasonic AG-1970 TBC. It is rock solid with the Dazzle.
The Elite Pro is nice but unless I can figure a way to use the video scopes in Vegas during capture, it is pretty useless since setting levels without any sort of indicator is a lengthy iterative process.
So I'm burned out for the night. Maybe over the weekend I'll set both up and do some more careful A/B comparisons. But right now I'm not liking the ADVC-55 very much.
It almost sounds like a hardware fault.
I know sometimes a cheaper VCR works better than a decent one for a particular problem tape - but I've never heard anyone have this experience with Canopus vs Dazzle!
Since the analogue video, and the Canopus device itself, both work exclusively in YUV space, it would be quite hard for it to create an error that was only fixable in RGB space.
So is the DV codec "broken" on your PC? Then again, you see the same problem when looping through the ADVC55, so it must be a hardware issue. Very strange.
The Canopus devices do AGC the input video to correct levels - different devices handle incorrect levels in different ways. It may be a happy coincidence that your video looks OK on a TV - a case of two wrongs making a right - and the Canopus partly "corrects" it, making a right and a wrong, giving a visibly incorrect result. Or it could just be broken!
First of all, thanks for taking the time to answer in some detail. I have a pretty good understanding of video and NTSC but not necessarily the intricacies of getting it from VHS to digital.... without test equipment.
For clarity, let me lay out what I am doing and then I'll provide the results of some testing I did this morning.
I am digitizing some old VHS tapes. The one with the green tint is camera original that I shot in 1984 with a newvicon consumer grade camera and the one that I have been working on yesterday and today is a dub from what I suspect was originally 3/4" U-Matic shot with a professional three tube minicam in 1984. It has some visible mis-convergence at the edges that complicate the evaluation.
Let's leave the green tint issue for a while. I suspect the green tint is actually (mildly) there in the original but the ADVC-55 is making it more noticeable because it is not getting the setup and/or the levels right.
This morning I captured some test footage from the dubbed 1984 video. This particular clip is a difficult one, shot past dusk with some artificial light. It is somewhat dark and noisy on my monitor straight out of the VCR. Unfortunately I have no way to monitor the actual levels prior to capture. I captured this test footage three times. Once with the ADVC-55 set to 7.5 ire, once with it set to 0 ire, and once with the Dazzle set to all default. I then looked at each capture on the Sony video scope / histogram. What I found was that the Dazzle seemed to AGC the video and the resulting capture looked roughly the same as the raw video did on my monitor. The ADVC-55 did NOT seem to AGC the levels and the test captures were dark. The one captured at ire = 0 looked better, with at least some detail in the dark areas. The one captured with ire = 7.5 looked very dark with no detail in the dark areas. Histograms confirmed all this.
I then used the Sony color correction tool to adjust the gain / offset so the histograms of the two ADVC captures matched the Dazzle capture. At that point, what I found was that the Dazzle and the ADVC 7.5 ire captures looked quite similar, with the ADVC capture looking slightly sharper (lack of compression?). The ADVC ire = 0 capture had washed out blacks that were not correctable with the offset function of the color corrector. I'm thinking that the ADVC didn't capture all the black range with ire = 0.
The reds were still visibly shifted up slightly, but I can see vestiges of this in the live video if I know what to look for. This may be my AG-1970. A previous capture with a cheap Philips machine didn't show this at all. I'm still investigating.
Using my proc amp seemed to just complicate things and might have magnified the color shift upwards.
So..... I think for now I'll capture with the ADVC-55 set to 7.5 ire and omit the proc amp. Since I don't have any way to validate my proc amp settings prior to capture and because my captures are about 2 hours each, I think it makes more sense to do level and color correcting post-capture and use the video scopes in Vegas?
Whew. Nothing's easy with video, eh? At least I am having absolutely ZERO issues with Vegas. I came to Vegas from Premiere Elements, which was almost as bad as my Pinnacle experiences.
Capturing 7.5 IRE vs capturing 0 IRE: Capturing at 0 IRE keeps a greater range of the original, squashed into the same range in the digital domain.
Whatever the Canopus device thinks is "black" goes to digital 16.
Whatever the Canopus device thinks is "white" goes to digital 235.
The Canopus device will assume that:
blanking is at 0 IRE
sync tip is at -40 IRE
"white" is at 100 IRE, and
"black" is at either 7.5 IRE or 0 IRE, depending on the position of the switch.
Hence setting the switch to 0 IRE captures an extra 7.5% of the range. Whether this is "correct" or not depends on where black is in the original analogue video signal.
How does it map actual voltages to IRE? It finds and measures the voltage of sync tip and blanking, locks to these, and scales everything else proportionally. So simply trimming the bottom off the sync pulses (for example) would actually increase the brightness of the video captured by the ADVC. Whereas increasing the voltage of everything in proportion wouldn't make a jot of difference to the captured video.
This is from my own tests on an ADVC110. I assume the ADVC55 is the same.
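The mapping described above can be sketched numerically. This is a toy model of the behavior as described in this thread, not actual Canopus firmware; the function name and rounding are my own assumptions:

```python
# Toy model of the level mapping described above (an assumption based on the
# post, not Canopus' actual firmware): the device locks to sync tip and
# blanking, then scales so that "black" (7.5 or 0 IRE per the DIP switch)
# lands on luma 16 and "white" (100 IRE) lands on luma 235.

def ire_to_luma(ire, black_ire=7.5):
    """Map an IRE level to an 8-bit luma code under the assumed scaling."""
    white_ire = 100.0
    return round(16 + (ire - black_ire) * (235 - 16) / (white_ire - black_ire))

print(ire_to_luma(7.5, black_ire=7.5))  # 16  (black, switch at 7.5 IRE)
print(ire_to_luma(100, black_ire=7.5))  # 235 (white)

# With the switch at 0 IRE, a tape whose black really sits at 7.5 IRE
# comes in well above luma 16 -- consistent with the washed-out shadows
# reported in the 0 IRE test capture earlier in the thread:
print(ire_to_luma(7.5, black_ire=0))    # 32
```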
Thanks. I'm finding it very frustrating not being able to examine the incoming video. That said, I do have a decent general purpose scope in the basement. I can bring it up and see what it says about the video coming from the AG-1970. It isn't calibrated in IRE but I can translate from voltage levels. I will have to re-acquaint myself with the terms though....
However, it sounds like the ADVC is pretty automatic and should capture correctly provided the sync and blanking from the recorder is correct. Do you know if the recorder generates these or did they come from the camera (and are therefore a variable from one camera to the next)?
What you said explains why the video was washed out in dark areas when captured with IRE = 0.... I think. Since we're talking about analog video, I assume it was intended to have black = 7.5 IRE, right? Same as broadcast? So everything else being equal, the 7.5 setting should be the best match. And it seemed to be in my testing.
Any idea if there is a way to use the video scopes in Vegas Pro to examine live video? That would be super....
EDIT: Just had a look for waveform terminology and refreshed myself. If I am looking at a scope calibrated in volts instead of IRE, I assume that it is 1V from blanking to max video? And that would put the sync tip at 400mV below blanking? Or, with sync tip at 0, blanking would be +400mV and max video would be +1.4V relative to the sync tip?
Last edited by pgoelz; 12th Mar 2010 at 09:39.
For US NTSC:
sync = -40 IRE = -0.286V
blanking = 0 IRE = 0V
black = 7.5IRE = 0.054V
white = 100 IRE = 0.714V
(to three decimal places!)
Japanese NTSC has black = blanking = 0 IRE = 0V. Some camcorders use this in or out, even in the USA.
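Those voltages follow from the US NTSC span of 140 IRE (sync tip at -40 to white at 100) over 1 V peak-to-peak, so 1 IRE = 1/140 V. A quick sanity check of the table:

```python
# Sanity check of the voltage table above: 140 IRE spans 1 V peak-to-peak
# for US NTSC, so 1 IRE = 1/140 V (about 7.14 mV).

def ire_to_volts(ire):
    return ire / 140.0

print(round(ire_to_volts(-40), 3))  # -0.286  (sync tip)
print(round(ire_to_volts(7.5), 3))  # 0.054   (US black / setup)
print(round(ire_to_volts(100), 3))  # 0.714   (white)
```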
Unless you send the video through a TBC, the syncs you receive are those recorded on tape by the original camcorder - they may have their level "corrected" or clamped by the replay VCR, but usually not AFAIK. Those syncs recorded by the original camcorder could be way out or spot on.
A proc amp and TBC should be able to get exactly what you want. Or, as long as you don't lose parts of the range when capturing, do what you said: fix it in Vegas!
Last edited by 2Bdecided; 12th Mar 2010 at 10:23.
Thanks. I'll do some experimenting over the weekend. And that's what I thought.... as long as my video capture doesn't clip, the only thing I lose with a dark capture is bit depth..... i.e., I squash the range of black to "white" into a smaller number of steps. And conversely, the only thing I gain by using the proc amp is bit depth. The other things the proc amp can do can be done equally well post-capture. ??
I'm a fan of using as few devices in the signal chain as possible so I'm thinking that the proc amp might have been an unnecessary purchase. But I got it (Elite BVP4 plus) for $45 on Ebay so I'm not complaining.
I do wonder why the ADVC-55 has trouble when the AG-1970 TBC is on though. My monitor and my Dazzle have no trouble at all and the picture is rock solid (with TBC on or off, actually). The ADVC-55 frame wanders back and forth with the TBC on. I have noticed other oddities too. For example, the pitch of the preview audio shifts at times. Doing something like changing the TBC gain or switching the TBC on and off usually cause it to snap back to the correct pitch. I think this is related to the warbling preview audio I reported earlier. The captured audio seems OK from what I can tell.
You're better off using an analog proc amp to adjust colors and levels before they're digitized because that gives you an infinite number of steps between the extremes. Once the signal is digitized you've only got 8 bits to work with, 256 steps at most.
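The "256 steps" point can be made concrete with a quick sketch: stretch a dark 8-bit capture after the fact and count how many distinct levels survive. The input/output ranges here are made-up illustration values, not measurements from the thread:

```python
# Illustration of lost bit depth: if a dark capture only occupies luma
# 16..125 and you stretch it to 16..235 afterwards, only ~110 distinct
# codes cover a 220-level range, so adjacent output codes differ by 2
# (occasionally 3) -- visible as banding on smooth gradients.

def stretch(y, in_black=16, in_white=125, out_black=16, out_white=235):
    y = max(in_black, min(in_white, y))
    return round(out_black + (y - in_black) * (out_white - out_black) / (in_white - in_black))

levels = sorted({stretch(y) for y in range(16, 126)})
print(len(levels))  # 110 distinct output levels, not 220

gaps = {b - a for a, b in zip(levels, levels[1:])}
print(gaps)         # steps of 2 (and an occasional 3) between codes
```

An analog proc amp applied before the A/D fills the full range first, so every one of the 220 output codes can actually occur.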
Agreed. Trouble is that without a waveform monitor I am groping in the dark. The one tape I have trouble with at the moment has many clips one after the other, each with a different video level. It was shot outdoors as the sun set back in 1984 with a three tube pro shoulder cam and there wasn't enough light. So unless I ride the gain for two hours and accept that I will lose the beginnings and ends of the clips as I change the gain, I think that overall the best thing is to capture as-is and fix it later. The corrected video looks pretty good considering the source. But I have that proc amp staring at me so I might change my mind.
Is there a capture application that has some sort of live level display or histogram? That would be ideal. Interestingly, I'm finding that the capture applications out there are surprisingly clunky and feature-poor. I like VirtualDub, but I find it to be quirky and unstable. At the moment I am using iuVCR which at least has a timer to end an unattended capture.
I'm still puzzled by the warbling audio while monitoring a capture. This seems to be a playback issue since the capture at least plays fine and I don't drop frames. But it has me worried.
I just realized that I have probably been over thinking the waveform monitor thing. I assume that if I monitor the video capture, I can adjust the video level with my proc amp and be assured that if the whites don't clip I am not exceeding 100 IRE (level 255)? Unlike an analog system where whites just get.... whiter.
IRE 100 equates to luma 235. IRE 0 equates to luma 16. So there's a little head and tail room available. When converted to RGB for display the Y=16-235 range is usually expanded to RGB=0-255 to better match computer monitors. (Although I think Vegas defaults to not expanding the range for display.) You want your final product to have Y between 16 and 235.
And just when I was thinking I had it under control! How the heck do you get it really right? The Vegas scopes are calibrated 0-255 with no indication that black and white should be anything other than 0 and 255. It does have the ability to enforce broadcast levels but does so (I think) by clipping, not by gain adjustment.
Yeah, there is indeed (almost) enough noise to mask quantization issues. I have also found that in careful A/B comparisons, the proc amp subtly degrades the video. It seems to change the gamma very slightly and it seems to add some very subtle ringing to edges. So I won't use it unless I have to. Like unless the video is less than maybe 50 IRE.
Now to look into noise reduction software.
I've been gone a few days so I'm just catching up. The problem as I see it is you are chasing multiple variables and need to separate them.
I agree that Vegas' lack of live scope input monitoring is a pain. You have to capture samples then monitor playback. Assuming the ADVC-55 isn't malfunctioning, it will capture what it is sent. It will not AGC, which is very good. It is near impossible to manually correct an AGC'd capture.
First step is to calibrate the VCR with a known good tape (ideally a reference tape used in service shops). First, playback levels are set to the tape*, then a reference color bar is used to set the record path. Ideally you would have a color bar source to calibrate the path. The ADVC-100/110/300 will output analog color bars so you can calibrate your VCR.
You must have the Vegas scopes set for "Studio RGB" like this to get 16-235 reference.
The resulting waveform/vector display for a typical consumer VHS VCR playback looks like this using the recorded ADVC-100 color bar reference.
In the absence of a service tape or color bar, you can measure approximate playback levels from a quality commercial VHS tape but it has to be one without Macrovision. Good candidates are promo tapes from a good post house or quality advertising tapes. Compare various tape playbacks.
Only after you understand the calibration of your VCR with a "good" tape can you analyze a poor tape.
First connect the VCR direct to the ADVC and take a sample. If you add proc amps and/or TBC, these should be calibrated to a color bar as well.
* VCR luma output level is usually an internal adjustment. I and Q (UV) gain and phase get the color bar dots in their boxes.
Last edited by edDV; 16th Mar 2010 at 13:00.
From a composite input, most TVs don't AGC luma much, or at all - so a TV and an ADVC can give very different results when the sync pulses are non-standard. I'm not saying this is the problem in this case, just that it might be. Or one of them!
* unless a TBC was used during that dub.
AviSynth. Construct a capture graph with GraphEdit and use an AviSynth script to display a Histogram. Then use a media player to "play" the script.
DirectShowSource("Capture.GRF", audio=false, fps=29.97, framecount=99999)
Histogram("levels")  # overlays the levels histogram mentioned above
Last edited by jagabo; 16th Mar 2010 at 16:17.
Thanks guys, you are REALLY helping me sort out all the variables and get a handle on things. Yes, I have been trying to cope with too many variables. I think the bottom line for me is to capture with only the ADVC in the chain unless I absolutely have to boost the luminance beyond maybe 20%. The calibration and whatnot may not be that important since this is a one way trip.... VHS to digital. I may never record on the AG-1970 and playback is what it is. Unless I decide to tear into it and check the various adjustments against an alignment tape, it too is what it is. And I think it is pretty close to correct since most of my tapes play back correctly. I can fix the rest with the proc amp and/or the adjustments in Vegas. A waveform monitor during capture would be really helpful but I guess if I make sure the whites aren't clipping the regular capture monitor (I'm using iuVCR at present) will suffice.
I'll have a look at AviSynth. But GraphEdit? I have yet to make any sense of that program. But it looks like the waveform monitors in the Enotech DV Processor might also work if I can figure out how to use it to pass video on to the actual capture device.
Oh, and a question..... what is the norm for video that will be burned to DVD? 0-100 ire or 7.5-100?
Last edited by pgoelz; 16th Mar 2010 at 16:48.
With Vegas set to Studio RGB as shown above, 16 will show as zero on the waveform monitor and 235 will show as 100 (255 clipping happens around 107).
Eight bit 0-255 is used in graphics pre-production or for uncorrected computer display. Computer media players are mostly expecting 16-235.
*Ten bit 64-960 with 961-1023 overshoot is also widely used in production.
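That scope calibration can be written as a simple linear map. This is a sketch of the convention described above, not Vegas' internal code, and the ~107 clipping figure quoted earlier is an observation from the thread, not derived here:

```python
# Sketch of the Studio RGB scope convention described above:
# luma 16 reads 0 on the waveform monitor, luma 235 reads 100.

def luma_to_scope_ire(y):
    return (y - 16) * 100 / (235 - 16)

print(round(luma_to_scope_ire(16)))    # 0   (black)
print(round(luma_to_scope_ire(235)))   # 100 (white)
```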
Last edited by edDV; 16th Mar 2010 at 17:16.