I am capturing old VHS and VHS-C tapes in order to convert them to DVD. I am in the US and am using all NTSC hardware. I am using the following equipment:
JVC SR-MV45 S-VHS/DVD combo VCR
Canopus ADVC-110
I am also using an old school crappy Sharp VC-A542 VCR for testing.
Here is my problem... I went through the hassle of capturing about 70 hours of old movies to my computer with the JVC and Canopus units. I set the Canopus unit to 7.5 IRE and captured away. The problem is that after burning my first movie I noticed the video was very dark. I attached the JVC unit to my LCD and compared the original VHS tape to the DVD and the DVD was much darker than the VHS tape (which looked good). I then did a straight VHS->DVD dub using only the JVC unit and the resulting DVD looked exactly like the original VHS tape when played back on my LCD. So something in my capture process was messing up the setup.
The capture->edit->burn process introduced a luma crush that I do not like.
Now here is the very strange part... I output the color bars from the Canopus unit to my JVC VCR and saved the bars to a VHS tape. I then played the color bars back on both the JVC unit *and* an old 8+ year old VCR that I had lying around, and captured the output to my computer. Both captured clips showed the correct stepping for the color bars. They were clamped around digital 16. However, when playing my old VHS tapes back on both units, the old VCR does not exhibit crushed blacks when captured, but the JVC does. I don't understand how this can happen. Here are a couple of screen caps:
Old VCR
New VCR
The scene I picked contains what I believe to be a pure black subject. The new VCR shows that subject at digital 16, while the old VCR shows it around digital 32. The crazy thing is that both VCRs produce the exact same output when displaying the color bars but completely different output when displaying my old VHS tapes. To be clear, the "Old VCR" image is the one that looks best when displayed on my TV. The "New VCR" image looks extremely crushed.
I initially thought "well maybe the old VHS tapes were recorded at IRE 0", after all the original video recorder is about 20 years old. However I dug up the recorder and it claims to conform to NTSC standards. The other really strange part is both VCRs display the same image (luma-wise) when playing back both the color bars and the video clips when connected directly to the TV.
I am really at a loss here. All I want is a DVD that looks exactly like the source VHS tapes. The crushed blacks in my current DVDs are unacceptable. I just can't figure out where my setup is wrong. Any help would be appreciated. I can post pictures of the color bars if that would help.
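For reference, the arithmetic behind a setup mismatch can be sketched like this. This is an editorial illustration, not anything from the thread; it assumes a simplified model of roughly 2.19 digital levels per IRE (the 219-level 16-235 span spread over 100 IRE), which real hardware only approximates.

```python
# Simplified model of where tape black lands after capture.
# Assumption (not from the thread): ~2.19 digital levels per IRE.
LEVELS_PER_IRE = 219 / 100

def captured_black(tape_setup_ire, capture_setup_ire):
    """Approximate 8-bit Y value where tape black lands after capture."""
    y = 16 + (tape_setup_ire - capture_setup_ire) * LEVELS_PER_IRE
    return max(0, min(255, round(y)))  # the digitizer clips at 0 and 255

print(captured_black(7.5, 7.5))  # matched setup: black lands at 16 (correct)
print(captured_black(0.0, 7.5))  # no-setup tape, 7.5 IRE capture: crushed to ~0
print(captured_black(7.5, 0.0))  # setup tape, 0 IRE capture: black near 32
```

Under this model, a tape recorded without setup but captured with the 7.5 IRE switch on loses its near-black detail against the bottom of the range, which matches the symptom described above.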
I agree that oldvcrho6.png is showing an approximately correct black level and that newvcrgy9.png is showing crushed blacks.
Note the bit of left edge blanking showing ~7.5ire below the level 16 marker for oldvcrho6.png. Whites appear to go to ~255 which is typical of consumer camcorder source. Broadcast/cable source will show a few to no spikes above 235.
It might give a clue if you posted the actual playback frame caps (or a few frames of DV-AVI) for the recorded color bars for each machine.
Recommends: Kiva.org - Loans that change lives. http://www.kiva.org/about -
The entire color spectrum appears to have been shifted down in the new pic. Are you sure it wasn't set to something weird like -7.5 by mistake?
--
"a lot of people are better dead" - prisoner KSC2-303 -
Originally Posted by aedipuss
I will post some short caps of the color bars from both VCRs in a bit.
Thanks for your help. -
OK, here are some short AVI (Huffyuv) clips of the color bars out of the Canopus, as well as a more recent VHS recording that was made.
http://rapidshare.de/files/39577800/clips.zip.html
Color bars clip
Description: Color bars output in 7.5IRE mode by the Canopus ADVC110.
Recorded: Canopus ADVC110 in Color Bar mode -> JVC VCR via composite for recording.
Captured: Using the new JVC VCR via S-Video and my old Sharp VCR via composite to the Canopus unit.
Result: It looks like the old VCR produces a black level that is 8 digital points higher than the new VCR. The old VCR's bottom-most black point is right at 16, while the new VCR's is at 8.
TV clip
Description: A TV clip recorded off of broadcast cable around 2001.
Recorded: With a standard consumer VCR (I don't have it anymore).
Captured: Using the new JVC VCR via S-Video and my old Sharp VCR via composite to the Canopus unit.
Result: Similar to the color bars. The old VCR's levels are about 8 points higher than the new VCR.
Camcorder clip
Description: VHS camcorder footage from around 1987. Could be multigenerational.
Recorded: With an old Zenith camcorder that was purchased back in 1983.
Playback: On the new JVC VCR via S-Video and my old Sharp VCR via composite to the Canopus unit.
Result: The old VCR is a full 16 levels higher than the new VCR.
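As a rough sanity check on the results above, the observed digital offsets can be converted back to IRE, using the same simplified assumption of ~2.19 levels per IRE (219 levels spanning 100 IRE). Notably, the 16-level camcorder offset comes out very close to the 7.5 IRE NTSC setup amount:

```python
# Convert an 8-bit level offset back to IRE under the assumption of
# ~2.19 digital levels per IRE (219 levels spanning 100 IRE).
def levels_to_ire(digital_levels):
    return digital_levels * 100 / 219

print(round(levels_to_ire(8), 1))   # color bars / TV clip offset: ~3.7 IRE
print(round(levels_to_ire(16), 1))  # camcorder clip offset: ~7.3 IRE, i.e.
                                    # almost exactly the 7.5 IRE NTSC setup
```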
Questions:
1. Would an S-Video connection make this much of a difference when compared to a composite connection?
2. The JVC VCR is considered a "professional" grade deck, so I would assume its output would be more "true" to the source; however, that is obviously not the case when capturing. The deck has probably seen 100-200 hours of playback. Is that a lot?
3. The really strange thing is that both VCRs look virtually identical when played back on an LCD screen, even if I put the Canopus between them. This black level crushing only seems to occur when capturing to my PC. Are the Canopus units known to have black level problems? -
Here is another lead... When I disable the TBC/NR functionality on the JVC VCR the captured video is much better (brighter) and matches the source on the other old VCR. So it looks like the deck's TBC feature causes the luma to drop. The strange thing is that this drop only shows up in captured video as far as I can tell. When I hook the VCR up to an LCD TV enabling/disabling the TBC does not seem to affect the luma of the video.
-
Strange. Does the JVC TBC have proc amp controls? Or a black level adjustment in the menus? -
Nope, there are only a few options in the menu and they are things like Digital R3 NR, Video Stabilization, etc. Nothing about black level.
Hmm, so would it be advisable to keep the TBC on and change the Canopus to 0 IRE? That results in a very good picture. -
Originally Posted by binister -
In my experience devices usually output much brighter video via composite (compared to s-video). Or is it that the TVs display composite much brighter than s-video? In any case, whenever I have had a device with both outputs, viewing via composite was always much brighter than with s-video.
-
Originally Posted by jagabo -
Originally Posted by edDV
-
I don't think the JVC is the problem. If I attach it directly to my LCD TV the luma looks fine. I have narrowed the problem down to the use of the JVC's TBC in conjunction with the Canopus ADVC110.
Doing the following:
JVC (TBC enabled) -> Canopus (via composite) -> LCD TV (via composite)
I see the same drop in luma across the board. So for whatever reason the signal being generated by the JVC when TBC is enabled is causing a problem in the Canopus. I also noticed the picture quality of the tapes is degraded when going through the Canopus. Is this normal after an analog->digital->analog conversion? -
Originally Posted by binister
I think the JVC output level (internal adjustment) is low through the TBC path. That is what the waveforms show. I also think the TV is auto-adjusting brightness back up, so you don't notice the low blacks as much. This is what separates a broadcast monitor from a TV. The broadcast monitor is supposed to show the reality of the signal (warts and all) so that you can visually fix the problem. A TV tries to improve and compensate for problems through various processing features. It isn't possible to turn all of these off in the TV menus. The VCR->Canopus ADVC->DV capture path should not alter reality so long as it is switched to 7.5 IRE.
Originally Posted by binister
To fully compare the JVC to LCD TV path to the JVC->Canopus ADVC-> DV file path, you can't use the computer monitor. You need to play the DV file back through the Canopus ADVC to the same LCD TV.
DV file->Canopus ADVC-> LCD-TV (S-Video or Composite)
As said above, the LCD-TV is not a good instrument for measuring levels but the pictures should look similar to the direct connection VCR-> LCD-TV (S-Video or Composite).
Is this JVC VCR new? If so, you should ask JVC support if black shift is normal when turning on the TBC. Also ask if this can be fixed under warranty. -
Originally Posted by edDV
JVC->Canopus@7.5IRE->TV (Old camcorder footage)
TBC ON: Significant drop in luma
TBC OFF: Correct luma
JVC->Canopus@7.5IRE->TV (Broadcast footage)
TBC ON: Correct luma
TBC OFF: Very very slight drop in luma but still looks good
JVC->TV (Old camcorder footage)
TBC ON: Correct luma
TBC OFF: Correct luma
JVC->TV (Broadcast footage)
TBC ON: Correct luma
TBC OFF: Very very slight drop in luma but still looks good
Originally Posted by edDV
After more testing I think I have determined that my old camcorder tapes were recorded without any North American NTSC setup (7.5 IRE) added. Almost all of the camcorder footage has significant bleeding into the 0-16 digital range when captured at 7.5 IRE. The broadcast footage that also happened to be on some of the tapes does not suffer from the bleeding.
Since the camcorder footage was all shot on a circa-1982 Zenith camcorder I am going to assume either it didn't have its IRE switch set to 7.5 or it just recorded everything at 0 IRE by default. My solution is to bump up the luma in AviSynth by 16 points since I captured all of it @ 7.5 IRE.
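The +16 bump described above is just an offset with clipping; here is a hypothetical Python illustration of the same operation on plain 8-bit luma samples (the actual fix in the thread was done in AviSynth, where something like ColorYUV(off_y=16) performs the shift):

```python
# Hypothetical sketch of the +16 luma bump on a list of 8-bit Y samples.
def bump_luma(samples, offset=16):
    """Shift every luma sample up by `offset`, clipping at 255."""
    return [min(255, s + offset) for s in samples]

print(bump_luma([0, 8, 16, 100, 235, 250]))  # -> [16, 24, 32, 116, 251, 255]
```

Note the trade-off visible at the top end: samples near 255 clip, which is why a shift like this is only safe when the source has headroom above its brightest content.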
One additional question I have... Many of these tapes were dubbed from VHS-C tapes to full VHS tapes using two old VCRs (1980s). Would it be possible for the IRE levels to get changed during that dubbing process if any IRE switches on the VCRs weren't set correctly?
Thanks again for your help. I called around to some local video shops to see if anyone had an analog waveform monitor that I could use to check the IRE levels on these camcorder tapes, but nobody in a 25 mile radius even owns a single piece of analog equipment! -
Complex issues but I think you have all the clues in there. I'll get back to this later today.
FYI: Consumer DV camcorders output analog black at 0 IRE, and in analog pass-through they map 7.5 IRE black to digital level 32. Watch this JVC black level tutorial for camcorder issues. TV and VHS-C recordings will have black at 7.5 IRE.
http://pro.jvc.com/pro/attributes/prodv/clips/blacksetup/JVC_DEMO.swf -
binister,
I think there are several issues interacting here:
1. TVs treat signals differently from recording devices. The ADVC110 correctly (like most VCRs) adjusts the gain of the entire signal with respect to the amplitude of the sync pulse: if the sync pulse is too big or too small, it re-gains the entire signal to bring it into range. Too large a sync pulse and the picture becomes darker; too small and it becomes brighter. The ADVC110 does this very accurately - most TVs don't.
2. TBCs often replace sync pulses.
3. Consumer camcorders often record the strangest things onto tape - non-compliant levels, non-compliant sync pulses, etc.
It's not hard to imagine how to combine these three problems together to give exactly the situation you describe.
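The sync-referenced gain in point 1 can be modeled with toy numbers. NTSC sync is nominally 40 IRE deep (blanking to tip); the simple linear AGC below is an illustrative assumption, not how any particular digitizer is documented to work:

```python
# Toy model of sync-referenced AGC in a digitizer.
NOMINAL_SYNC_IRE = 40.0  # nominal NTSC sync amplitude, blanking to tip

def agc_gain(measured_sync_ire):
    """Gain a digitizer would apply to normalize the measured sync."""
    return NOMINAL_SYNC_IRE / measured_sync_ire

# A TBC that regenerates sync slightly oversized darkens everything:
print(round(agc_gain(43.0) * 50, 1))  # a 50 IRE gray drops to ~46.5 IRE
# An undersized sync brightens everything:
print(round(agc_gain(37.0) * 50, 1))  # the same gray rises to ~54.1 IRE
```

This would explain why the JVC's TBC darkens the *captured* picture while the TV, which doesn't normalize against sync this precisely, shows no change.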
In the end, you just have to work with what you have. As long as it's not clipped beyond 0 or 255, you can make use of the headroom below 16 and above 235, which you can access via AviSynth and other tools - just be sure not to clip it off before restoring it (i.e. don't go via RGB or DirectShowSource, and don't use Levels without coring=false, etc.).
Fixing it in analogue would be preferable, since scaling things digitally can introduce banding. Given the amount of noise on your source, though, that isn't going to be a problem; and if you do a simple shift (i.e. add 16 to everything), banding isn't an issue anyway.
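A minimal sketch of that advice, using a hypothetical helper that mimics a Levels-style remap with coring disabled (as AviSynth's Levels does with coring=false): samples are rescaled first and clipped only at the 8-bit extremes, so detail below 16 survives the adjustment.

```python
# Hypothetical Levels-style remap without coring: rescale everything,
# clip only at 0 and 255, never pre-clamp to the 16-235 "legal" range.
def levels_no_coring(samples, in_low, in_high, out_low, out_high):
    scale = (out_high - out_low) / (in_high - in_low)
    return [max(0, min(255, round(out_low + (s - in_low) * scale)))
            for s in samples]

# Stretch a 0-235 source so black lands back at 16; the sample at 8
# is preserved as shadow detail instead of being cored away first:
print(levels_no_coring([0, 8, 117, 235], 0, 235, 16, 235))  # [16, 23, 125, 235]
```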
Cheers,
David. -
Originally Posted by 2Bdecided
Also, if you need to convert to RGB for filtering (using VirtualDub filters for example) you can always use the PC.601 matrix to preserve levels: ConvertToRGB(matrix="PC.601"). -
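The levels-preserving point can be illustrated for a neutral gray pixel (a simplified sketch showing only the luma-to-gray leg of the real matrix math):

```python
# Sketch for a neutral gray pixel (U = V = 128), luma leg only.
# Rec.601 maps Y 16-235 onto RGB 0-255, so Y below 16 clips away;
# a PC.601 (full-range) matrix maps Y 0-255 one-to-one for gray.
def y_to_rgb_gray(y, matrix="601"):
    if matrix == "601":            # limited range: 16-235 -> 0-255
        v = (y - 16) * 255 / 219
    else:                          # "PC.601": full range, identity for gray
        v = y
    return max(0, min(255, round(v)))

print(y_to_rgb_gray(8, "601"))     # sub-16 detail is lost: clips to 0
print(y_to_rgb_gray(8, "PC.601"))  # detail survives: stays at 8
```

This is exactly why a round trip through RGB with the standard matrix would destroy the below-16 headroom this thread is trying to recover.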
Originally Posted by 2Bdecided -
Thanks guys... I really appreciate it. I just bit the bullet and boosted the troublesome footage digitally. Interestingly the footage that was shot with different camcorders did not suffer from the problem. I think the camcorder our family used for years was just not set up right or didn't add NA setup.
The end result looks acceptable. Now it is time to clean up the noise a bit...
Cheers