VideoHelp Forum
  1. Member (San Diego, CA, joined May 2005)
    I am capturing old VHS and VHS-C tapes in order to convert them to DVD. I am in the US and am using all NTSC hardware. I am using the following equipment:

    JVC SR-MV45 S-VHS/DVD combo VCR
    Canopus ADVC-110

    I am also using an old school crappy Sharp VC-A542 VCR for testing.

    Here is my problem... I went through the hassle of capturing about 70 hours of old movies to my computer with the JVC and Canopus units. I set the Canopus unit to 7.5 IRE and captured away. The problem is that after burning my first movie I noticed the video was very dark. I attached the JVC unit to my LCD and compared the original VHS tape to the DVD and the DVD was much darker than the VHS tape (which looked good). I then did a straight VHS->DVD dub using only the JVC unit and the resulting DVD looked exactly like the original VHS tape when played back on my LCD. So something in my capture process was messing up the setup.

    The capture->edit->burn process introduced a luma crush that I do not like.

    Now here is the very strange part... I output the color bars from the Canopus unit to my JVC VCR and saved the bars to a VHS tape. I then played the color bars back on both the JVC unit *and* an old-school VCR (8+ years old) I had lying around, and captured the output to my computer. Both captured clips showed the correct stepping for the color bars; they were clamped around digital 16. However, when playing my old VHS tapes back on both units, the old VCR does not exhibit crushed blacks when captured, but the JVC does. I don't understand how this can happen. Here are a couple of screen caps:

    Old VCR: [screen capture]

    New VCR: [screen capture]

    The scene I picked contains what I believe to be a pure black subject. The new VCR shows that subject at digital 16 and the old VCR shows it at around digital 32. The crazy thing is that both VCRs produce exactly the same output when displaying the color bars but completely different output when playing my old VHS tapes. To be clear, the "Old VCR" image is the one that looks best when displayed on my TV. The "New VCR" image looks extremely crushed.

    I initially thought, "Well, maybe the old VHS tapes were recorded at 0 IRE"; after all, the original video recorder is about 20 years old. However, I dug up the recorder and it claims to conform to NTSC standards. The other really strange part is that both VCRs display the same image (luma-wise) when playing back both the color bars and the video clips when connected directly to the TV.

    I am really at a loss here. All I want is a DVD that looks exactly like the source VHS tapes. The crushed blacks in my current DVDs are unacceptable. I just can't figure out where my setup is wrong. Any help would be appreciated. I can post pictures of the color bars if that would help.
  2. edDV (Northern California, USA, joined Mar 2004)
    I agree that oldvcrho6.png is showing an approximately correct black level and that newvcrgy9.png is showing crushed blacks.

    Note the bit of left-edge blanking sitting about 7.5 IRE below the level-16 marker in oldvcrho6.png. Whites appear to go to ~255, which is typical of a consumer camcorder source. A broadcast/cable source will show few to no spikes above 235.



    It might give a clue if you posted the actual playback frame caps (or a few frames of DV-AVI) for the recorded color bars for each machine.
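
    If you want to eyeball the levels without a waveform monitor, a quick AviSynth sketch works too (filename hypothetical; assumes a DV codec that AviSource can decode through):

    AviSource("jvc_colorbars.avi")   # hypothetical DV-AVI capture of the bars
    ConvertToYV12()                  # Histogram's "levels" mode wants planar YUV
    Histogram("levels")              # draws Y/U/V histograms with the 16 and 235 limits marked

    The levels overlay makes a shifted or crushed black easy to spot.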
  3. aedipuss (666th portal, joined Oct 2005)
    the entire color spectrum appears to have been shifted down in the new pic. are you sure it wasn't set to something weird like -7.5 by mistake?
  4. Member (San Diego, CA, joined May 2005)
    Originally Posted by aedipuss
    the entire color spectrum appears to have been shifted down in the new pic. are you sure it wasn't set to something weird like -7.5 by mistake?
    That is indeed what is happening with the new VCR, but only when it is captured through the Canopus to my PC. I have made sure the Canopus IRE DIP switch is set to 7.5 IRE. If I change it to 0 IRE then the resulting output looks like the old VCR's (but then the old VCR's output looks incorrect).

    I will post some short caps of the color bars from both VCRs in a bit.

    Thanks for your help.
  5. Member (San Diego, CA, joined May 2005)
    OK, here are some short AVI (Huffyuv) clips of the color bars out of the Canopus, as well as a more recent VHS recording.

    http://rapidshare.de/files/39577800/clips.zip.html

    Color bars clip
    Description: Color bars output in 7.5 IRE mode by the Canopus ADVC110.
    Recorded: Canopus ADVC110 in Color Bar mode -> JVC VCR via composite for recording.
    Captured: Using the new JVC VCR via S-Video and my old Sharp VCR via composite to the Canopus unit.
    Result: It looks like the old VCR produces a black level that is 8 digital points higher than the new VCR. The old VCR's bottom-most black point is right at 16; the new VCR's bottom-most black point is at 8.

    TV clip
    Description: A TV clip recorded off of broadcast cable around 2001.
    Recorded: With a standard consumer VCR (I don't have it anymore).
    Captured: Using the new JVC VCR via S-Video and my old Sharp VCR via composite to the Canopus unit.
    Result: Similar to the color bars. The old VCR's levels are about 8 points higher than the new VCR.

    Camcorder clip
    Description: VHS camcorder footage from around 1987. Could be multigenerational.
    Recorded: With an old Zenith camcorder that was purchased back in 1983.
    Playback: On the new JVC VCR via S-Video and my old Sharp VCR via composite to the Canopus unit.
    Result: The old VCR is a full 16 levels higher than the new VCR.

    Questions:
    1. Would an S-Video connection make this much of a difference when compared to a composite connection?
    2. The JVC VCR is considered a "professional" grade deck, so I would assume its output would be more "true" to the source; however, that is obviously not the case when capturing. The deck has probably seen 100-200 hours of playback; is that a lot?
    3. The really strange thing is that both VCRs look virtually identical when played back on an LCD screen, even if I put the Canopus between them. This black level crushing only seems to occur when capturing to my PC. Are the Canopus units known to have black level problems?
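
    For anyone who wants to check the levels on the posted clips, a rough AviSynth sketch (filenames hypothetical) is:

    oldcap = AviSource("sharp_composite_cap.avi").ConvertToYV12()   # hypothetical Huffyuv capture from the old Sharp
    newcap = AviSource("jvc_svideo_cap.avi").ConvertToYV12()        # hypothetical Huffyuv capture from the JVC
    StackVertical(Histogram(oldcap, "levels"), Histogram(newcap, "levels"))

    Stacking the two level histograms makes the black-level offset between the decks obvious.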
  6. Member (San Diego, CA, joined May 2005)
    Here is another lead... When I disable the TBC/NR functionality on the JVC VCR the captured video is much better (brighter) and matches the source on the other old VCR. So it looks like the deck's TBC feature causes the luma to drop. The strange thing is that this drop only shows up in captured video as far as I can tell. When I hook the VCR up to an LCD TV enabling/disabling the TBC does not seem to affect the luma of the video.
  7. edDV (Northern California, USA, joined Mar 2004)
    Strange. Does the JVC TBC have proc amp controls? Or a black level adjustment in the menus?
  8. Member (San Diego, CA, joined May 2005)
    Nope, there are only a few options in the menu and they are things like Digital R3 NR, Video Stabilization, etc. Nothing about black level.

    Hmm, so would it be advisable to keep the TBC on and change the Canopus to 0IRE? That results in a very good picture.
  9. edDV (Northern California, USA, joined Mar 2004)
    Originally Posted by binister
    Nope, there are only a few options in the menu and they are things like Digital R3 NR, Video Stabilization, etc. Nothing about black level.

    Hmm, so would it be advisable to keep the TBC on and change the Canopus to 0IRE? That results in a very good picture.
    If that is what the JVC is putting out, use the 0 IRE setting. Of course, any direct playback from the JVC to an NTSC TV will show crushed blacks, same as a DV camcorder. JVC should explain this. Call tech support and ask them why they sell 0 IRE black output models in the USA. They probably sent you a Japanese model by mistake. Or maybe in the setup menus you can set USA or Japan in preferences.
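
    Rough numbers for why the switch matters: 7.5 IRE of setup is about 0.075 x 219, i.e. roughly 16 steps of the 16-235 digital luma range. With the ADVC switch at 7.5 IRE the converter subtracts that pedestal, so a tape recorded without setup gets its blacks pulled down toward 0. With the switch at 0 IRE nothing is subtracted, so a tape recorded with setup ends up with black around digital 32. Either way you end up roughly 16 levels off, which lines up with the offsets you measured.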
  10. In my experience devices usually output much brighter video via composite (compared to s-video). Or is it that the TVs display composite much brighter than s-video? In any case, whenever I have had a device with both outputs, viewing via composite was always much brighter than with s-video.
  11. edDV (Northern California, USA, joined Mar 2004)
    Originally Posted by jagabo
    In my experience devices usually output much brighter video via composite (compared to s-video). Or is it that the TVs display composite much brighter than s-video? In any case, whenever I have had a device with both outputs, viewing via composite was always much brighter than with s-video.
    Composite and S-Video should output identical luminance values. It could be a TV settings issue. This is where a standalone waveform monitor is useful.
  12. Originally Posted by edDV
    Originally Posted by jagabo
    In my experience devices usually output much brighter video via composite (compared to s-video). Or is it that the TVs display composite much brighter than s-video? In any case, whenever I have had a device with both outputs, viewing via composite was always much brighter than with s-video.
    Composite and S-Video should output identical luminance values. It could be a TV settings issue. This is where a standalone waveform monitor is useful.
    Yes, in theory they should be the same. But I've done it with many devices and many CRT based TVs (all consumer level, not pro) and always found the same thing: composite displays brighter. I'm pretty sure I've seen this capturing from my cable box with a Hauppauge PVR-250 too.
  13. Member (San Diego, CA, joined May 2005)
    I don't think the JVC is the problem. If I attach it directly to my LCD TV the luma looks fine. I have narrowed the problem down to the use of the JVC's TBC in conjunction with the Canopus ADVC110.

    Doing the following:

    JVC (TBC enabled) -> Canopus (via composite) -> LCD TV (via composite)

    I see the same drop in luma across the board. So for whatever reason the signal being generated by the JVC when TBC is enabled is causing a problem in the Canopus. I also noticed the picture quality of the tapes is degraded when going through the Canopus. Is this normal after an analog->digital->analog conversion?
  14. edDV (Northern California, USA, joined Mar 2004)
    Originally Posted by binister
    I don't think the JVC is the problem. If I attach it directly to my LCD TV the luma looks fine. I have narrowed the problem down to the use of the JVC's TBC in conjunction with the Canopus ADVC110.

    Doing the following:

    JVC (TBC enabled) -> Canopus (via composite) -> LCD TV (via composite)

    I see the same drop in luma across the board.
    But with TBC off you get good blacks and with TBC on you get the waveforms above?

    I think the JVC output level (internal adjustment) is low through the TBC path. That is what the waveforms show. I also think the TV is auto-adjusting brightness back up, so you don't notice the low blacks as much. This is what separates a broadcast monitor from a TV. The broadcast monitor is supposed to show the reality of the signal (warts and all) so that you can visually fix the problem. A TV tries to improve and compensate for problems through various processing features, and it isn't possible to turn all of these off in the TV menus. The VCR -> Canopus ADVC -> DV capture path should not alter reality so long as it is switched to 7.5 IRE.

    Originally Posted by binister
    So for whatever reason the signal being generated by the JVC when TBC is enabled is causing a problem in the Canopus. I also noticed the picture quality of the tapes is degraded when going through the Canopus. Is this normal after an analog->digital->analog conversion?
    We have demonstrated (I think) that there is no black shift with the JVC TBC turned off and there is a black shift (crushed) when you turn the TBC on. This indicates internal adjustment problems in the VCR (specifically black level aka "brightness").

    To fully compare the JVC to LCD TV path to the JVC->Canopus ADVC-> DV file path, you can't use the computer monitor. You need to play the DV file back through the Canopus ADVC to the same LCD TV.

    DV file->Canopus ADVC-> LCD-TV (S-Video or Composite)

    As said above, the LCD-TV is not a good instrument for measuring levels but the pictures should look similar to the direct connection VCR-> LCD-TV (S-Video or Composite).

    Is this JVC VCR new? If so, you should ask JVC support whether a black shift is normal when turning on the TBC. Also ask if this can be fixed under warranty.
  15. Member (San Diego, CA, joined May 2005)
    Originally Posted by edDV
    But with TBC off you get good blacks and with TBC on you get the waveforms above?
    JVC->Canopus@7.5IRE->TV (Old camcorder footage)
    TBC ON: Significant drop in luma
    TBC OFF: Correct luma

    JVC->Canopus@7.5IRE->TV (Broadcast footage)
    TBC ON: Correct luma
    TBC OFF: Very very slight drop in luma but still looks good

    JVC->TV (Old camcorder footage)
    TBC ON: Correct luma
    TBC OFF: Correct luma

    JVC->TV (Broadcast footage)
    TBC ON: Correct luma
    TBC OFF: Very very slight drop in luma but still looks good

    Originally Posted by edDV
    I also think the TV is auto adjusting brightness back up so you don't notice the low blacks as much.
    I think this is the case with my LCD TV as the black level on it is different when compared to an old CRT I have.

    Originally Posted by edDV
    Is this JVC VCR new? If so you should ask JVC support if black shift is normal when turing on the TBC. Also ask if this can be fixed under warranty.
    Yep, the VCR was purchased new at the beginning of this year. I have a call out to JVC to find out about its black level outputs.

    After more testing I think I have determined that my old camcorder tapes were recorded without any North American NTSC setup (the 7.5 IRE pedestal) added. Almost all of the camcorder footage has significant bleeding into the 0-16 digital range when captured at 7.5 IRE. The broadcast footage that also happened to be on some of the tapes does not suffer from the bleeding.

    Since the camcorder footage was all shot on a circa-1982 Zenith camcorder I am going to assume either it didn't have its IRE switch set to 7.5 or it just recorded everything at 0 IRE by default. My solution is to bump up the luma in AviSynth by 16 points since I captured all of it @ 7.5 IRE.
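
    For reference, a minimal AviSynth sketch of that kind of bump (filename hypothetical, assuming the Huffyuv capture opens through AviSource):

    AviSource("camcorder_tape01.avi")   # hypothetical Huffyuv capture, opened in YUV
    ColorYUV(off_y=16)                  # adds 16 to every luma value; no 16/235 coring, just a plain offset

    ColorYUV keeps it a straight shift; Tweak(bright=16) would clip to 16-235 because its coring is on by default.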

    One additional question I have... Many of these tapes were dubbed from VHS-C tapes to full VHS tapes using two old VCRs (1980s). Would it be possible for the IRE levels to get changed during that dubbing process if any IRE switches on the VCRs weren't set correctly?

    Thanks again for your help. I called around to some local video shops to see if anyone had an analog waveform monitor that I could use to check the IRE levels on these camcorder tapes, but nobody in a 25-mile radius even owns a single piece of analog equipment!
  16. edDV (Northern California, USA, joined Mar 2004)
    Complex issues but I think you have all the clues in there. I'll get back to this later today.

    FYI: Consumer DV camcorders output black at 0 IRE on their analog outputs, and in analog pass-through they digitize 7.5 IRE black to digital level 32. Watch this JVC black level tutorial for camcorder issues. TV and VHS-C sources will have black at 7.5 IRE.
    http://pro.jvc.com/pro/attributes/prodv/clips/blacksetup/JVC_DEMO.swf
  17. 2Bdecided (United Kingdom, joined Nov 2007)
    binister,

    I think there are several issues interacting here:

    1. TVs treat signals differently from recording devices. The ADVC110 correctly (like most VCRs) adjusts the gain of the entire signal with respect to the amplitude of the sync pulse, so if the sync pulse is too big or too small, it re-gains the entire signal to bring it into range. Too large a sync pulse: the picture becomes darker. Too small a sync pulse: the picture becomes brighter. The ADVC110 does this very accurately; most TVs don't.

    2. TBCs often replace sync pulses.

    3. Consumer camcorders often record the strangest things onto tape: non-compliant levels, non-compliant sync pulses, etc.

    It's not hard to imagine how these three problems combine to give exactly the situation you describe.

    In the end, you just have to work with what you have. As long as it's not clipped beyond 0 or 255, you can make use of the headroom below 16 and above 235, which you can access via AviSynth and other tools. Just be sure not to clip it off before restoring it (i.e. don't go via RGB or DirectShowSource, don't use Levels without coring=false, etc.).

    Fixing it in analogue would be preferable, since scaling things digitally can introduce banding. However, given the amount of noise on your source, this isn't going to be a problem. If you do a simple shift (i.e. add 16 to everything) it won't be a problem anyway.
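
    A minimal AviSynth sketch of the rescale option (filename hypothetical), which pulls the sub-16 detail back into legal range without pushing whites past 235:

    AviSource("camcorder_tape01.avi")            # stays in YUV, so nothing below 16 has been thrown away yet
    Levels(0, 1.0, 235, 16, 235, coring=false)   # maps 0 -> 16 and leaves 235 at 235; coring=false stops Levels pre-clipping

    The simple shift mentioned above (e.g. ColorYUV(off_y=16)) is the alternative if you would rather not rescale at all.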

    Cheers,
    David.
  18. Originally Posted by 2Bdecided
    3. Consumer camcorders often record the strangest things on to tape - non compliant levels, non compliant sync pulses etc...

    In the end, you just have to work with what you have.
    I agree. Consumer devices are often out of spec. And when working with bad sources you do what works, not what is technically correct.

    Originally Posted by 2Bdecided
    (i.e. don't go via RGB or DirectShowSource, don't use Levels without coring=false, etc.)
    I think DirectShowSource() is undeserving of its bad reputation. If you use a decoder that outputs YUV there is no problem (as far as black/white crush is concerned). I suspect the bad reputation comes, at least in part, from the use of the Panasonic DV codec which always decodes to RGB with rec601 luma expansion. You can always check the colorspace with AviSynth's Info() command.

    Also, if you need to convert to RGB for filtering (using VirtualDub filters for example) you can always use the PC.601 matrix to preserve levels: ConvertToRGB(matrix="PC.601").
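
    A small sketch of that round trip (source name and filter placement hypothetical):

    AviSource("capture.avi")          # Huffyuv/DV source, decoded to YUV
    # Info()                          # uncomment to check which colorspace the decoder handed you
    ConvertToRGB(matrix="PC.601")     # full-range matrix, so luma below 16 and above 235 survives the trip to RGB
    # ... apply VirtualDub-style RGB filters here ...
    ConvertToYV12(matrix="PC.601")    # back to YUV with the same matrix, levels untouched

    Using PC.601 in both directions avoids the usual 16-235 expansion and the black/white clipping that comes with it.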
  19. edDV (Northern California, USA, joined Mar 2004)
    Originally Posted by 2Bdecided

    2. TBCs often replace sync pulses
    That could explain the level shift when the JVC TBC is turned on.
  20. Member (San Diego, CA, joined May 2005)
    Thanks guys... I really appreciate it. I just bit the bullet and boosted the troublesome footage digitally. Interestingly, the footage that was shot with different camcorders did not suffer from the problem. I think the camcorder our family used for years was just not set up right or simply didn't add the NTSC setup.

    The end result looks acceptable. Now it is time to clean up the noise a bit...

    Cheers


