Slowly working my way through digitisation of my VCR collection. All have now been captured ... most are captures of the original tapes ... however 2 of them are purchased output from professional videographers ...
So what I have is certainly not the original ... and with VHS generations that means Chroma problems and a whole heap of noise problems.
I'm hoping the experts here can take a look at the attached 2 samples of the first video, and tell me what I could do to improve.
I'd be very keen to understand the steps - what you analyze, and how, to work out what needs fixing. Keen to learn.
Almost a case of what is the workflow to follow.
The colours are washed out, there is a significant amount of Red Bleed ... very apparent toward end of first clip.
Plus on the whites (buildings and white belts) there are blue splashes & flashes ... and in 2nd clip an unnatural sky colour.
First mistake (IMHO): capturing VHS video as BFF into lossy DV using a SONY codec. A quality loss at the outset and possible chroma problems. If this vid was originally taken with a SONY camera, that might be another story. But it looks like VHS to me.
Others might have a different take on this, but that's where I'd start.

Our inventions are wont to be pretty toys, which distract our attention from serious things. They are but improved means to an unimproved end. -- Henry David Thoreau
I go about it like this:
(Fast workflow) - ADVC or similar Canopus/GV box or card into DV-AVI file, using WinDV, Scenalizer, or Sony Vegas capture tools, hopefully using Cedocida DV codec, light processing, then conversion to DVD, etc.
(Slow workflow) - Analog in (S-Video if possible, Composite if not) to losslessly compressed 4:4:4 or 4:2:2 AVI file (Lagarith usually), usually using card's capture software, lots and lots of processing & restoration (mainly using AVISynth), while continuing to use LL codecs up to and including the final end master. Then compress to common distribution formats for consumption.
If you've already got multi-generational problems on your VHS masters, IIWY, I'd lean toward the Slow workflow.
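For what it's worth, the "lots and lots of processing & restoration" stage of that Slow workflow usually just means an AVISynth script run against the lossless capture. A minimal sketch, assuming a PAL Lagarith capture - the filename and filter choices here are placeholders, not a prescription:

```avisynth
# Hypothetical starting point for the Slow workflow.
# Filename and settings are examples only -- tune per tape.
AviSource("capture_lagarith.avi")   # lossless capture from the card
AssumeTFF()                         # set to match your capture's field order
ConvertToYV12(interlaced=true)      # most restoration filters want YV12
# ... denoising, levels, chroma repair go here ...
# Keep saving intermediates with a lossless codec (e.g. Lagarith)
# until the final master, then encode once for DVD at the very end.
```

The whole point is that nothing lossy touches the video until that final encode.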
I'm sure Lordsmurf has some good nuggets of wisdom for you as well...
I am not sure what I did wrong here - can you explain further?
My steps were a capture of VCR via TBC and ADVC300 into a PC running WINDV
I thought that a straight capture into AVI was the best & cleanest way to start ?
I did not select any codec ... simply WINDV set to TypeII AVI
AVI is the container. If you are using WinDV, you must have used some flavor of DV codec. If you didn't custom install one of your own (Cedocida hopefully), you are using MS's or Sony's system default DV codec (so-so quality).
Maybe you have ProcAmp settings on the TBC wrong, maybe you have the ADVC hardware switches set wrong. Hard to tell. What TBC are you using? What connections? What are your switch settings on the ADVC?
Of course, with multi-gen VHS, you are GOING to get a significant amount of red bleed anyway, as well as luminance bloom.
Must have been system default Codec
As you know, I also have one of those boxes that LordSmurf loves to hate, aka an ADVC300. I only have a few captures on my system at present; one I definitely recall using WinDV for, but there is no trace of a Sony codec in that.
May be mistaken but I thought that Avisynth, if you plan to use it rather than clean up as best in Vegas, only accepts Type 1 DV. Of course you can convert it.
DV is a second best since it is lossy. Not as lossy as Mpeg2 but still lossy.
I used an ADVC 300 all captures were done with WINDV, not sure where the SONY codec came in.
I'll be honest I did not know I had to set any codec ... and that you just used WINDV as is.
However realistically is there going to be a huge loss by me having used SONY codec ?
Well, if you no longer have the equipment (from the other topic) you can hardly do much about it. (When I said 'no trace' I meant that it is not showing up in MediaInfo like yours is, but I do have Canopus DV codecs installed as well. Maybe the ADVC uses the Sony codec as default.)
With my eyesight (and you know how bad that is) the clips are not that bad, and I suspect that they can be cleaned up. Jeez, these guys worked magic with a really crappy YT vid the other day, and these have much more quality IMO than that.
Yep - h/w is no more, so the captures are what they are. (unfortunately)
I tried to follow advice here - went and bought an ADVC-300 and an AVT-8710 TBC ... captured all with WinDV ... obviously I missed a step with the codec.
Would appreciate any help on how I could improve the video ... I know there are some pretty knowledgeable guys here.
Well, about the Sony codec... whatever. I wouldn't capture VHS to lossy DV to begin with (whether at a high bitrate or not), but that's just me. Perhaps the cap device did use a Sony codec; I just took a cue from MediaInfo:
Complete name : E:\forum\tafflad2\Sandhurst - Clip 1.avi
Format : AVI
Format/Info : Audio Video Interleave
Commercial name : DV
ID : 0
Format : DV
Codec ID : dvsd
Codec ID/Hint : Sony
Complete name : E:\forum\tafflad2\Sandhurst - Clip 2.avi
Format : AVI
Format/Info : Audio Video Interleave
Commercial name : DV
ID : 0
Format : DV
Codec ID : dvsd
Codec ID/Hint : Sony
The blue isn't too difficult, but that hot Red up around RGB 175 is just nasty. Now, if I could just get a handle on masking techniques....
The color has pretty much turned to garbage. I played with it for a while, but will have to play some more tonight. At least there's not a lot of fuzzy noise and grain to worry about, but detail looks kinda lame and the levels are a mess. And there's that top-border flashing again! As they used to say: bummer.
I appreciate any help you can give
ADVC is a hardware device, so it uses the "codecs" that are built into its firmware. These happen to be Canopus codecs, which are better than average (and IMO, better than MS's, Sony's & Panasonic's) but maybe not the very best. The stream being captured by WinDV is ALREADY PRE-ENCODED to DV with that codec. WinDV's job is just to receive the incoming stream and save it into an AVI container (whether Type 1 or Type 2).
When you DECODE (incl. edit, play, convert) these captured AVI files, each will show as having a particular codec. But this is NOT the Canopus codec from the ADVC; it will just show whatever default system DV codec is installed on that particular machine. This could be MS's, or Sony's, or possibly something else (though that's rare). The files obviously weren't ENCODED by those codecs, but, unless you install new/replacement/additional DV codecs (like Canopus or Cedocida), the system has already mapped the "DVSD" fourcc to the default for decoding. That MAY be where the error is creeping in, though I doubt it.
Note that most post-installations of the Canopus DV codec DO NOT replace the mapping of DVSD, they just add a new fourcc mapping of "CDVC". One would have to use a codec filter manager to correctly modify this to your liking.
@Tafflad, if you are seeing the codec as Sony, that's because your current system default of 'DVSD' is mapped to an installed Sony DV codec. It was probably pre-installed with some other software (unless you're using a VAIO, etc - then it would be the OEM system DV codec). Overall, this whole "sony codec" thing is a red herring. Your main problems had more to do with wild TBC/ProcAmp or ADVC settings. (or the source tape was already this crappy-looking to begin with)
A listing of MediaInfo output, as well as thorough device model #s and settings, would probably help...
p.s. You don't have the source tapes anymore either?
Your AVT-8710 has (HAD) controls for: Brightness, Contrast, Color, Tint, Sharpness, as well as a switch for AGC on/off. My guess is that those controls were set wrong (all up?). That's the "proc amp" I was talking about.
p.s. Since you live in UK (PAL country), ADVC settings you chose WERE CORRECT.
Edit: "straight up" might be default unity, but that doesn't mean those controls were BYPASSED (their factory default/detent could have been off). And I have a feeling AGC was switched on...
Last edited by Cornucopia; 8th Oct 2013 at 16:57.
Agree, also on the ADVC-300 I could have changed a lot of settings via the panel switches or via the GUI interface (Picture Controller 300) but did not set any of them.
You can sharpen the colors a bit with something like MergeChroma(last, aWarpSharp(depth=20)). Then align them better with the luma with ChromaShift(c=4, l=-2). A little levels adjustment is probably called for: ColorYUV(gain_y=16, off_y=-16).
The reds still seem a bit overdone and bleed on the right edge. I didn't bother treating the video as interlaced; you should. There's another problem where the chroma misalignment is different for each field.
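Strung together, those suggestions (plus the 4-right/2-up shift mentioned later in the thread) would look something like the script below. The clip name is from the MediaInfo listing above; the values are the ones quoted here, not universal settings:

```avisynth
# Values are the ones suggested in this thread, not universal settings.
AviSource("Sandhurst - Clip 1.avi")
ConvertToYV12()                          # aWarpSharp needs YV12; note this
                                         # ignores interlacing, as noted above
MergeChroma(last, aWarpSharp(depth=20))  # sharpen the chroma channels only
ChromaShift(C=4, L=-2)                   # move chroma 4 px right, 2 px up
ColorYUV(gain_y=16, off_y=-16)           # mild luma levels adjustment
```

As jagabo says, for a proper job the clip should really be handled as interlaced before any of this.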
I'm not that worried about the codec, but thanks for the additional info (most of which I knew anyway about passing DV to the PC, but it might help the O.P.). The original video was obviously photo'd to tape and apparently duped to tape later with way too much noise reduction (DNR has its upside, folks, and its downside. Too much of it is all downside).
I'm still playing with that horrible red but I'll have to stop for a while and forage for food. Meanwhile just playing with levels alone made an improvement. I don't know how much of the problem the copy device itself was, but the vids do have a familiar "look" to them.
Edit: thanks for that, jagabo. Yeah, MergeChroma with aWarpSharp was a big help, but I had suspected that misalignment business just before I saw your post. Will have a closer look at that (not that I know what to do about it, but a closer look will tell me more). Thanks.
After sharpening the chroma I shifted it four pixels to the right and two pixels up.
jagabo. A quick question.
On many topics I read words to the effect of "if your source is interlaced then it should remain so". Surely de-interlacing is going to remove definition, especially when the final output will be interlaced.
I agree with de-interlacing for a progressive medium, but DVD is not usually so.
One problem is that this ends up treating the scan lines of a single field as if they were adjacent, when they really aren't. The problem becomes visible with some types of filters. For example, SeparateFields().LanczosResize(width*4, height*2).Weave() can give nasty artifacts on sharp horizontal edges.
But the "leave interlaced video interlaced" advice is somewhat outdated. QTGMC is so good, and filtering progressive frames is so much better than filtering interlaced frames (or even separated fields), that you often benefit from using QTGMC, processing the video as progressive, then re-interlacing at the end for DVD.
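As a sketch, that deinterlace-filter-reinterlace round trip might look like this. The filename and preset are placeholders, and QTGMC and its dependencies have to be installed separately:

```avisynth
AviSource("capture.avi")
AssumeTFF()                      # set to match your source's field order
QTGMC(preset="Slower")           # 25i -> 50p progressive frames (PAL)
# ... do all filtering here, on progressive frames ...
SeparateFields()
SelectEvery(4, 0, 3)             # keep fields matching the original TFF order
Weave()                          # back to 25i for the DVD encode
```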
Last edited by jagabo; 8th Oct 2013 at 18:18.
Following the principle of "do the least damage", leaving interlaced as interlaced throughout the processing chain is often desirable. However, if:
1. Your final destination display is progressive with bad deinterlacing algorithms, or
2. Your intended intermediate processing steps don't fully support interlaced mode processing, or
3. Your interlaced footage is being mixed in with mainly progressive footage into a final progressive title,
you would want to deinterlace early on in the process. For those deints that are temporary, you can separate the fields into frames, work on them as frames, then rework them into their original field order after processing.
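The "temporary" field separation in that last paragraph is straightforward in AVISynth; a minimal sketch (filename hypothetical):

```avisynth
AviSource("capture.avi")
AssumeTFF()             # preserve the original field order
SeparateFields()        # each field becomes a half-height frame
# ... spatial filtering here; avoid anything that resizes vertically ...
Weave()                 # fields recombine into the original interlaced frames
```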
End goal is a DVD that can be played in standalone DVD player connected to TV
My DVD player is set for 16:9 Progressive; in case it's relevant, it is connected using Component video to a Home Cinema amp, and that is connected to an LCD TV via HDMI.
However, this is partly for my parents, and their DVD player is not connected progressive - it has a straight SCART connection to an LCD TV.
You're worrying needlessly over interlace/telecine, etc. All home playback hardware is designed to handle interlaced video. Almost all video broadcasts over cable and OTA are interlaced. If your players or TV had a problem with that, it would be consistent over the entire system, because all players and all TVs are not "equal" in the way they handle common video sources. A TV that can't manage interlaced/telecined material well will not handle motion or other factors on progressive very well, either.
Computer monitors and most computer software are progressive; they don't distinguish interlaced from progressive content. Computer media players are designed to play back interlaced material properly, the way your set-top players and TVs do it. Like your set-top equipment, all media players are not equally competent in this area. For example, VLC doesn't deinterlace by default -- you have to enable deinterlacing in its preferences. Other media players differ as well.
If you buy a retail tape or DVD, they more than likely use some form of interlacing. BluRay differs: some BD formats are interlaced by design, some are not. It's up to your BD players to detect the method used and to playback properly.
YouTube videos are designed for progressive PC playback in browsers. YouTube deinterlaces everything that isn't already purely progressive, and it usually does a horrible job of it. This is why many users who target YouTube as their delivery format will convert their video to progressive themselves rather than let YouTube screw it up.
Right, I understand that. I don't refer to processing, but to the final output. With these two clips, there's no way they could be cleaned up without deinterlace/reinterlace, if for no other reasons than the top-border lightning flashes, dot crawl, and horrible definition.
I'm happy to follow any steps, as long as you guys can point out the sequence ... (or ideally show me what needs to be in a script).
Working up a few ideas now . . . . .
jagabo is off to a good start in post #17.