I am trying to preserve some old Super8 and VHS home movies in a digital format. I could use the MPEG-2 authoring software that came with the capture card, but it seems too limiting. For example, I only have one channel of audio, and the software is not sophisticated enough to duplicate that mono channel to both the left and right speakers...so I end up with audio on only one speaker. I guess I could just buy a $2 part on Monoprice to do that, and maybe that's what I will do. I also experimented with VirtualDub and VirtualVCR. I could not get VirtualDub to work, but VirtualVCR does. However, the capture uses about 10 MB/second since it captures uncompressed at 720 x 480 (I think that was the only format supported by the capture device). I like the idea of capturing in an uncompressed format so that later I can always re-encode in the latest format, or filter, without any loss of quality from compression / decompression cycles. However, is it really worth it? Or am I just spending too much energy trying to preserve noise? Is MPEG-2 good enough, or should I store the content in a better codec?
I also have a Diamond VC500 capture card that does work in VirtualDub, but it uses the USB 2.0 interface. I bought the HD750 for the higher PCI-E bandwidth and the ability to do uncompressed captures. But maybe this is overkill...what do you all think?
Note the capitalisation, DB83. Should be more like 20 MB/s though.
USB 2.0 bandwidth is greater than required for uncompressed 4:2:2, and indeed the VC500 can do that.
Install Ut Video Codec and capture lossless instead of uncompressed. Once you have a capture you are then free to filter if you want and encode to your delivery format(s) of choice.
Uncompressed 8-bit standard definition video is on the order of 20 MB/sec, or 70 GB/hr, so if your captures use 10 MB/sec, some kind of compression is being applied, maybe one of the lossless codecs. If you plan to correct defects and clean up noise in your capture using AviSynth or VirtualDub, then capture using a lossless codec. If not, MPEG-2 capture may be good enough.
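For anyone who wants to check that figure, here's the back-of-the-envelope arithmetic, assuming 720x480, 8-bit 4:2:2 (2 bytes per pixel), and the NTSC frame rate:

```python
# Uncompressed SD data rate, assuming 720x480, 8-bit 4:2:2, NTSC 29.97 fps.
width, height = 720, 480
bytes_per_frame = width * height * 2      # 4:2:2 = 2 bytes per pixel
fps = 30000 / 1001                        # NTSC frame rate (~29.97)
rate_bps = bytes_per_frame * fps          # bytes per second
gb_per_hour = rate_bps * 3600 / 1e9       # decimal gigabytes per hour

print(f"{rate_bps / 1e6:.1f} MB/s, {gb_per_hour:.1f} GB/hour")
# about 20.7 MB/s and roughly 75 GB/hour (decimal), i.e. ~70 GiB/hour
```

So a 10 MB/sec capture is running at about half the uncompressed rate, consistent with a roughly 2:1 lossless compression ratio.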
I'm not sure how to use the Ut Video Codec with VirtualVCR. With VirtualDub it's pretty straightforward, but I can't get the HD750 to work with VirtualDub. I suppose that if the VC500 produces the same quality video / audio, I could just use that, although the product number implies that the HD750 is a better capture device. Does anyone know how to use third-party codecs with VirtualVCR?
The model numbers don't have anything to do with one another; for one thing the 750's name is inherited from ATI/AMD, a totally different company.
If you avoid automatic gain control issues, I would say that the 750 is better though.
What's the problem in VirtualDub? Set Video -> Preview if it's just a case of getting a black screen.
With VirtualDub, I get a black screen even after doing Video -> Preview.
I installed the Ut Video Codec and I can see the codecs in VirtualVCR, so I can capture the video using that. For VHS and Super8, which of the UtVideo codecs should I use? The Diamond HD750 PCI-E seems to capture natively at 720 x 480. I see a number of choices such as:
UtVideo YUV420 BT.709 (ULH0) VCM
UtVideo YUV422 BT.709 (ULH2) VCM
UtVideo RGBA (ULRA) VCM
Should I try to save file size by downsizing the resolution on the fly? Is that possible? Let me know if there is a better codec to use; I'm open to suggestions.
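To put a number on the potential savings, here's a rough storage comparison between full-D1 and half-D1 capture (half-D1 is just one common example of a reduced width), again assuming 8-bit 4:2:2 at NTSC 29.97 fps:

```python
# Rough storage comparison: full-D1 vs half-D1 uncompressed capture,
# assuming 8-bit 4:2:2 (2 bytes per pixel) at NTSC 29.97 fps.
FPS = 30000 / 1001
BYTES_PER_PIXEL = 2  # 4:2:2, 8-bit

def mb_per_sec(width, height):
    """Uncompressed data rate in MB/s (decimal megabytes)."""
    return width * height * BYTES_PER_PIXEL * FPS / 1e6

full_d1 = mb_per_sec(720, 480)   # ~20.7 MB/s
half_d1 = mb_per_sec(352, 480)   # ~10.1 MB/s
print(f"720x480: {full_d1:.1f} MB/s, 352x480: {half_d1:.1f} MB/s")
```

Halving the width roughly halves the data rate, but it throws away horizontal detail you can never get back, so for an archival capture most people keep the native 720 x 480 and rely on lossless compression for the size savings instead.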
Use YUV422 BT.601.
Should I use:
UtVideo YUV422 BT.601 (ULY2) DMO or
UtVideo YUV422 BT.601 (ULY2) VCM?
It takes about 10 MB/sec when using the compression codecs above. The output looks good, same as what I see through the viewfinder. However, are there some filters that I should apply? It looks like some anti-aliasing filter would help. The video looks pretty grainy. Once I post-process, how do I burn to a DVD?
VCM and DMO don't make any difference as far as quality or anything. DMO is the newer DirectShow technology, so it may perform 0.1% better.
Not sure why you would use antialiasing. Are you unfamiliar with interlacing?
I think people around here recommend AVStoDVD as a free DVD authoring app. Set it up to HCEnc. Paid software includes TMPGEnc Authoring Works etc.
Ok, I'll go with DMO. You're right on the interlacing...I found this helpful:
Looks like I'm best off using the referenced Deinterlace - Smooth filter.
EDIT: I did some additional reading. It appears that DVD is interlaced. If I plan to play the DVD on a modern Blu-Ray player connected to a 1080p HDTV, should I convert the video to progressive scan or leave as interlaced? (I would prefer to burn to Blu-Ray, but I don't have a Blu-Ray burner).
Last edited by LostEncoder; 2nd Jan 2014 at 12:03.
Authored Blu-Ray only supports interlaced standard definition content.
Just my opinion, but interlaced standard definition video should not be converted to progressive unless there is a good reason for it. HDTVs can deinterlace, and so can most hardware and software players.
Definitely leave it interlaced.
Simple BD authoring programs like tsMuxeR do not enforce this, if it even is a true requirement, and every BD player I've tried will play 480p video from a BD disc authored with tsMuxeR. But each person must make their own decision here: I'm working from memory and haven't provided any conclusive proof that 480p video is actually allowed on BD.
Real Blu-Ray authoring software could be a bit pickier than freeware. Blu-Ray players may be flexible about what they play, but they are only obligated to play what the Blu-Ray specification requires.
My plan is, first, to preserve the S-Video / VHS content at the highest possible quality in a digital format that will not degrade over time; I'll keep an unencrypted digital copy on a hard drive. The second objective is to burn it to DVD, possibly applying filtering to improve the video / audio. It appears that there is no need to filter or convert the format. Are there any filters anyone would recommend applying to S-Video / VHS content to enhance quality for DVD viewing on a 1080p HDTV? Based on the no-free-lunch principle, it sounds like the answer is no.