Originally Posted by jzmax
An upscaler doesn't necessarily improve the video; it's just a display format converter.
Every HDTV sold has a built-in upscaler for its composite, S-Video and analog component inputs. CRT HD sets can display these inputs directly at 720x480, as they often do for progressive DVD. They can display interlaced or progressive, or upscale to 1080 or some other number of lines.
An LCD or plasma TV must convert the analog input to the native monitor resolution (say 1024x856, 1280x720 or any other number) and deinterlace it. This too is all built in.
An external upscaler attempts to do something similar, but it knows nothing about the TV's native display or native resolution; it only knows about HDTV input formats like 720p or 1080i. These boxes often make the image worse than just plugging the S-Video cable into the TV. Why upscale VHS to 1920x1080i only to have the TV downscale it to its internal display resolution, which is almost always far less than 1920x1080?
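To see why the extra hop can hurt, here is a rough sketch of the two signal paths. All resolutions are illustrative examples (the panel resolution in particular is just an assumed value), and the point is only that the external box adds a resampling generation:

```python
# Illustrative sketch: an external upscaler adds a resampling generation.
# All resolutions here are example values, not any particular product's specs.
SOURCE = (720, 480)          # digitized SD from the composite/S-Video input
BOX_OUTPUT = (1920, 1080)    # external upscaler's fixed HDTV output format
PANEL = (1366, 768)          # assumed LCD native resolution

def resample_passes(chain):
    """Each hop between two differing resolutions is one lossy resampling pass."""
    return sum(1 for a, b in zip(chain, chain[1:]) if a != b)

via_box = resample_passes([SOURCE, BOX_OUTPUT, PANEL])   # box upscales, TV downscales
direct = resample_passes([SOURCE, PANEL])                # TV scales once, internally
print(via_box, direct)  # 2 1
```

The intermediate 1920x1080 stage can never add detail the panel can show; it only adds a second round of scaling artifacts.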
We have been discussing capturing analog sources at the highest quality for preservation in an intermediate format that can be easily encoded with a future HD or SD codec. Future codecs will allow either greater quality or higher compression than today's DVD MPEG-2.
I'm not 100% comfortable with the dvd recorder route (not enough control), so...
I already have what I consider a very good capture device, an HP-branded Bali USB PVR which uses the highly acclaimed Broadcom BCM7040 chip (also used in TiVo units). Let me know if anyone has heard otherwise regarding this chip. This I will set at 15 Mbps, I-frame ONLY.
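For planning disk space, the arithmetic on a constant 15 Mbps capture is straightforward. A quick sketch (ignoring audio and container overhead):

```python
def gb_per_hour(mbps):
    """Approximate storage for a constant video bitrate, in decimal gigabytes."""
    # bits/sec * seconds / (bits per byte) / (bytes per GB)
    return mbps * 1e6 * 3600 / 8 / 1e9

print(round(gb_per_hour(15), 2))  # 15 Mbps I-frame-only -> 6.75 GB per hour
```

So a two-hour tape at that setting lands around 13-14 GB before audio.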
Additionally, I am going this week to purchase the HC90 with analog/digital passthru (as I can't find the Panasonic GS400 anywhere). I will try to post some comparisons shortly (in a week or so) or at least let you know narratively what my findings have been.
For what it is worth... here are some screen grabs from an MPEG-2 capture. This is a 15,000 kbps CBR, I-frame-only capture using the ADS Instant DVD 2.0 (a USB2 capture device with hardware MPEG encoding). The source is the pre-recorded WS VHS of ARMAGEDDON, straight from the VCR (a Toshiba 6-head Hi-Fi Stereo model) to the ADS Instant DVD 2.0 via composite video. The MPEG capture was loaded into VirtualDubMod and images were copied and pasted into Photoshop. I then used the SAVE FOR WEB feature to make these JPG images, and I tried to get them as close to 100kb as I could since that is the max file size (all are between 90kb and 100kb).
Remember this is a VHS video source.
Here are a few more from cable TV... the first two are from the National Geographic channel (a digital cable channel), the second two are from THE HISTORY CHANNEL, which is an analog channel even though it went through the "digital" cable box. These two captures were done "normal style", or in other words direct to a specific DVD-spec (I, B, P) MPEG-2 at 6000 kbps VBR. The pics are I-frames taken from VirtualDubMod to Photoshop to JPEG (as above). The connection: digital cable box to ADS Instant DVD 2.0 with composite video (no S-Video out on my digital box, thanks Comcast).
I had an INVADER ZIM capture (for those that wanted to see a CARTOON) but I deleted it by accident ... DOH !
- John "FulciLives" Coleman
I set my capture device at 15 Mbps, I-frame ONLY, and it seemed to hold its own. What I wanted to tell you is that it sent my Media Center PC into outer space: live TV just started going wacko, freezing, cracking, stuttering. When I put it back to the defaults it worked fine. Don't know what that means, just thought I'd report it.
Got a hold of a new Canon Optura instead of the HC90. Will take footage of the Giants game in a couple of hours to make sure it works. Then go ahead and try the analog-digital passthru on some VHS tapes and compare to the PC capture card route.
In a way, I'd say lordsmurf is at least partially correct. I have relatives in 6 states, know dozens of my neighbors, have visited hundreds of homes doing PC repairs and often hooking up TV setups. I've seen only 2 BluRay players, so far, and no 3D whatsoever.
There is some misinformation that needs to be cleared up first.
VHS is an ANALOG format, not digital. This is to your advantage.
1. VHS tape CAN record and play back resolutions higher than 300-400 lines. The standard is ~540 for NTSC and ~570 for PAL. The first limitation of VHS is the cross-magnetization of the tape and tape heads; the second limitation is the electronics of the VCR. In reality, a VHS (or, even better, a Betamax) tape will record whatever you give it. These are ANALOG formats, not digital, and there is NO set number of lines. The VCR simply requires standard-formatted video with fixed horizontal and vertical frequencies: NTSC runs at 29.97 frames per second, PAL at 25. A color subcarrier and its sidebands carry the color portion of your video (simplified explanation), operating on the same concept as FM stereo; the subcarrier is ~3.58 MHz for NTSC and ~4.43 MHz for PAL.
A fully analog VCR has no way of distinguishing between 350 lines and 720 lines, as long as the horizontal and vertical frequencies are correct. When the head rotates against the tape, it records what is sent to it! Some newer VCRs have digitizers designed to remove noise, and these may show a limitation, but your average 1980's-to-mid-1990's VHS deck does not digitize. Try a Sharp 19-micron-head model. I have had good luck with surveillance time-lapse recorders that have good heads; these were natively designed to play and record hi-res analog. Note, however, that they don't have Hi-Fi heads and can usually only play SP tapes. For low-speed tapes, I highly recommend Matsushita (Panasonic-type) VCRs from the early 1980's, which had special equalization circuits that maximized the quality of EP/LP tapes.
Consumer Betamax tape improves the black-and-white portion of the video. The color portion is down-converted, so it will wash more; Betamax is very much like an analog version of MPEG video, but because it is analog there are no blocks, only washing over. The main advantage of the Betamax system is that early VCRs used a rotating pickup on a fixed head drum: only the read heads made contact with the tape, while the metal drum stayed at a fixed position. This did not re-polish the tape, and it eliminated the horizontal jiggle commonly found on VHS and other formats.
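As a sanity check, the NTSC/PAL timing numbers mentioned earlier hang together arithmetically. A quick sketch (the NTSC color subcarrier is defined as 455/2 cycles per scan line, which is why it comes out to ~3.58 MHz):

```python
NTSC_FPS = 30000 / 1001      # ~29.97 frames per second
PAL_FPS = 25
NTSC_LINES, PAL_LINES = 525, 625

# Horizontal (line) frequencies follow directly from lines x frame rate
ntsc_line_rate = NTSC_FPS * NTSC_LINES   # ~15734 Hz
pal_line_rate = PAL_FPS * PAL_LINES      # 15625 Hz

# NTSC colour subcarrier is locked to the line rate at 455/2 cycles per line
ntsc_subcarrier = ntsc_line_rate * 455 / 2   # ~3.579545 MHz, the "3.58 MHz" carrier
print(round(ntsc_line_rate, 1), pal_line_rate, round(ntsc_subcarrier))
```

Any signal with these fixed line and field rates is "legal" as far as an analog VCR's servos are concerned, which is the poster's point about the deck not caring how much detail rides on each line.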
What your picture looks like when it is upscaled to 1920 x 1080 depends on three factors:
A. The Source
A VHS tape (or any other analog video tape like Betamax, Betacam, Betacam SP, U-Matic or U-Matic SP, to name a few) is only as good as the source video. If you recorded movies off cable TV using a cable box, you have noise and washing from the cable box and from amplifiers in the cable system. Many times in the 1980's, cable did NOT have stereo audio, even though the source was stereo. The cable box's coax output usually did not have an MTS modulator, and thus you were getting mono audio. Better cable systems had stereo on local channels, but most did not.
B. The cameras used
If you use LaserDisc as a source, there are many variations in quality. The biggest variables in conversion to LaserDisc are the recorder used to make the disc and the camera that transferred the video from the film. Beginning in 1986, bigger studios used Sony DIGITAL video recorders with resolutions of 1280 x 720 to transfer film to a master tape. This was uncompressed digital video and was considered near the best available in 1986. The source for these recorders was a high-definition camera (more correctly called a high-resolution, double-scan or triple-scan camera). In most cases the video was nothing more than NTSC with more lines, in analog, though in some very limited uses there was a parallel digital connection from the camera to the recorder. Remember, in 1986 nobody could really tell the difference, because monitors were usually the standard 525-line tube types. There were specialty wide-screen tubes, and with rear-projection TVs being very common, the higher resolution was visible to the high-end user. Mitsubishi made high-resolution big-tube sets capable of 720p in 1986!
The quality of LaserDiscs depends greatly on the source. Many were made from SVHS, U-Matic or Betacam tapes; many used 1-inch open reel. Pre-recorded VHS and Betamax tapes in the 1980's and early 1990's were made from either Betacam or LaserDisc. Thus, the tapes are only as good as what you put on them.
C. Over The Air TV and Cameras
The best television for upscaling was made in the 1960's and 1970's on 3/4-inch, 1-inch and 2-inch tape. Big network productions, like the Ed Sullivan Show, used native high-resolution three-gun tube systems, which scanned double the resolution of a normal TV broadcast, having 1050 horizontal lines. This made it easier to line up the red, green and blue tubes, and greatly improved the picture for the majority black-and-white audience. In that era there were NO digital time base correctors; everything was in sync. Primitive time base correction was an ANALOG hard disc drive or a tape delay loop.
In the 1980's most major TV stations used a digital time base corrector, which stored a few frames of video and would reject and re-time half frames from camera switching or video tape. TV stations in the 70's had Video Out and Sync In, meaning that all on-air cameras and video recorders had to operate on the same time base or you would get picture rolls. Smaller-market stations often had flipping, since digital time base correctors were way out of their budget. The biggest loss in the signal was the digital time base corrector, which looked great at the time, but many were limited to 540 lines; some had double resolution, maybe at the big networks. The big networks like ABC, CBS and NBC were still using a sync system.
Thus, digital degraded the analog signal, and upscaling will only show the maximum number of lines that the TBC put out. Many tube-type cameras of that era did run at higher resolution, which means that footage recorded straight from the camera to Betacam or U-Matic tape can be upscaled well.
2. The source material, not the medium, usually determines what an upscaled VHS or Beta tape looks like. Don't believe me? Try the output from an HDTV converter box, model STB7766C, sold under the brand names RCA and Venturer (STB7766G). The built-in Broadcom chipset puts out HD on the composite output, with the additional lines fit into the composite signal. Several other manufacturers do this as well. If you use good tape on an SVHS VCR or a Betamax, you should get a pretty good recording off of this in high def. Your tape and machine determine how grainy the picture looks: the better the tape and the smaller the particles, the better the picture. The same principle that applies to film applies to video tape.
It is possible to get High Definition video from a composite NTSC video signal. So far, the biggest problems I have had have been wide screen adaptation and comb filter issues.
But anyone that says that you cannot get high definition from VHS in analog is wrong!!!
Ironically, the worst video you can watch is 1990's DVCAM material, which used lossy DV compression. Upscaled, it looks like total crap, especially with the primitive video encoders of the 1990's. To see better resolution you have to go back to the 1980's: DVCAM SD looks like crap when upscaled, compared to analog Betacam SP or U-Matic.
What is funny is looking at how bad HD was in the 1990's and 2000's. I have some HD off-air digital captures from 2004 that look just awful in comparison to modern HD video in 2014, using the SAME decoder / capture card, too! We had a format, but we did not have CPUs fast enough to encode it well in the 1990's.
Want to improve the quality of your VHS upscale project? Denoise, then lightly re-noise in digital with small-pixel noise at a low level. The same applies to video as it does to audio!
First of all, this thread is nearly 10 years old, so the situation has likely passed.
Hypothetical non-standard ways to record video signals to a VHS tape aren't very interesting if no one has such tapes to convert...
Well, Blu-ray may have caught on for a while with commercial replay-only discs, but it never reached 'critical mass' to become a mainstream, cheap recordable format, and it is now slipping away as streaming and solid-state playback alternatives start to dominate the market.
In a desperate attempt to keep up, they have released a modified version of the Blu-ray spec to allow 1080/60p playback, but if we're honest it was a bit too late to make any real difference to the slow demise of the format.
Even Sony - who essentially control Blu-ray - stopped making Blu-ray burners in 2013, when they closed Optiarc. Bit of a clue there probably.....
Blu-ray never really lived up to what was initially expected of it, and as all optical discs slowly become relics of a past era, Blu-ray will probably fade away more quickly than either CD or DVD has done...
So Sony stopped making Blu-ray burners... they are still being sold like crazy, manufactured by other companies. So for those of us who burn Blu-rays, the format is going to be around for a while.
I can agree Blu-ray will fade away faster than CD/DVD, as those formats are decades old, and I can see that downloading, streaming and flash drives ARE the wave of the future. And as you all have noticed, as technology advances, the rate at which it advances keeps increasing as well.
It took what... 70-80 years for SD to go the way of the Dodo bird? We have been dealing with 1080p HD video on a wide scale for at most 20 years. 4K is next for consumer TV formats... how long before they phase those out in favor of the next format? 8K? 16K?
What has helped accelerate the demise of blu-ray has been the massive increase in internet download speeds, cheap storage, and many advances in the quality, speed and saturation of platforms.
I would even go so far as to say smartphones have helped accelerate the demise of the disc format. Much as MP3 became the preferred format for portable music players (or streaming), now that we have large increases in storage capacity and better compression, you have people who will fly somewhere and load up several movies on their smartphone or laptop to watch during the trip.
It's all good.
I just had to laugh at lordsmurf's authoritative "prediction".
And on this subject....if I get my hands on SD 720 x 480 video that I truly want to archive in highest quality on a disc format, and I have to convert it to another format to burn to disc, I will convert to uncompressed .avi, then save it on blu-ray.
No, I will NOT change the resolution. As pointed out by the OP, that was not the original intent of this thread. I just realized one day that if, say, I had a 100GB lossless .avi and had to compress it to 4GB to burn to a DVD, that is a compression ratio of approximately 25:1.
Whereas, if I compress it to approx. 25GB for a single-layer Blu-ray, I am going to have better quality, as I can generally encode the Blu-ray at a rate of up to 40 Mbps, way better than the ~8 Mbps practical for DVD.
Sure, the spec ceiling for DVD video is about 9.8 Mbps if I recall correctly, but that does not mean all DVD players will play discs encoded at the top end of the spec. I have found in general that if I want to ensure a DVD I give to someone is playable in their player, I have to back off on the encoding rate a bit, and 8 Mbps seems to work fine for maximum playability on the widest range of consumer DVD players.
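The trade-off above is easy to put in numbers. A quick sketch using nominal disc capacities (real discs lose a little to filesystem overhead, and audio is ignored):

```python
def minutes_on_disc(disc_gb, video_mbps):
    """Approximate runtime that fits on a disc at a constant video bitrate."""
    # GB -> bits, divided by bits/sec, converted to minutes
    return disc_gb * 8e9 / (video_mbps * 1e6) / 60

print(round(minutes_on_disc(4.7, 8)))   # DVD-5 at 8 Mbps: ~78 minutes
print(round(minutes_on_disc(25, 40)))   # BD-25 at 40 Mbps: ~83 minutes
print(round(minutes_on_disc(25, 20)))   # BD-25 at a gentler 20 Mbps: ~167 minutes
```

So a single-layer Blu-ray holds roughly the same runtime as a DVD while spending five times the bits on every second of video.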
Since we do not yet have discs that can hold an entire 2-hour movie in .avi lossless format (even for SD), it seems there is always going to be some sort of compression necessary to burn to disc. Once I got a blu-ray burner, I realized I no longer had to compress video as heavily as we do for the DVD format when desiring best quality.
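For scale, here is a rough estimate of what an "uncompressed" 2-hour SD movie actually weighs, assuming 8-bit 4:2:2 sampling (2 bytes per pixel), a common uncompressed capture format:

```python
def uncompressed_gb(width, height, bytes_per_px, fps, seconds):
    """Raw video size in decimal gigabytes, video only (no audio)."""
    return width * height * bytes_per_px * fps * seconds / 1e9

# 720x480 NTSC at ~29.97 fps for two hours, 4:2:2 (2 bytes/pixel assumed)
two_hours = uncompressed_gb(720, 480, 2, 30000 / 1001, 2 * 3600)
print(round(two_hours))  # ~149 GB, well beyond even a dual-layer Blu-ray
```

Which bears out the point: even for SD, no current disc holds a full feature losslessly, so some compression is always in play.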
Considering you can get 128GB flash drives for 25 bucks now, and blu-ray players typically have USB ports which can accept them, die-hard video quality fanatics can load uncompressed .avi files onto them and watch them if they want absolute lossless quality.
Of course we have 4k ramping up. And then who knows what after that....onward to the ceiling of reality....
Anyone have any idea what "reality quality" is?
I mean, sort of like audio... even WAV files are just digital audio samples at rates up to 96 kHz or so. What sample rate would be truly lossless, the same quality as what you hear in real life? No nitpicking about "well, if you hear an mp3 you are hearing it in real life".
What is the true lossless rate for video? Where you watch TV, and it looks as real as your eye would perceive it in real life? With no "samples", zero compression....just real, unadulterated video quality as perceived by the brain through your eyes? And of course, we are limited by the current state of display technology.
From the above link: "8K display resolution is the successor to 4K resolution. Full HD (1080p) is the current mainstream HD standard, with TV manufacturers pushing for 4K to become a new standard by 2017. The feasibility of a fast transition to this new standard is often questionable in view of the absence of broadcasting resources.
As of 2015, few cameras have the capability to shoot video in 8K, with NHK being one of the only companies to have created a small broadcasting camera with an 8K image sensor. Sony and Red Digital Cinema Camera Company are among the others to be working on bringing a larger 8K sensor in more of their product range in coming years. Until major sources are available, 8K is unlikely to become a mainstream resolution but filmmakers are pushing for 8K cameras in its advantage to get better 4K footage."
And here are some very recent articles on this subject:
We live in interesting times.
Especially for us A/V geeks.