"I really wish Laserdisc [were] more like vinyl [in] that the better the equipment you played it on, the better the video you got out of it but I'm afraid the video is the video and it's stuck in that quality" - Mat Taylor (Techmoan)
There were experiments with television at over 1,000 lines of resolution as far back as WWII, see this page. Video recording was first introduced in 1956; this was the first time that pictures could be recorded on a reusable medium, and videotape was also the first medium that could record anything photographic that could be viewed immediately.
Throughout the era of analog video, many TV shows were shot on film, a medium not intended to be reused, that is, the medium used for shooting the action was intended to be archived. This gave more flexibility in editing: film, whether color or black and white, can be edited by cutting and splicing between any two frames. The edited content would then be copied, using a telecine, to some video format for playback by television stations during broadcast, and also for video releases for rent in video shops or sale by video retailers.
Analog video does exist in higher definitions than were ever used for all-analog (uncompressed) broadcasts(*), but until the 21st century, when digital cinema became viable, anything made for cinema (mostly feature films) was not only shot and edited on film: movie studios were also forced to copy the result to additional film prints and sell or lease them to cinemas, which ran them through projectors with very bright lights shining through the film and shutters flipping the projected beam on and off; it had to be off between frames.(**) They did not copy to some high-definition analog video format and distribute it to cinemas for playback through high-definition video projectors.
(*) Remember, those early high-definition broadcasts in Japan were not using an all-analog system but a partly digital one with some form of compression.
(**) Usually the projector plays back 24 frames per second, and in that case each frame has to be shown multiple times. Modern flat-panel displays and video/data projectors, by contrast, are continuous, going straight from one frame to the next without the light being interrupted.
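The arithmetic behind the multiple-showings point is simple. A sketch, assuming the common two- and three-blade shutter designs (the ~50 Hz flicker-fusion figure is a rough rule of thumb, not from the post above):

```python
# Film runs at 24 frames per second, but 24 Hz flicker is visible.
# Multi-blade shutters flash each frame more than once to push the
# flicker rate above the eye's flicker-fusion threshold (~50 Hz).
FRAME_RATE = 24
for blades in (1, 2, 3):  # shutter openings per frame
    print(f"{blades} flash(es) per frame -> {FRAME_RATE * blades} Hz flicker")
```

Two-blade shutters give 48 Hz and three-blade shutters 72 Hz, which is why the light can be "off between frames" without visible flicker.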
Last edited by Xanthipos; 3rd Jan 2022 at 11:04. Reason: right word
Is there a technical reason why 4K analog video, let alone 8K, does not exist?
By the way, with fast enough pulse width modulation, the light may well be continuous due to persistence of screen glow.
I can still relatively easily start up my CRT monitors and program a new video mode so I can display a 2K-resolution video signal in an analog way. I can't easily produce a higher-resolution signal due to source limitations (but these can be relatively easily overcome).
Btw, laser printers may be considered an alternative ultra-high-resolution video source: at 600 DPI you can think of a laser printer as a 4K video signal. If you replace the paper with a mirror able to scan in the vertical direction, you can start displaying lines, so laser beamers may be considered analog video sources nowadays.
Analog video in what was then standard definition already has a bandwidth of over 4 MHz; doubling the number of lines without changing the aspect ratio or frame rate means quadrupling that bandwidth. All feasible video recording in the pre-digital era used frequency modulation, the only feasible way to record any video signal, so recording an analog video signal with four times the bandwidth might well need more than four times the writing speed.
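To put numbers on the quadrupling claim, a rough sketch (the 4.2 MHz NTSC luma figure and the assumption that bandwidth is proportional to total pixel rate are mine, not from the post above):

```python
# Rough scaling model: luma bandwidth grows with total pixel rate.
# Doubling the line count at a fixed aspect ratio also doubles the
# horizontal detail, so bandwidth grows with the square of the ratio.
BASE_LINES = 480          # active lines in a 525/60 system
BASE_BANDWIDTH_MHZ = 4.2  # approximate NTSC luma bandwidth

def scaled_bandwidth_mhz(lines: int) -> float:
    ratio = lines / BASE_LINES
    return BASE_BANDWIDTH_MHZ * ratio ** 2

for lines in (480, 960, 1920):
    print(f"{lines} lines -> ~{scaled_bandwidth_mhz(lines):.0f} MHz")
```

Doubling 480 lines to 960 quadruples the bandwidth to roughly 17 MHz, and 1920 lines would need on the order of 67 MHz, before even considering the FM recording overhead.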
I also noted that cinema movies, right until digital cinema was viable, were not only shot and edited on film but were copied to film and cinemas ran them through projectors. So (slower) 35mm film and especially 70mm film apparently has a higher level of detail than any feasible analog video recording.
But somehow digital video can be 4k or even 8k. Keep in mind that digital video in old standard definition has a greater signal bandwidth than analog video with the same line and framerate.
No, according to the page about the use of pulse width modulation, LCD monitors have fluorescent backlights. These each have a layer of phosphor to produce white light. A high enough switching frequency relative to the length of the phosphor persistence will mean continuous light.
The closest to your wishful thinking was saying CFL flicker isn't as sharp as LED flicker because CFLs have, as you said, a phosphor persistence thing going on.
My Benq monitor is LED edge backlit. Is CFL backlighting used much these days? Especially for LCD TVs boasting local backlight dimming. I'd assume they'd have to be LED.
And of course there's Pulse-width modulation (PWM) in OLED displays.
Mind you, assuming they're really flicker free, flicker free backlighting is becoming a thing. I assume to solve the flicker problem that doesn't exist.
Also, Laserdisc, even when played back almost verbatim, is hampered by being a composite video format. Every video signal, even in the analog domain, follows a technical standard; analog audio does not work like that.
(*) I'm not sure how the two standards compare on extended-play discs, but a standard-play NTSC disc stores only 525 lines per turn while a standard-play PAL disc must store 625 lines per turn; in both cases, all the blanking intervals need to be recorded. Storing one frame per turn allows all-analog trick play.
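The one-frame-per-turn (CAV) geometry fixes the playing time directly. A sketch, using the commonly quoted figure of about 54,000 turns per side:

```python
# CAV Laserdisc stores one frame per revolution, so the spindle speed
# equals the frame rate (1800 rpm for NTSC, 1500 rpm for PAL).
def cav_side_minutes(frame_rate: float, turns_per_side: int = 54_000) -> float:
    return turns_per_side / frame_rate / 60  # one frame per turn

print(f"NTSC CAV: {cav_side_minutes(30):.0f} minutes per side")
print(f"PAL  CAV: {cav_side_minutes(25):.0f} minutes per side")
```

That works out to the familiar 30 minutes per NTSC CAV side (36 for PAL), which is exactly why the extended-play (CLV) variant exists.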
Lots of wrong information and poor understanding so far....
Firstly, PWM is a digital control system. It MAY be used to control backlight levels, but it isn't practical to use it for video levels, for two reasons: (1) it requires the PWM period to be divided into shorter time slots, and the number of slots determines the resolution and hence the number of different light levels that can be attained, so it has limits in the luminance and chrominance levels it can produce; and (2) in order that the 'slots' can be occupied (=1) or empty (=0) fast enough to represent a pixel level, they need to repeat very rapidly, typically eight to ten times faster than the pixel duration. This results in a bandwidth requirement that is difficult to manage in domestic equipment and certainly not practical to store or transmit.
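A quick sketch of the slot-count trade-off described above. The 13.5 MHz figure is just the ITU-R BT.601 luma sample rate, used only for scale, and the one-PWM-period-per-pixel assumption is mine:

```python
# If one PWM period must fit inside one pixel, the slot clock has to
# run (levels - 1) times faster than the pixel rate: more brightness
# levels mean proportionally more slots per period.
def pwm_slot_clock_hz(pixel_rate_hz: float, levels: int) -> float:
    slots = levels - 1  # n slots give n + 1 distinct duty cycles
    return pixel_rate_hz * slots

PIXEL_RATE = 13.5e6  # ITU-R BT.601 luma sample rate, for scale only
for levels in (8, 64, 256):
    clock = pwm_slot_clock_hz(PIXEL_RATE, levels)
    print(f"{levels:3d} levels -> {clock / 1e9:.2f} GHz slot clock")
```

Even modest 8-bit precision (256 levels) pushes the slot clock into the multi-GHz range, which is the bandwidth problem the post describes.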
Secondly, it is true that fluorescent backlighting has an afterglow that helps to level the luminosity while the AC drive voltage drops below the gas ionization threshold but that also limits the response time to brightness changes so dynamic lighting changes are rarely used. It isn't true that LED backlighting is necessarily faster although in most cases it is. White LEDs also produce light by fluorescence but instead of the initial energy coming from ionized gas, it comes from short wavelength emissions from a diode junction. Although there are optimizations, different color high brightness LEDs are basically the same construction but with different color phosphors to emit the desired wavelength. Note that the LEDs used in CD/DVD/BD players do not use phosphors, the laser light is produced by a different process that doesn't limit their speed.
Going back to the original point, the limit of analog definition is the bandwidth it takes to store and transmit it. There is finite available bandwidth for transmission over the air and many users wanting to use it. The reason for going digital is commercial, it allows more stations to be packed into the same space (multiplex) without losing too much perceptual quality. That in turn releases more air space for other users, particularly for mobile communication where demand is very high. Even for digitally encoded TV broadcasts, bandwidth still increases with bit rate, that's why more SD channels can be fitted in a multiplex than HD ones. The shortage of band space is the reason more compaction of the data is desirable, it allows more channels to be squeezed into the available space, hence analog moving to MPEG and then H264/H265 and so on.
It is fairly easy to calculate analog bandwidth requirements and it makes it obvious where the technical limitations come about. Think of the number of pixels in a picture, it is the number vertically multiplied by the number horizontally. For a standard display with red/green/blue filters or OLEDs you have to multiply that by three (sometimes four, depending on type of OLED). Now take that figure and multiply it by how many times per second the picture has to update, this gives you the highest frequency that will be present in the analog signal. Next take in to account that to store or convey that signal you need a wider bandwidth than that highest frequency, technically it has to be at least twice the highest frequency but even more is preferable. You start to run into massive numbers and impractical electronics and huge storage needed to hold recordings.
But suppose the horizontal resolution depends on the number of lines and the aspect ratio. If there are 480 lines and the aspect ratio is 4:3, there are 640 pixels per line. We could treat an analog video signal as if it had discrete pixels: 576 lines at the same aspect ratio = 768 pixels per line.
Multiplying the former by a 60 Hz refresh rate and the latter by a 50 Hz refresh rate should give the bandwidth standard-definition analog video would need. Well, it's not quite that simple, because of blanking intervals.
The bandwidth of pulse code modulated (uncompressed digital) video also depends on the number of binary digits representing each pixel.
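Putting the last few paragraphs together as arithmetic. A sketch, assuming the finest detail alternates between adjacent pixels (one cycle per pixel pair) and that interlaced systems deliver complete frames at half the field rate; blanking intervals, ignored here, push the real figures somewhat higher:

```python
# Back-of-envelope luma bandwidth: pixel rate / 2, since one cycle of
# the highest video frequency spans a light/dark pair of pixels.
def luma_bandwidth_mhz(width: int, height: int, frames_per_s: float) -> float:
    pixel_rate = width * height * frames_per_s
    return pixel_rate / 2 / 1e6

print(f"640x480 @ 30 full frames/s: ~{luma_bandwidth_mhz(640, 480, 30):.1f} MHz")
print(f"768x576 @ 25 full frames/s: ~{luma_bandwidth_mhz(768, 576, 25):.1f} MHz")
print(f"3840x2160 @ 30 frames/s:   ~{luma_bandwidth_mhz(3840, 2160, 30):.0f} MHz")
```

The first figure lands close to the ~4.2 MHz NTSC luma bandwidth, while the hypothetical "4K analog" case needs well over 100 MHz, which is why no feasible analog channel or recorder was ever built for it.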
I'm not actually sure of the point of this thread, as it seems to be a mix of assumptions and results of reading various articles. The problem with analogue video is the amount of bandwidth required. I started with analogue video over 20 years ago, and standard PAL SD video required a lot of work to transfer to a computer because it needed a sustained data rate of around 46 Mb per second. Switching to DV, with the relatively low rate of a mere 13 GB per hour, meant I didn't have to turn most things off and play around with IRQ settings to achieve the required rate. But that was for PAL SD, 720x576, so any increase in resolution would have meant a huge increase in the data rate needed to transfer it, way beyond what a computer of the time was capable of, and what a display of the time was capable of showing.
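For comparison, a sketch of the rates involved. The 16 bits/pixel figure assumes 8-bit 4:2:2 sampling, and the commonly quoted ~13 GB/hour for DV includes audio and subcode on top of the 25 Mbit/s video stream; both are my assumptions, not figures from the post above:

```python
# Why DV capture was so much friendlier than uncompressed SD capture.
def gb_per_hour(mbit_per_s: float) -> float:
    return mbit_per_s * 1e6 / 8 * 3600 / 1e9

uncompressed_pal = 720 * 576 * 25 * 16 / 1e6  # Mbit/s at 16 bits/pixel
print(f"uncompressed PAL SD: {uncompressed_pal:.0f} Mbit/s "
      f"= {gb_per_hour(uncompressed_pal):.0f} GB/hour")
print(f"DV video stream:     25 Mbit/s = {gb_per_hour(25):.1f} GB/hour")
```

Uncompressed 4:2:2 PAL works out to roughly 166 Mbit/s (about 75 GB/hour), against DV's fixed 25 Mbit/s video stream, a factor of more than six before any modern codec is involved.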
The other problem with analog - any analog system - is error. Error that is nonrecoverable and cumulative. Noise, if you will, but not only noise.
And the probability of error goes up with the bandwidth requirement. Right there is why nobody would want to try 4k analog.
This is a purely academic discussion. As I pointed out earlier, the issue is not analog video per se but an analog display capable of showing such analog video.
You can store 20 or perhaps even 30 MHz on magnetic tape, as VTRs of at least 12..14 MHz were available at the end of the 80's.
Btw, I hope you are aware of the old MUSE (NHK) and HD-MAC (EBU) systems; they are largely analog HD video systems.
But to be honest I don't see your point: theoretically it is possible, but it would be (compared to modern digital technology) a far inferior type of video.
- The base specs of the video (NTSC, PAL), absolutely.
- The quality of the playback, no.
Player makes a difference here. Something I learned from serious LD fans decades ago, and you should research the topic yourself. It really is amazing how rare (and expensive) some of the best players are. You can't just use a random LD player, unless you just want random-quality output.
If you argue this well-known situation, the rest of your entire argument is suspect, specious even, as some other posts apparently are saying outright.
Analog "resolution" depends on the analog (spectral) bandwidth (in Hz) of the transmission channel (air, cable, optical fibre ....) or the bearer (tape, disc ....) and the level of disturbance ("noise") which is inevitably present on that channel or bearer *). Constraints are technical feasibility and/or cost. Standards like NTSC etc. take these constraints into account and are always a compromise.
Once an analog signal is converted to digital, its bitrate or file size (in the digital realm) can be reduced by signal processing and lossy compression.
Compression existed in the analog world as well btw. Examples are video interlacing and audio companding.
*) More precisely it is the ratio between the useful signal level and the disturbance level.
Last edited by Sharc; 10th Jan 2022 at 03:41.
Also, noise, signal, and bandwidth relations in the analog world can be trickier than a simple linear model suggests.
Some limitations of the analog domain can be mitigated by proper signal coding.
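The trade-off between bandwidth and noise mentioned above has a classical formulation, the Shannon-Hartley theorem. A sketch (the 6 MHz channel width and the SNR values are illustrative assumptions, not figures from the thread):

```python
import math

# Shannon-Hartley: C = B * log2(1 + S/N). Capacity grows linearly
# with bandwidth but only logarithmically with signal-to-noise ratio.
def capacity_mbit_s(bandwidth_hz: float, snr_db: float) -> float:
    snr = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr) / 1e6

for snr_db in (20, 30, 40):  # a 6 MHz broadcast-style channel
    print(f"{snr_db} dB SNR -> ~{capacity_mbit_s(6e6, snr_db):.0f} Mbit/s")
```

This is why, as noted above, "resolution" in the analog domain is really a function of both the channel bandwidth and the disturbance level on it.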
Laserdisc is also a composite video format, and this hampers it even when played on the best equipment. It can cause color fire (also known as cross-color) and dot crawl, maybe even on the best players.
Of course the player is going to make a difference. Think of it the same as hi-fi audio. Put a CD in a cheap Chinese player and it will sound OK; put the same CD in a high-end player and it will sound better. The encoded source signal is the same, but how the player deals with it, along with other variables such as the quality of the interconnecting cables and the quality of the display you are using, will all make a difference to the end result.
If you have a ton of cash you can R&D your own analog 4K gear. For the rest of the sane world, analog video does not make any economical or technical sense at all. Transmitting and storing huge amounts of video data without digitizing and compressing it wasn't possible until MPEG and DV came along. Your analog video reached its limit at HD with MUSE and W-VHS three decades ago, and that was only for the wealthy elite who could afford the gear. MPEG-2 brought HD to the average household in 1080i (TV/cable/D-VHS/HDV camcorders) and 1080p (Blu-ray/HD DVD, HD camcorders): compressed, but trouble-free compared to MUSE and W-VHS. Now with 4K and 8K, the need for better digital codecs is even greater, to improve the quality of the image representation while keeping the files manageable and transmittable.
If you think of digital as a stairstep signal: we all know that has long been debunked for audio, let alone video. Remember this? It never gets old.
Ok, you asked for limits, so I will describe the limits of analog video.
The limits are obvious and related to processing algorithms: some algorithms (all algorithms using the temporal domain) require video memory, i.e. somehow the analog video must be stored, sometimes for a relatively long time. From a practical perspective you can imagine analog memories using discrete time, but even today's semiconductor technology would be incapable of providing the required signal integrity.
There is some analogy with flash memories using multi-level memory cells: at high cell density, the cells become incapable of delivering the desired signal integrity.
Obviously the limits of an analog video signal are bound to the analog nature of that signal. The digital domain offers tricks and workarounds to overcome such limitations; at some point an analog signal could only be played and transmitted, as in the first years of radio...
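The flash analogy above can be made concrete. A sketch (the 3 V cell voltage range is an arbitrary illustrative figure):

```python
# Multi-level flash: n bits per cell needs 2**n distinguishable charge
# levels, so the noise margin between adjacent levels shrinks fast.
def level_spacing_mv(vrange_v: float, bits_per_cell: int) -> float:
    levels = 2 ** bits_per_cell
    return vrange_v / (levels - 1) * 1000

for bits in (1, 2, 3, 4):
    print(f"{bits} bit/cell: {2**bits:2d} levels, "
          f"{level_spacing_mv(3.0, bits):.0f} mV between levels")
```

Each extra bit roughly halves the spacing between levels; an analog memory, which would need effectively continuous levels, faces the same signal-integrity wall in its extreme form.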