I'm sitting in front of a CRT computer monitor typing this, and I've tried to set up a custom interlaced resolution/refresh rate, but the video card simply won't let me do it. I can do it for the TV... Instead of selecting a refresh rate such as 50Hz, if I select 25Hz, I can then choose interlaced as an option, but I can't find a refresh rate which lets me do the same for the CRT monitor. If I select interlaced, it immediately switches back to progressive, so if there's a way to do it, I guess I'll need some specific instructions.
-
For the record, if I run Windows in interlaced mode on the TV, it displays fine, but nothing is smooth. Even the mouse moves in a jerky motion rather than being fluid. If I grab a program window and try to move it across the screen it "bends" as it moves instead of remaining rectangular. Sort of like this:
-
Not only that, but fiddling with a video card's output parameters to make it possible for a monitor to display interlaced video all by itself is an exercise with no practical value.
If someone is using a computer to watch interlaced video, a good software player and progressive output from the video card will make it look a heck of a lot better than anything the monitor can do on its own.
Also, the usual reason a member asks questions about using computer monitors to display interlaced video is that they tried connecting their computer monitor to the HDMI output of their consumer electronics (satellite receiver, cable box, DVD player, media player, etc.) and had a disappointing result when they attempted to display interlaced output. There is no way to change the video output parameters for their consumer electronics device to make its interlaced video output more compatible with a computer monitor.
-
It saves bandwidth - a static picture can retain full spatial resolution (reduced only by the Kell factor), while moving video gets twice the temporal resolution at the cost of spatial resolution in the vertical direction. This matches the human psycho-visual system very well: full resolution is achieved for static or slowly moving objects, while for fast moving objects spatial resolution is reduced.
From this point of view, interlace saves bandwidth and provides the best quality for a given bandwidth.
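A quick back-of-the-envelope illustration of that bandwidth claim, using the PAL numbers (625 total lines, 50 fields or frames per second) - this is my own arithmetic, not taken from the posts above:
Code:
# Line rate for interlaced 576i50 versus a hypothetical progressive 576p50.
LINES_PER_FRAME = 625                          # total lines, including blanking

interlaced_line_rate  = LINES_PER_FRAME * 25   # 25 frames/s, sent as 50 fields/s
progressive_line_rate = LINES_PER_FRAME * 50   # 50 full frames/s

print(interlaced_line_rate)    # 15625 lines/s - the familiar 15.625 kHz PAL line rate
print(progressive_line_rate)   # 31250 lines/s - roughly twice the analog bandwidth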
I totally disagree with regard to interlaced broadcast TV, where each and every frame was sent as TWO distinct NEW fields every time.
This has nothing to do with the OP, but if we're going to debate minutiae, let's keep it in perspective and correct. -
I'd be tempted to disagree with that. I've compared sending interlaced video to the TV and letting it de-interlace, with letting the video card do the de-interlacing, and I can't tell the difference. In theory a computer monitor could de-interlace video the same way; they're just not designed to do so.
In the context of the discussion though, pandy suggested CRT computer monitors should be able to accept an interlaced input. I've not found a way to get mine to do it (the video card won't let me), but if I understand correctly how a CRT TV draws the picture on the screen, it refreshes every second scan line, then the alternate scan lines, which matches the way the interlaced frames are being received. If a CRT computer monitor refreshes its screen the same way (I don't know whether it does, or whether it refreshes every scan line each time), then I guess in theory a CRT monitor could display interlaced video as a CRT TV would, as it wouldn't need to de-interlace it.
I imagine the only way it'd work though, is if the monitor was using a 50Hz or 60Hz refresh rate, otherwise the refresh rate wouldn't match the field rate. I run my CRT monitor at 85Hz as that seems to be the refresh rate my brain interprets as being flicker free, and the lowest I can choose is 60Hz. In theory getting a CRT monitor to display interlaced video may be possible, but in practice I've not found a way to do so. -
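To put a number on that mismatch (a tiny sketch of my own, assuming PAL's 50 fields per second):
Code:
# Fields only line up cleanly when the monitor refresh rate is an integer
# multiple of the field rate; anything else gives an irregular cadence (judder).
FIELD_RATE = 50   # PAL: 50 fields per second

for refresh_hz in (50, 60, 75, 85):
    print(refresh_hz, "Hz ->", refresh_hz / FIELD_RATE, "refreshes per field")

# 50 Hz -> 1.0 (clean)   60 Hz -> 1.2   75 Hz -> 1.5   85 Hz -> 1.7 (judder)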
I didn't start a debate over why interlace exists - it exists, and it is good to display interlaced video natively, as there is no optimal way to correctly deinterlace video.
The bandwidth saving is pure math - if you doubt it, do the calculation yourself. I will say it once again: interlace can be seen as the first compression method ever used that exploits the characteristics of the human vision system, and it fits quite well with the way humans see the surrounding world. Thanks to interlace we can save half of the video signal bandwidth (in the analog domain).
This is scientific fact, not subjective opinion.
-
Briefly:
1. LCDs can display interlaced video (clear proof is in this topic),
2. CRT is a display technology, and this technology is still used to display computer video,
3. Modern computers can output interlaced video at resolutions other than just the TV ones.
That was my point - ask a professional company, not Walmart. The remaining part of this dispute is OT and pure guesswork - clearly over-interpretation on your side of intentions the OP never verbalized.
I like how you verbalize and translate various OP intentions - consider a career in politics or as a lawyer. -
-
1. I don't see clear proof of that in this topic for LCD computer monitors, only assertions from you that it is true. Nobody has come forth to say that they have done it.
3. Except for you apparently, when members of this forum use the term "interlaced video", they mean NTSC, PAL or one of the standard 576i, 480i or 1080i video formats, not a non-standard interlaced video signal from their VGA card. ...and when a newbie member asks questions about whether a computer monitor can display interlaced video, they mean HDMI output from a consumer electronics device, not the output from their video card. LCD computer monitors don't typically display the interlaced output from a consumer electronics device correctly, if at all. My video card's standard user interface won't even permit me to output any of the standard interlaced resolutions to my LCD monitor because its E-EDID data indicates it does not support them.
The normal progressive output from a video card already works very well with every recent CRT or LCD computer monitor. That being true, whether someone succeeds or not, fiddling with video card settings using a third-party video card tweaker like PowerStrip in an attempt to find an interlaced mode that will display properly on an LCD or CRT computer monitor is a pointless intellectual exercise with no practical value.
In that case, your point is invalid, because this discussion is about what ordinary computer monitors can do. "Professional companies" don't sell ordinary computer monitors. They sell professional-grade specialty products for video work or industrial use, which have different characteristics and sometimes different video connections than a regular CRT computer monitor. None of the well-known manufacturers of computer monitors have made any ordinary CRT computer monitors for several years. If someone wants one now, they can no longer buy one new at an electronics/computer specialty store or an online electronics specialty retailer. They have to make a special effort to obtain one, and either pay through the nose for old stock or take their chances with a second-hand monitor.
Of course I "translate various OP intentions" when I read their posts, as does any other person with normal cognition. I interpret the words the OP has written for myself and think about why they might be asking the question to decide what that individual really wants to know. If you only interpret statements in an absolutely literal, one-dimensional sense, then consider seeing a psychologist for evaluation.
-
The CRT monitor is a HP P1230.
You appear to be correct about that, which has changed my view of the world somewhat. I guess when I was playing around with this ages ago, I managed to get it wrong. The reason I got it wrong is no doubt due to the way the various refresh rates are listed in Windows Display Properties. At the time I thought it was "backwards", so to speak, but just put it down to the way Microsoft describes the refresh rates. They do describe lower refresh rates (24Hz etc) as being interlaced. Below are the standard options for the connection to the TV. The CRT monitor doesn't include any "interlaced" refresh rates.
When I select one of those "interlaced" refresh rates, the TV displays 24Hz, 25Hz or 30Hz, and I'm now fairly confident the input to the TV is progressive, despite the "interlaced" label. Maybe I'm missing the obvious, but I don't understand why they're labelled the way they are. The fact that movement is jerky when using the lower refresh rates was something I attributed to the video being interlaced, but I guess it's simply because the refresh rate is fairly low.
Back in the Nvidia Control Panel, the same settings are available as presets for the TV, but the lower refresh rates are no longer labelled as "interlaced".
So from there I go into the customising section and try to set an interlaced output. The problem though, is it won't let me. The option is there, but regardless of the custom refresh rate/resolution for the TV or the CRT I choose, so far every time I've selected "interlaced" in the drop down menu, it immediately switches back to progressive. It kind of makes me inclined to assume my video card simply isn't capable of outputting interlaced video? I didn't want to mess around with manually changing timings, as I have no idea what I'm doing.
So it does look like I was partly wrong.... I don't seem to have a way to send the TV interlaced video from the PC and get the TV to de-interlace it. I guess when I was playing around with this ages ago, I made a bad assumption.
The TV is quite fussy regarding the resolution/refresh rate. It seems to have combinations which connect to the PC one way, but are then converted to a different resolution internally. I don't know why exactly, but if I choose 1600x900 as a resolution, the TV happily switches and Reclock seems happy the display is 1600x900, but the TV itself displays 1680x1050. The CRT on the other hand seems to happily do whatever I tell it to, except when it comes to being able to set the output as interlaced.
I'm happy to play around a little more, but maybe outputting interlaced video is something my 8600GT video card can't do? Maybe it's progressive only? Is the ability to output an interlaced signal something which video cards should be able to do? I'd guess so if you use their traditional "TV output", if they have one, but I can't find a way to make it happen using DVI or VGA. -
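For reference, here's a rough sketch of the sort of numbers a custom-resolution dialog or timing calculator would want for a standard interlaced HD mode - the figures are the usual broadcast 1080i timings as I understand them, so treat them as a starting point rather than gospel:
Code:
# Line and field rates implied by the common 1080i broadcast timings.
def rates(pixel_clock_hz, total_pixels_per_line, total_lines_per_frame):
    h_freq = pixel_clock_hz / total_pixels_per_line      # horizontal (line) frequency
    field_rate = h_freq / (total_lines_per_frame / 2)    # two fields per frame
    return h_freq, field_rate

# 1920x1080i @ 60 fields/s: 74.25 MHz clock, 2200 total pixels/line, 1125 total lines/frame
print(rates(74.25e6, 2200, 1125))   # ~33.75 kHz, ~60 Hz
# 1920x1080i @ 50 fields/s: 74.25 MHz clock, 2640 total pixels/line, 1125 total lines/frame
print(rates(74.25e6, 2640, 1125))   # ~28.125 kHz, ~50 Hz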
One other question for which I still don't have an answer..... how does a CRT computer monitor refresh its screen? My understanding is a CRT TV does it by refreshing half at a time, every second scan line one time, then the alternate scan lines the next time. I thought a CRT computer monitor refreshed its whole screen in one go (every scan line from top to bottom), hence it not being able to display interlaced video correctly as a CRT TV would, but have I got that wrong?
-
Hi All,
There has been a long discussion on my post, and I haven't understood some of the points.
My concerns are general.
Basically we have two types of video display:
1. Progressive
2. Interlaced
In the progressive case, if we take a 480p resolution:
Pixel frequency = 800 (total pixels/line) * 525 (total lines/frame) * 60 (frames/s) = 25.2 MHz.
As I understand it, there are two reasons interlace came into the picture:
1. In CRT TVs, people found that by the time the beam was writing the last lines, the first lines it had written had already faded (lost some of their brightness), and by the time it wrote the first lines again the viewer found the picture difficult to watch. So people wanted to reduce the number of lines drawn in each pass.
2. By reducing the number of lines per pass, we can also reduce the bandwidth.
So the pixel frequency for interlace = 800 (total pixels/line) * 262.5 (total lines/field) * 60 (fields/s) = 12.6 MHz.
Even with this, the human eye still sees a good image.
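Spelling those sums out (a small sketch; the 800x525 totals seem to be the usual 640x480@60 timing figures used above):
Code:
# Pixel clock for the progressive and interlaced versions of the same 480-line mode.
total_pixels_per_line = 800
total_lines_per_frame = 525

f_480p = total_pixels_per_line * total_lines_per_frame * 60          # 60 full frames/s
f_480i = total_pixels_per_line * (total_lines_per_frame / 2) * 60    # 60 fields/s, half the lines each

print(f_480p / 1e6, "MHz")   # 25.2 MHz
print(f_480i / 1e6, "MHz")   # 12.6 MHz - half the bandwidth for the same line structure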
My concerns are:
1. As far as I know, only CRT TVs can display interlaced video. Can a CRT TV display progressive video as well?
2. Why don't the latest TVs (LCD, LED - everything except CRT) use the concept of interlaced display? Why are they natively progressive only?
3. Can CRT computer monitors display interlaced video?
4. Computer monitors (other than CRTs) also don't use the concept of interlaced display - why are they natively progressive?
5. What about 35mm/70mm screens in theatres - what do they support?
I found on Wikipedia that only CRT displays are capable of showing interlaced video natively. Why are the others progressive by default?
And there seems to be no future for interlaced video, so can't interlace be used for resolutions greater than 1080i? -
It is done sequentially, as you described, with one exception - the lines of the first field are drawn with a small offset from their progressive position (+0.5 line), and the lines of the second field are drawn with exactly the opposite offset (-0.5 line) from the progressive line position.
Theoretically this means the lines from both fields sit in complementary positions and together create a perfect picture - in practice some imperfections make things worse, and this is described by something called the "Kell factor": vertical resolution is usually worse for interlaced than for progressive video, and the factor also depends on the technology used by the display (optical resolution is usually worse than specified).
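A minimal sketch of how the two fields slot together (assumed top-field-first order; in the digital picture the half-line beam offset simply means one field supplies the even lines and the other the odd lines):
Code:
# Weave two fields into one progressive frame.
def weave(top_field, bottom_field):
    """Each argument is a list of scan lines, half the frame height."""
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.append(top_line)     # even lines: 0, 2, 4, ...
        frame.append(bottom_line)  # odd lines:  1, 3, 5, ...
    return frame

# Example: a 6-line frame built from two 3-line fields captured 1/50 s apart.
print(weave(["T0", "T2", "T4"], ["B1", "B3", "B5"]))
# ['T0', 'B1', 'T2', 'B3', 'T4', 'B5']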
A CRT displays the signal in an analog way - it is synchronized by the signal itself and will follow the video signal. As always there are limits: in purely analog displays the limit comes from the components and their capabilities (a purely analog display will always try to display the signal, and if the signal is incorrect the display can even be damaged), while digitally controlled analog displays have an additional circuit that checks the parameters of the incoming signal, and if the vertical or horizontal frequency is outside the specified range they refuse to display it and show an error message instead. However, it is very important to mention that the bandwidth is not limited in the same way, so in theory the number of pixels per line is limited only by the video source, not by the display (the display has its own limitations, but the rule is similar to well-known oversampling).
CRTs are almost always analog (even when they are digitally controlled the whole timing is still analog, so it is flexible and there is no difference between progressive and interlaced video). There are a very limited number of digital CRTs - their signal processing works like an LCD or PDP, so they behave differently. -
Ad. 1 - Yes, CRTs are capable of displaying both types of video signal. They are so-called sequential-type displays - the video is displayed as a series of lines.
Ad. 2 - They are the opposite of sequential: all those modern display technologies are memory-type displays, which can display the whole picture at once (although due to technology limitations - drivers, buses, etc. - they also use multiplexing or very complex addressing schemes).
Ad. 1, 2 - There is a good analogy between sequential and memory (progressive) type displays: imagine two types of memory, a serial register (LIFO/FIFO) and RAM. A CRT is like serial memory - there is no direct access to each pixel, and the display can only show a whole sequence of pictures (some CRTs do have direct access to pixels, but those devices are not used to display pictures; they are used, for example, to display special information or for other purposes such as oscilloscopes and very fast analog-to-digital converters). An LCD or PDP is like RAM - each pixel can be addressed and controlled independently, and on colour displays there is sometimes even direct access to sub-pixels, which is used to improve quality (rasterization).
Ad. 3 - If I understand you correctly, then yes.
Ad. 4 - Interlace has its own limitations and there is no point in using it, especially with displays where each pixel can be accessed independently.
Ad. 5 - Cinema uses a different principle to create the picture, and it is purely progressive (even more so than PDP, LCD, DLP, etc.).
With a CRT (almost any CRT) you can use any resolution as long as you don't go over the limits for the V and H synchronization frequencies. With interlace you can double the number of lines (with the typical two-field interlace scheme; there are very rare displays that use more than two fields - mostly in medical visualization - with 3 or 4 fields, so the number of lines is 3 or 4 times higher at the cost of refresh rate; those displays usually use phosphor with very long persistence, or even special memory CRT tubes, where the tube remembers the picture until it is erased by a special electrical signal - in theory such tubes can use very complex interlacing to provide an extremely high resolution picture).
---
after some time
---
http://www.blurbusters.com/zero-motion-blur/lightboost/
Perhaps with this technique interlace can be simulated correctly on progressive displays too - not sure, I need to think about this.
-
Horizontal frequency: 30 to 140 kHz
Vertical refresh rate: 50 to 160 Hz
You can use this calculator:
http://www.epanorama.net/faq/vga2rgb/calc.html
Stick with the same H and V sync frequencies and double the number of lines - that will keep the interlaced mode within the limits (yes, the graphics card must output video resized to twice the vertical resolution, but this can be pure repetition - line doubling - so it should not be a problem; the bandwidth remains the same as for progressive). Look at the good side - twice the number of lines for (almost) nothing.
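A rough sanity check of that idea against the P1230's quoted ranges (a sketch, not exact GTF/CVT timing maths; the ~1250 total-line figure for a 1200-line progressive mode is my own assumption for blanking):
Code:
# Does a "doubled lines" interlaced mode stay inside 30-140 kHz / 50-160 Hz?
H_MIN, H_MAX = 30_000, 140_000   # horizontal frequency limits in Hz
V_MIN, V_MAX = 50, 160           # vertical (field) rate limits in Hz

def check_mode(total_lines_per_frame, field_rate_hz, interlaced):
    lines_per_field = total_lines_per_frame / 2 if interlaced else total_lines_per_frame
    h_freq = lines_per_field * field_rate_hz           # lines drawn per second
    ok = H_MIN <= h_freq <= H_MAX and V_MIN <= field_rate_hz <= V_MAX
    return h_freq, ok

print(check_mode(1250, 85, interlaced=False))   # ~106 kHz -> within range
print(check_mode(2500, 85, interlaced=True))    # doubled lines, same ~106 kHz line rate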
--
There are plenty of timing calculators available:
http://umc.sourceforge.net/
-
I'll give it a go tomorrow and report back. I don't want to do it now because the PC's in the middle of a long conversion and I'd like to let it finish before I try something that might make it crash..... or the monitor explode.
So does anyone know why Windows lists refresh rates for the TV such as 24Hz or 25Hz as interlaced?
-
Sorry, that was a typo. I've changed it.
I'm trying to understand though, as per the screenshot in my earlier post, why Windows lists the refresh rates of 24Hz, 25Hz and 30Hz in Display Properties as "interlaced". I tried them and they certainly appear to be progressive, and if I select 24Hz (as an example) the TV displays 24Hz when it switches the refresh rate. -
Modern CRTs are protected against incorrect parameters - don't worry. After setting a new mode, if there is no video, just wait (at least on Windows) about 20 seconds.
Perhaps because the developer was confused, like many people, and to him anything lower than 60 (50) Hz is interlaced.
btw
As a 24Hz mode is simply unrealistic (too low a V sync frequency), go for 48Hz or, even better, a 72Hz mode (exactly as in cinema, where the light source is interrupted 2 or 3 times per film frame to reduce perceived flicker).