VideoHelp Forum
  1. I'm sitting in front of a CRT computer monitor typing this, and I've tried to set up a custom interlaced resolution/refresh rate, but the video card simply won't let me do it. I can do it for the TV... Instead of selecting a refresh rate such as 50Hz, if I select 25Hz, I can then choose interlaced as an option, but I can't find a refresh rate which lets me do the same for the CRT monitor. If I select interlaced, it immediately switches back to progressive, so if there's a way to do it, I guess I'll need some specific instructions.
2. For the record, if I run Windows in interlaced mode on the TV, it displays fine, but nothing is smooth. Even the mouse moves in a jerky motion rather than being fluid. If I grab a program window and try to move it across the screen it "bends" as it moves instead of remaining rectangular. Sort of like this:
[Attachment: bend.JPG]
3.
    Originally Posted by usually_quiet View Post
I don't think I need professional help with my hardware or software. They are working exactly as their makers intended them to work. Trying to make them do something the manufacturer does not recommend or allow under normal circumstances would be a truly stupid waste of my time.
    Ditto.
    Last edited by sanlyn; 26th Mar 2014 at 05:44.
4.
    Originally Posted by sanlyn View Post
    Originally Posted by usually_quiet View Post
I don't think I need professional help with my hardware or software. They are working exactly as their makers intended them to work. Trying to make them do something the manufacturer does not recommend or allow under normal circumstances would be a truly stupid waste of my time.
    Ditto.
    Not only that, but fiddling with a video card's output parameters to make it possible for a monitor to display interlaced video all by itself is an exercise with no practical value.

    If someone is using a computer to watch interlaced video, a good software player and progressive output from the video card will make it look a heck of a lot better than anything the monitor can do on its own.

    Also, the usual reason a member asks questions about using computer monitors to display interlaced video is that they tried connecting their computer monitor to the HDMI output of their consumer electronics (satellite receiver, cable box, DVD player, media player, etc.) and had a disappointing result when they attempted to display interlaced output. There is no way to change the video output parameters for their consumer electronics device to make its interlaced video output more compatible with a computer monitor.
    Last edited by usually_quiet; 13th Mar 2013 at 20:34.
5.
It will save bandwidth - a static picture can have full spatial resolution (reduced by the Kell factor), while motion gets twice the temporal resolution at the cost of spatial resolution in the vertical direction. This matches the human psycho-visual system nicely: full resolution is achieved for static or slowly moving objects, and spatial resolution is reduced for fast moving objects.

From this point of view, interlace saves bandwidth and provides the best quality for the bandwidth used.
Maybe that's true concerning computers and AVI files, where only the difference between frames is stored and only the pixels that changed are sent.

I totally disagree with regard to interlaced broadcast TV, where each and every frame was sent as TWO distinct NEW fields every time.

This has nothing to do with the OP, but if we are going to debate minutiae, let's keep it in perspective and correct.
  6. Originally Posted by usually_quiet View Post
If someone is using a computer to watch interlaced video, a good software player and progressive output from the video card will make it look a heck of a lot better than anything the monitor can do on its own.
I'd be tempted to disagree with that. I've compared sending interlaced video to the TV and letting it de-interlace, versus letting the video card de-interlace it, and I can't tell the difference. In theory a computer monitor could de-interlace video the same way; they're just not designed to do so.

In the context of the discussion though, pandy suggested CRT computer monitors should be able to accept an interlaced input. I've not found a way to get mine to do it (the video card won't let me), but if I understand correctly how a CRT TV draws the picture on the screen, it refreshes every second scan-line, then the alternate scan-lines, which matches the way the interlaced fields are received. If a CRT computer monitor refreshes its screen the same way (I don't know whether it does, or whether it refreshes every scan-line on each pass), then I guess in theory a CRT monitor could display interlaced video as a CRT TV would, as it wouldn't need to de-interlace it.

I imagine the only way it'd work though, is if the monitor was using a 50Hz or 60Hz refresh rate, otherwise the refresh rate wouldn't match the frame rate. I run my CRT monitor at 85Hz as that seems to be the refresh rate my brain interprets as flicker-free, and the lowest I can choose is 60Hz. In theory getting a CRT monitor to display interlaced video may be possible, but in practice I've not found a way to do so.
  7. Originally Posted by theewizard View Post
It will save bandwidth - a static picture can have full spatial resolution (reduced by the Kell factor), while motion gets twice the temporal resolution at the cost of spatial resolution in the vertical direction. This matches the human psycho-visual system nicely: full resolution is achieved for static or slowly moving objects, and spatial resolution is reduced for fast moving objects.

From this point of view, interlace saves bandwidth and provides the best quality for the bandwidth used.
Maybe that's true concerning computers and AVI files, where only the difference between frames is stored and only the pixels that changed are sent.

I totally disagree with regard to interlaced broadcast TV, where each and every frame was sent as TWO distinct NEW fields every time.

This has nothing to do with the OP, but if we are going to debate minutiae, let's keep it in perspective and correct.
I didn't start the debate over why interlace exists - it exists, and it is good to display interlaced video in its native form, as there is no optimal way to correctly deinterlace video.

Bandwidth saving is pure math - if you doubt it, do the calculation yourself. I will say it once again - interlace can be seen as the first compression method ever used that exploits the characteristics of the human vision system, and it fits quite well with the way humans see the surrounding world. Thanks to interlace we can save half of the video signal bandwidth (in the analog domain).
This is scientific fact, not subjective opinion.
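
To put numbers on that claim, here is a quick check in Python using the standard 625-line/50 Hz signal totals (a sketch; the helper name and constants are illustrative, not from this thread):

Code:
# Rough check of the "interlace halves analog bandwidth" claim, using
# the 625-line/50 Hz system totals (864 clock periods per line,
# blanking included).

def pixel_clock_mhz(pixels_per_line, lines_per_pass, passes_per_sec):
    """Pixel clock = total pixels/line x lines per pass x passes/s."""
    return pixels_per_line * lines_per_pass * passes_per_sec / 1e6

TOTAL_PIXELS_PER_LINE = 864
TOTAL_LINES = 625

# Progressive: all 625 lines are drawn on each of 50 passes per second.
p = pixel_clock_mhz(TOTAL_PIXELS_PER_LINE, TOTAL_LINES, 50)

# Interlaced: each pass (field) carries only half the lines (312.5).
i = pixel_clock_mhz(TOTAL_PIXELS_PER_LINE, TOTAL_LINES / 2, 50)

print(f"progressive (576p50): {p:.1f} MHz")      # 27.0 MHz
print(f"interlaced (576i):    {i:.1f} MHz")      # 13.5 MHz
print(f"bandwidth saved:      {1 - i / p:.0%}")  # 50%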
    Last edited by pandy; 14th Mar 2013 at 03:35.
  8. Originally Posted by usually_quiet View Post
The OP asked "Why Computer monitor doesn't have capability to display interlaced video ??" The OP also asked "How computer monitor is different from CRT monitor ?" That sounds like the OP wants information about something current, concrete and practical regarding LCD monitors, not a theoretical discussion of what would be possible if typical consumer computer monitors were made differently than they are now.
Briefly:

1. LCD can display interlaced video (clear proof is in this topic),
2. CRT is a display technology, and it is still used to display computer video,
3. Modern computers can output interlaced video, and at resolutions other than just TV resolutions.

    Originally Posted by usually_quiet View Post
No, I can't get a CRT monitor without problems. Nobody sells newly made consumer CRT computer monitors here anymore. Thrift shops don't even want them. That has been the case for a few years. The only new CRT monitors for sale now are specialty items intended for manufacturers to use in cash registers, or vending kiosks, or for professional video work, and other non-consumer applications. They are not much use to anyone who wants a display for their PC. Plus, the OP eventually made it clear that they wanted to know why non-CRT computer monitors can't display interlaced video.

I don't think I need professional help with my hardware or software. They are working exactly as their makers intended them to work. Trying to make them do something the manufacturer does not recommend or allow under normal circumstances would be a truly stupid waste of my time.
This was my point - ask a professional company, not Walmart. The remaining part of this dispute is OT and pure guesswork - a clear overinterpretation on your side of intentions the OP never verbalized.

I like how you verbalize and translate the OP's various intentions - consider a career in politics or as a lawyer.
  9. Originally Posted by hello_hello View Post
For the record, if I run Windows in interlaced mode on the TV, it displays fine, but nothing is smooth. Even the mouse moves in a jerky motion rather than being fluid. If I grab a program window and try to move it across the screen it "bends" as it moves instead of remaining rectangular. Sort of like this:
[Attachment 16727: bend.JPG]
    set 120 - 150i
  10. Originally Posted by pandy View Post
    set 120 - 150i
    I'll need to borrow your TV to test it. Mine has a maximum refresh rate of 60Hz (or 30i).

    So how do I get my CRT to accept an interlaced output from the PC?
  11. Originally Posted by hello_hello View Post
    Originally Posted by pandy View Post
    set 120 - 150i
    I'll need to borrow your TV to test it. Mine has a maximum refresh rate of 60Hz (or 30i).

    So how do I get my CRT to accept an interlaced output from the PC?
Nope - 60Hz progressive means 120i. Which CRT are we talking about now? Please provide model and brand.
12.
    Originally Posted by pandy View Post

    Briefly:

1. LCD can display interlaced video (clear proof is in this topic),
2. CRT is a display technology, and it is still used to display computer video,
3. Modern computers can output interlaced video, and at resolutions other than just TV resolutions.
1. I don't see clear proof of that in this topic for LCD computer monitors, only assertions from you that it is true. Nobody has come forward to say that they have done it.

    3. Except for you apparently, when members of this forum use the term "interlaced video", they mean NTSC, PAL or one of the standard 576i, 480i or 1080i video formats, not a non-standard interlaced video signal from their VGA card. ...and when a newbie member asks questions about whether a computer monitor can display interlaced video, they mean HDMI output from a consumer electronics device, not the output from their video card. LCD computer monitors don't typically display the interlaced output from a consumer electronics device correctly, if at all. My video card's standard user interface won't even permit me to output any of the standard interlaced resolutions to my LCD monitor because its E-EDID data indicates it does not support them.
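
As an aside, the support data mentioned above can be inspected directly. A minimal Python sketch, assuming a raw 128-byte EDID dump saved as edid.bin (the file name is hypothetical; on Linux such dumps live under /sys/class/drm/*/edid), that checks the detailed timing descriptors for the interlaced flag:

Code:
# Scan a base EDID block's four 18-byte detailed timing descriptors
# and report whether each advertised mode is interlaced.

def detailed_timings(edid: bytes):
    """Yield (pixel_clock_mhz, interlaced) for each timing descriptor."""
    for offset in (54, 72, 90, 108):       # fixed descriptor offsets
        dtd = edid[offset:offset + 18]
        clock = dtd[0] | (dtd[1] << 8)     # little-endian, 10 kHz units
        if clock == 0:
            continue                       # a display descriptor, not a timing
        yield clock / 100.0, bool(dtd[17] & 0x80)  # flags bit 7 = interlaced

with open("edid.bin", "rb") as f:          # hypothetical raw EDID dump
    for mhz, interlaced in detailed_timings(f.read(128)):
        print(f"{mhz:6.2f} MHz  {'interlaced' if interlaced else 'progressive'}")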

The normal progressive output from a video card already works very well with every recent CRT or LCD computer monitor. That being true, whether someone succeeds or not, fiddling with video card settings using a third-party video card tweaker like PowerStrip in an attempt to find an interlaced mode that will display properly on an LCD or CRT computer monitor is a pointless intellectual exercise with no practical value.


    Originally Posted by pandy View Post
This was my point - ask a professional company, not Walmart.
In that case, your point is invalid, because this discussion is about what ordinary computer monitors can do. "Professional companies" don't sell ordinary computer monitors. They sell professional-grade specialty products for video work, or industrial use, which have different characteristics and sometimes different video connections than a regular CRT computer monitor. None of the well-known manufacturers of computer monitors have made any ordinary CRT computer monitors for several years. If someone wants one now, they can no longer buy one new at an electronics/computer specialty store or an online electronics specialty retailer. They have to make a special effort to obtain one, and either pay through the nose for old stock, or take their chances with a second-hand monitor.


    Originally Posted by pandy View Post
I like how you verbalize and translate the OP's various intentions - consider a career in politics or as a lawyer.
    Of course I "translate various OP intentions" when I read their posts, as does any other person with normal cognition. I interpret the words the OP has written for myself and think about why they might be asking the question to decide what that individual really wants to know. If you only interpret statements in an absolutely literal, one-dimensional sense, then consider seeing a psychologist for evaluation.
    Last edited by usually_quiet; 14th Mar 2013 at 15:09.
  13. Originally Posted by pandy View Post
Nope - 60Hz progressive means 120i. Which CRT are we talking about now? Please provide model and brand.
The CRT monitor is an HP P1230.

You appear to be correct about that, which has changed my view of the world somewhat. I guess when I was playing around with this ages ago, I managed to get it wrong. The reason I got it wrong is no doubt due to the way the various refresh rates are listed in Windows Display Properties. At the time I thought it was "backwards", so to speak, but just put it down to the way Microsoft describes the refresh rates. They do describe lower refresh rates (24Hz etc) as being interlaced. Below are the standard options for the connection to the TV. The CRT monitor doesn't include any "interlaced" refresh rates.

[Attachment: TV.gif]

When I select one of those "interlaced" refresh rates, the TV displays 24Hz, 25Hz or 30Hz, and I'm now fairly confident the input to the TV is progressive, despite the "interlaced" label. Maybe I'm missing the obvious, but I don't understand why they're labelled the way they are. The fact that movement is jerky when using the lower refresh rates was something I attributed to the video being interlaced, but I guess it's simply because the refresh rate is fairly low.

Back in the Nvidia Control Panel, the same settings are available as presets for the TV, but the lower refresh rates are no longer labelled as "interlaced".

[Attachment 16751: nvidia.gif]

    So from there I go into the customising section and try to set an interlaced output. The problem though, is it won't let me. The option is there, but regardless of the custom refresh rate/resolution for the TV or the CRT I choose, so far every time I've selected "interlaced" in the drop down menu, it immediately switches back to progressive. It kind of makes me inclined to assume my video card simply isn't capable of outputting interlaced video? I didn't want to mess around with manually changing timings, as I have no idea what I'm doing.

[Attachment 16752: custom.gif]

So it does look like I was partly wrong... I don't seem to have a way to send the TV interlaced video from the PC and have the TV de-interlace it. I guess when I was playing around with this ages ago, I made a bad assumption.

The TV is quite fussy regarding the resolution/refresh rate. It seems to have combinations which connect to the PC one way, but are then converted to a different resolution internally. I don't know why exactly, but if I choose 1600x900 as a resolution, the TV happily switches, and ReClock reports the display as 1600x900, yet the TV itself displays 1680x1050. The CRT, on the other hand, seems to happily do whatever I tell it to, except when it comes to setting the output as interlaced.

I'm happy to play around a little more, but maybe outputting interlaced video is something my 8600GT video card can't do? Maybe it's progressive or progressive? Is the ability to output an interlaced signal something video cards should be able to do? I'd guess so if they have a traditional "TV output", but I can't find a way to make it happen using DVI or VGA.
14. One other question for which I still don't have an answer... how does a CRT computer monitor refresh its screen? My understanding is a CRT TV does it half at a time, every second scan-line on one pass, then the alternate scan-lines on the next. I thought a CRT computer monitor refreshed its whole screen in one go (every scan-line from top to bottom), hence it not being able to display interlaced video correctly as a CRT TV would, but have I got that wrong?
  15. Hi All,

There has been a long discussion on my post, and I have not understood some of the points.
My concerns are general.
Basically we have 2 kinds of video display:
1. Progressive
2. Interlaced

In the progressive case, if we take 480p resolution:
Frequency = 800 (total pixels/line) × 525 (total lines/frame) × 60 (frames/s) = 25.2 MHz.

As I understand it, there are 2 reasons interlace came into the picture:
1. In CRT TVs, people found that by the time the beam was writing the last lines, the first lines written had already faded (lost some of their brightness). When it started writing the first lines again, the user found the picture difficult to watch. So people wanted to reduce the number of lines written per pass.
2. By reducing the number of lines per pass we can reduce the bandwidth as well.
So frequency in interlace = 800 (total pixels/line) × 262.5 (total lines/field) × 60 (fields/s) = 12.6 MHz.

Even with this, the human eye still sees a good image.
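
A quick check of that arithmetic in Python (a sketch only, using the corrected totals quoted above):

Code:
# Verify the pixel-clock arithmetic: 800 total pixels per line,
# 525 total lines per frame, 60 passes per second.

pixels_per_line = 800
lines_per_frame = 525

progressive = pixels_per_line * lines_per_frame * 60         # full frames
interlaced  = pixels_per_line * (lines_per_frame / 2) * 60   # half-height fields

print(progressive / 1e6, "MHz")  # 25.2
print(interlaced / 1e6, "MHz")   # 12.6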

My concerns are:
1. As far as I know, only CRT TVs can display interlaced video. Can a CRT TV display progressive video as well?
2. Why don't the latest TVs (LCD, LED - everything except CRT) use the concept of interlaced video? Why are they natively progressive only?
3. Can CRT computer monitors display interlaced video?
4. Computer monitors (other than CRT) also don't use the concept of interlaced video? Why are they natively progressive?
5. 35mm/70mm screens in theatres - what do they support?

I found in Wikipedia that only CRT displays are capable of interlaced video. Why are the others natively progressive by default?
And there is no future for interlaced video. So can't interlace be used for resolutions greater than 1080i?
  16. Originally Posted by hello_hello View Post
One other question for which I still don't have an answer... how does a CRT computer monitor refresh its screen? My understanding is a CRT TV does it half at a time, every second scan-line on one pass, then the alternate scan-lines on the next. I thought a CRT computer monitor refreshed its whole screen in one go (every scan-line from top to bottom), hence it not being able to display interlaced video correctly as a CRT TV would, but have I got that wrong?

It is done sequentially, as you described, with one exception - lines from the first field are drawn with a small offset from the progressive position (+0.5 line), and lines from the second field are drawn with exactly the opposite offset (-0.5 line) from the progressive line position.
Theoretically this means the lines of the two fields sit in complementary positions and together create a perfect picture - in practice various imperfections make things worse, and this is described by the "Kell factor" - vertical resolution is usually worse for interlace than for progressive, and the factor also depends on the display technology (optical resolution is usually worse than specified).

CRTs display the signal in an analog way - the display is synchronized by the signal itself and will follow the video signal. As always there are some limits - in analog displays the limit is set by the components and their capabilities (thus a purely analog display always tries to display the signal, and when the signal is not correct the display can even be destroyed); digitally controlled analog displays have an additional circuit that verifies the parameters of the incoming signal, and if the vertical or horizontal frequencies are outside the specified range it will refuse to display the signal and show an error message instead. However, it is very important to mention that the bandwidth is not limited in the same way, so in theory the number of pixels per line is limited only by the video source, not by the display (the display has its own limitations, but the rule is similar to well-known oversampling).

CRTs are almost always analog (even when digitally controlled, the timing itself is analog, so it is flexible and there is no difference between progressive and interlaced video). There are a very limited number of digital CRTs - their signal processing works like an LCD or PDP, so they behave differently.
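
The complementary-field geometry described above can be illustrated with a toy weave in Python (a sketch; numpy arrays stand in for scanlines):

Code:
# Toy illustration of interlace geometry: the two fields sample
# complementary scan-line positions, so weaving them back together
# reconstructs every line of a static frame.

import numpy as np

frame = np.arange(12).reshape(6, 2)   # a tiny 6-line "picture"

top_field    = frame[0::2]            # lines 0, 2, 4
bottom_field = frame[1::2]            # lines 1, 3, 5

woven = np.empty_like(frame)          # interleave the fields again
woven[0::2] = top_field
woven[1::2] = bottom_field

assert (woven == frame).all()         # static content survives perfectly
# With motion, the fields are snapshots 1/50 s (or 1/60 s) apart,
# which is why naively weaving moving edges produces combing.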
  17. Originally Posted by Narendra View Post
Hi All,

There has been a long discussion on my post, and I have not understood some of the points.
My concerns are general.
Basically we have 2 kinds of video display:
1. Progressive
2. Interlaced

In the progressive case, if we take 480p resolution:
Frequency = 800 (total pixels/line) × 525 (total lines/frame) × 60 (frames/s) = 25.2 MHz.

As I understand it, there are 2 reasons interlace came into the picture:
1. In CRT TVs, people found that by the time the beam was writing the last lines, the first lines written had already faded (lost some of their brightness). When it started writing the first lines again, the user found the picture difficult to watch. So people wanted to reduce the number of lines written per pass.
2. By reducing the number of lines per pass we can reduce the bandwidth as well.
So frequency in interlace = 800 (total pixels/line) × 262.5 (total lines/field) × 60 (fields/s) = 12.6 MHz.

Even with this, the human eye still sees a good image.

My concerns are:
1. As far as I know, only CRT TVs can display interlaced video. Can a CRT TV display progressive video as well?
2. Why don't the latest TVs (LCD, LED - everything except CRT) use the concept of interlaced video? Why are they natively progressive only?
3. Can CRT computer monitors display interlaced video?
4. Computer monitors (other than CRT) also don't use the concept of interlaced video? Why are they natively progressive?
5. 35mm/70mm screens in theatres - what do they support?
Ad. 1 - Yes - CRTs are capable of displaying both types of video signal. They are so-called sequential-type displays - video is displayed as a series of lines.

Ad. 2 - They are the opposite of sequential - all the modern display technologies are memory-type displays - they can display the whole picture at once (though due to technology limitations - drivers, buses etc. - they also use multiplexing or quite complex addressing schemes).

Ad. 1, 2 - There is a very good analogy between sequential and memory (progressive) type displays - imagine two types of memory: a serial register (LIFO/FIFO) and RAM. CRTs are like serial memory - there is no direct access to each pixel - the display can only show a whole sequence of pictures (some CRTs do have direct access to pixels, but those devices are not used to display pictures - they display special information or serve other purposes, for example oscilloscopes, very fast analog-to-digital converters etc.). LCDs or PDPs are like RAM - each pixel can be addressed and controlled independently; on colour displays there is sometimes even direct access to sub-pixels, which is used to improve quality (rasterization).

Ad. 3 - If I understand you correctly, then yes.

Ad. 4 - Interlace has its own limitations and there is no sense in using it, especially with displays where each pixel can be accessed independently.

Ad. 5 - Cinema uses a different principle to create the picture, and it is purely progressive (even more so than PDP, LCD, DLP etc.).

    Originally Posted by Narendra View Post
I found in Wikipedia that only CRT displays are capable of interlaced video. Why are the others natively progressive by default?
And there is no future for interlaced video. So can't interlace be used for resolutions greater than 1080i?
With CRTs (almost all CRTs) you can use any resolution as long as you do not go over the limits for the V and H synchronization frequencies. With interlace you can double the number of lines (with the typical two-field interlace scheme; there are very rare displays that use more than 2 fields - mostly used in medical visualization - which can use 3 or 4 fields, so the number of lines is 3 or 4 times bigger at the cost of refresh rate. Usually those displays use phosphor with very long persistence, or even special memory CRT tubes - the tube remembers the picture until it is erased by a special electrical signal - and such tubes can in theory use very complex interlacing to provide an extremely high resolution picture).


    ---
    after some time
    ---

    http://www.blurbusters.com/zero-motion-blur/lightboost/

Perhaps with this technique interlace can be simulated correctly on progressive displays too - not sure - I need to think about this.
    Last edited by pandy; 15th Mar 2013 at 08:56.
  18. Originally Posted by hello_hello View Post
    Originally Posted by pandy View Post
Nope - 60Hz progressive means 120i. Which CRT are we talking about now? Please provide model and brand.
The CRT monitor is an HP P1230.
    Horizontal frequency 30 to 140 kHz
    Vertical Refresh rate 50 to 160 Hz

Use this one:

    http://www.epanorama.net/faq/vga2rgb/calc.html

Stick within the H and V sync frequency limits and double the number of lines - that will keep the interlaced mode within range (yes, the graphics card must output video resized to twice the vertical size, but this can be pure repetition - line doubling - and should not be a problem; the bandwidth remains the same as for progressive). Look at the good side - twice the number of lines for (almost) nothing.

    --
There are plenty of timing calculators available, e.g.:
    http://umc.sourceforge.net/
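
That limit check can also be sketched in a few lines of Python (a rough sanity check only: real modes add blanking lines, so actual sync frequencies run somewhat higher; the constants and helper are illustrative):

Code:
# Check a candidate interlaced mode against the P1230 limits quoted above.

H_SYNC_RANGE = (30e3, 140e3)   # Hz, horizontal (line) frequency
V_SYNC_RANGE = (50, 160)       # Hz, vertical (field) frequency

def mode_fits(total_lines, field_rate_hz, interlaced=True):
    """True if line and field rates fall inside the monitor's ranges.
    For interlace, each field carries half the lines."""
    lines_per_field = total_lines / 2 if interlaced else total_lines
    line_rate = lines_per_field * field_rate_hz
    return (H_SYNC_RANGE[0] <= line_rate <= H_SYNC_RANGE[1]
            and V_SYNC_RANGE[0] <= field_rate_hz <= V_SYNC_RANGE[1])

# e.g. a line-doubled 2048-line interlaced mode at 85 fields/s:
print(mode_fits(2048, 85))     # 1024 * 85 = 87 kHz -> True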
    Last edited by pandy; 15th Mar 2013 at 04:39.
19. I'll give it a go tomorrow and report back. I don't want to do it now because the PC's in the middle of a long conversion and I'd like to let it finish before I try something that might make it crash... or the monitor explode.

    So does anyone know why Windows lists refresh rates for the TV such as 24Hz or 25Hz as interlaced?
    Last edited by hello_hello; 15th Mar 2013 at 06:41.
  20. Originally Posted by hello_hello View Post
    So does anyone know why Windows lists refresh rates for the TV such as 24Hz or 26Hz as interlaced?
    24 Hz is common for watching film based sources (no 60 Hz judder). I haven't seen 26 Hz.
  21. Originally Posted by jagabo View Post
    Originally Posted by hello_hello View Post
    So does anyone know why Windows lists refresh rates for the TV such as 24Hz or 26Hz as interlaced?
    24 Hz is common for watching film based sources (no 60 Hz judder). I haven't seen 26 Hz.
    Sorry, that was a typo. I've changed it.
    I'm trying to understand though, as per the screenshot in my earlier post, why Windows lists the refresh rates of 24Hz, 25Hz and 30Hz in Display Properties as "interlaced". I tried them and they certainly appear to be progressive, and if I select 24Hz (as an example) the TV displays 24Hz when it switches the refresh rate.
  22. Originally Posted by hello_hello View Post
I'll give it a go tomorrow and report back. I don't want to do it now because the PC's in the middle of a long conversion and I'd like to let it finish before I try something that might make it crash... or the monitor explode.

    So does anyone know why Windows lists refresh rates for the TV such as 24Hz or 25Hz as interlaced?

Modern CRTs are protected against incorrect parameters - don't worry. After setting a new mode, if there is no video, just wait (at least on Windows) 20 seconds and the old mode will come back.

Perhaps because the developer was confused, like many people, and for him anything lower than 60 (50) Hz is interlace.

    btw

As a 24Hz mode is simply unrealistic (too low a V sync frequency), go for a 48Hz or, even better, a 72Hz mode (exactly as in cinema, where the light source is interrupted 2 or 3 times per movie frame to reduce perceived flicker).


