VideoHelp Forum




  1. Why doesn't a computer monitor have the capability to display interlaced video?
  2. Computer monitors and LCD/Plasma TVs are progressive, but they can display interlaced video, it's just got to be de-interlaced somewhere in the playback chain to display properly. Usually either the decoder, player or video card would do it, but it's probably got to be correctly identified as interlaced video for it to be de-interlaced automatically.
  3. Hi hello_hello,

    What is the difference between TVs and computer monitors here?
    If interlaced video halves the bandwidth, why don't we use interlaced video for computer monitors?
    How is a computer monitor different from a CRT monitor?
  4. Computer monitors can be CRT monitors. I'm using a CRT computer monitor as I type. It's progressive in that it only accepts a progressive input. I guess that's the main difference between TVs and computer monitors, regardless of the type of monitor. LCD/Plasma TVs can accept an interlaced input and de-interlace it to display it on their progressive screen. Computer monitors can't, but the video card in a PC should be able to if it knows the video is interlaced.

    Interlaced video isn't actually 29.97 or 25 frames per second etc. Well it is.... but it's 59.94 or 50 fields per second, with each field being half a frame. Each field is actually every second scan line of a frame, and combined they make up the full frame. There's advantages and disadvantages to interlaced video. You might be better off looking up interlacing on wikipedia for a detailed explanation as to why it can be a good thing and also a bad thing. It'll no doubt explain it better than I can, and save me a lot of typing.

    Interlacing had/has a place in video, but I doubt there's any reason to use it for actual computer monitors. The monitor is connected directly to the PC so there wouldn't be the same bandwidth issues. And it probably wouldn't work well anyway.
  5. Computers were natively progressive (with a few exceptions like the original IBM CGA card). So there was no need for interlaced support in computer monitors. TV was natively interlaced. There was no need for progressive support in TVs.

    Interlaced video uses only half the bandwidth but it also delivers an inferior, flickery picture. It was a compromise for broadcast back in the 1940s.
    Last edited by jagabo; 6th Mar 2013 at 07:54.
  6. Hi All,

    Thanks for replies.
  7. Hi Jagabo,

    1. TVs are only for watching video, not for any other purpose, so the interlaced format (~60 fields/sec) is fine for them.
    2. Computer monitors are not only for watching video; they have several other uses, which is why their input is progressive only. Otherwise we would see a flickery image in the interlaced case.

    Are my assumptions right?
  8. Here's what one of the fields that make up an interlaced frame looks like:

    [Image: field.jpg]

    Here's what interlaced video looks like on a progressive screen when the two fields which make up a frame are simply combined. Every second line makes up one field, while the alternate lines make up the next field, but each field represents a slightly different moment in time. So when there's movement between each field, the result is the lines you see.

    [Image: combined.jpg]

    It worked fine for CRT TVs because that's the way the video is drawn on the screen, and it all happens so fast you can't see it so everything looks fluid. For a progressive display though, at some point those two fields need to be combined to make up a single frame which represents a single moment in time. The process for doing so is de-interlacing. There's various methods for doing this, and none is really perfect, but as a general rule the two fields can be combined and the resulting progressive frame looks like it was always just a single frame.

    [Image: deinterlaced.jpg]
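
    As a rough illustration only (a minimal Python/NumPy sketch with assumed 720x480 dimensions, not anyone's actual player code), weaving two fields into a frame and doing a crude line-doubling "bob" deinterlace look like this:

    Code:
    import numpy as np

    def weave(top_field, bottom_field):
        # Interleave two fields (each h/2 x w) into one full h x w frame.
        # With motion between the fields, this produces the comb lines shown
        # in combined.jpg above.
        h2, w = top_field.shape
        frame = np.empty((h2 * 2, w), dtype=top_field.dtype)
        frame[0::2] = top_field       # lines 0, 2, 4, ... (field order assumed)
        frame[1::2] = bottom_field    # lines 1, 3, 5, ...
        return frame

    def bob(field):
        # Crude deinterlace of a single field: double every line.
        # Halves vertical detail but avoids combing; real deinterlacers do better.
        return np.repeat(field, 2, axis=0)

    # toy example: a bright bar that moves 4 pixels between the two fields
    top = np.zeros((240, 720), dtype=np.uint8)
    bottom = np.zeros((240, 720), dtype=np.uint8)
    top[:, 100:200] = 255
    bottom[:, 104:204] = 255

    combed = weave(top, bottom)    # what "just combining the fields" looks like
    clean = bob(top)               # one progressive frame from the top field only
    print(combed.shape, clean.shape)   # (480, 720) (480, 720)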
  9. Hi hello_hello,

    Thanks for the reply.
  10. Originally Posted by Narendra View Post
    Why doesn't a computer monitor have the capability to display interlaced video?

    It has FULL capability to display both types of content (progressive and interlaced).
  11. Computer monitors generally don't have the ability to display interlaced video in its native format. Most don't even accept interlaced input. It's the computer the monitor is attached to that does the conversion to progressive.
  12. Originally Posted by jagabo View Post
    Computer monitors generally don't have the ability to display interlaced video in its native format. Most don't even accept interlaced input. It's the computer the monitor is attached to that does the conversion to progressive.
    This is a complex problem - any computer display (with some limitations related to H and V sync frequencies) can accept and display an interlaced signal, however not many computer displays that are progressive by technology (LCD, PDP, OLED and similar) can display interlaced video correctly (i.e. by deinterlacing it before displaying).

    CRT computer displays are quite happy (they work without problems) with any interlaced video generated by the video card, as long as the video signal fits within their H and V sync frequency range.

    A display (CRT, but also LCD, PDP, OLED and other technologies) needs horizontal and vertical synchronization pulses.

    The H and V sync pulses must be within certain limits - if the video signal fits within the H and V sync range, the display has no problem showing either progressive or interlaced video.
    Last edited by pandy; 13th Mar 2013 at 08:24. Reason: complains - fixed (?) and reordered
  13. Originally Posted by pandy View Post
    CRT computer displays are happy with any interlaced video generated by the card, as long as the video signal fits within the H and V sync frequency range.
    Can you explain that in English? I'm not aware of interlaced video ever displaying correctly on my CRT monitors unless it's de-interlaced somewhere in the playback chain. Mind you I don't know how to test it. If the video has an interlaced flag then the video card will de-interlace it if nothing else does, so I don't know how to output interlaced video to the monitor while telling it the video is interlaced, as there's no interlaced option when setting the refresh rate for the monitor as there is when connecting the PC to the TV. As soon as the video is flagged interlaced, the video card de-interlaces it. At least that's how I assume it works.
  14. Originally Posted by hello_hello View Post
    Originally Posted by pandy View Post
    CRT computer displays are happy with any interlaced video generated by the card, as long as the video signal fits within the H and V sync frequency range.
    Can you explain that in English? I'm not aware of interlaced video ever displaying correctly on my CRT monitors unless it's de-interlaced somewhere in the playback chain.
    The part I bolded is the critical part. Most computer monitors aren't designed to sync to the horizontal and vertical sync rates of standard interlaced TV signals. Because, traditionally, there has been no need. It's only in recent years that TVs and computers have started to converge.
  15. Originally Posted by hello_hello View Post
    Originally Posted by pandy View Post
    CRT computer displays are happy with any interlaced video generated by the card, as long as the video signal fits within the H and V sync frequency range.
    Can you explain that in English?
    I re-edited my previous statement - though probably still not good enough for you.

    Originally Posted by hello_hello View Post
    I'm not aware of interlaced video ever displaying correctly on my CRT monitors unless it's de-interlaced somewhere in the playback chain. Mind you I don't know how to test it. If the video has an interlaced flag then the video card will de-interlace it if nothing else does, so I don't know how to output interlaced video to the monitor while telling it the video is interlaced, as there's no interlaced option when setting the refresh rate for the monitor as there is when connecting the PC to the TV. As soon as the video is flagged interlaced, the video card de-interlaces it. At least that's how I assume it works.
    Deinterlacing is only required for a progressive display (otherwise interlaced video is displayed with an error - two fields shown at the same time) - any modern graphics card should be capable of providing an analog interlaced signal that a CRT display can show.
    Only the H and V sync frequencies must be within those specified for the CRT.

    To see interlaced video, an interlaced video mode must be selected by the renderer - on NVIDIA cards the user can define a custom, non-standard mode or select a predefined one (for example 1080i25 is available in the preset list - then pixel clock = 74.25 MHz, Vsync = 50 Hz, Hsync = 28.125 kHz). It is similar for ATI (AMD) graphics cards.

    Basic signal parameters are given, for example, in http://www.silabs.com/Support%20Documents/TechnicalDocs/AN377.pdf
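
    As a quick sanity check on those preset numbers (a sketch assuming the standard HD raster totals for 50 Hz systems - 1125 total lines per frame and 2640 total pixels per line):

    Code:
    # rough timing check for 1080i25 (1080i at 50 fields per second)
    total_lines_per_frame = 1125      # 1080 active lines + blanking (assumed standard total)
    total_pixels_per_line = 2640      # 1920 active pixels + blanking (assumed standard total)
    frame_rate = 25.0                 # frames per second
    field_rate = frame_rate * 2       # two interlaced fields per frame

    h_sync_hz = total_lines_per_frame * frame_rate        # line (Hsync) rate
    pixel_clock_hz = total_pixels_per_line * h_sync_hz    # pixel (dot) clock

    print(f"Vsync: {field_rate:.0f} Hz")                   # 50 Hz
    print(f"Hsync: {h_sync_hz / 1e3:.3f} kHz")             # 28.125 kHz
    print(f"Pixel clock: {pixel_clock_hz / 1e6:.2f} MHz")  # 74.25 MHz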
  16. Originally Posted by jagabo View Post
    Originally Posted by hello_hello View Post
    Originally Posted by pandy View Post
    CRT computer displays are happy with any interlaced video generated by the card, as long as the video signal fits within the H and V sync frequency range.
    Can you explain that in English? I'm not aware of interlaced video ever displaying correctly on my CRT monitors unless it's de-interlaced somewhere in the playback chain.
    The part I bolded is the critical part. Most computer monitors aren't designed to sync to the horizontal and vertical sync rates of standard interlaced TV signals. Because, traditionally, there has been no need. It's only in recent years that TVs and computers have started to converge.
    Most digital displays (analog controlled by an MCU, or fully digital) are capable of supporting a very wide range of H and V frequencies - the problem is usually how they are programmed. Vendors simply focus on the PC market, where VGA/NTSC-related frequencies dominate, so they limit the supported video modes to roughly Hsync around 31 kHz and Vsync around 60 Hz. However, it is important to understand that the same components are sometimes also used in displays that support much lower H and V sync frequencies.

    A simple workaround for this is to double the number of lines, which doubles the Hsync (the video mode becomes 1920x2160i25, where Vsync remains 50 Hz but Hsync becomes 56.25 kHz) - CRT displays usually have no problem with such a signal (I've tried it - it works).


    And to simplify - every analog (or analog-emulating) display is so dumb that it will try to display anything. Smarter (all modern) displays have circuitry that checks the H and V sync frequencies - if the incoming signal fits within the specified range, it will be displayed. CRTs are simply more flexible than other display technologies.
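
    As a rough sketch of that "does the signal fit the sync window" check (the monitor limits below are made-up typical values, not from any datasheet), this also shows why the line-doubled mode helps - standard 1080i25 has an Hsync below the ~30 kHz floor of a typical PC monitor, while the doubled mode lands comfortably inside it:

    Code:
    def sync_rates(total_lines_per_frame, frame_rate):
        h_khz = total_lines_per_frame * frame_rate / 1e3   # line rate in kHz
        v_hz = frame_rate * 2                              # field rate in Hz
        return h_khz, v_hz

    def fits(h_khz, v_hz, h_range=(30.0, 96.0), v_range=(50.0, 160.0)):
        # hypothetical PC CRT sync window, for illustration only
        return h_range[0] <= h_khz <= h_range[1] and v_range[0] <= v_hz <= v_range[1]

    for name, lines, fps in [("1080i25 (1125 total lines)", 1125, 25),
                             ("1920x2160i25 (2250 total lines, line-doubled)", 2250, 25)]:
        h, v = sync_rates(lines, fps)
        print(f"{name}: Hsync {h:.3f} kHz, Vsync {v:.0f} Hz, fits: {fits(h, v)}")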
    Last edited by pandy; 13th Mar 2013 at 09:08.
  17.
    I'd prefer that the PC display interlaced video the way it does, as interlaced or telecined, so I can see what the heck I'm doing. I suppose it would matter if you're simply using your PC as a player. But for a player, I spent plenty of $$$$ and time on components and TVs that do it correctly.
    Last edited by sanlyn; 26th Mar 2014 at 05:44.
  18. Originally Posted by pandy View Post
    the problem is usually how they are programmed (vendors simply focus on the PC market
    And that is my whole point.
  19. Originally Posted by jagabo View Post
    Originally Posted by pandy View Post
    the problem is usually how they are programmed (vendors simply focus on the PC market
    And that is my whole point.
    And my point is that PC displays can display interlaced video, but usually with an error (two fields shown at the same time) - with some effort, PC-type CRT displays can be used to display interlaced video correctly.
  20. Originally Posted by pandy View Post
    And my point is that PC displays can display interlaced video, but usually with an error (two fields shown at the same time)
    No, they're displaying progressive video -- the computer is sending a progressive signal. It just happens to contain data from an interlaced source.
  21.
    Originally Posted by jagabo View Post
    Originally Posted by pandy View Post
    the problem is usually how they are programmed (vendors simply focus on the PC market
    And that is my whole point.
    Exactly. They could make a computer monitor that plays MP3s from a USB stick or make a computer monitor that doubles as an alarm clock. I have seen TVs that can do those things. ...but there is no reason to add these features to computer monitors, just as there is no reason to make a computer monitor that accepts interlaced video input and displays it correctly. Manufacturers only need to make computer monitors that do one thing, display progressive output from a video card, so that is what they do. If somebody wants a display they can use with a computer that also displays interlaced video correctly, they get a TV with a video input set aside for PC use.
  22. Originally Posted by jagabo View Post
    Originally Posted by pandy View Post
    And my point is that PC displays can display interlaced video, but usually with an error (two fields shown at the same time)
    No, they're displaying progressive video -- the computer is sending a progressive signal. It just happens to contain data from an interlaced source.
    If your current video mode is interlaced then you will get an interlaced picture - if you don't trust me, check for yourself.

    You can hardly expect an interlaced picture when your video source is progressive.
  23. Originally Posted by pandy View Post
    Originally Posted by jagabo View Post
    Originally Posted by pandy View Post
    And my point is that PC displays can display interlaced video, but usually with an error (two fields shown at the same time)
    No, they're displaying progressive video -- the computer is sending a progressive signal. It just happens to contain data from an interlaced source.
    If your current video mode is interlaced then you will get an interlaced picture
    Except of course, and once again, most computer monitors are not programmed to accept interlaced video signals.
  24. Originally Posted by usually_quiet View Post
    just as there is no reason to make a computer monitor that accepts interlaced video input and displays it correctly.
    Interlacing is only one way to deal with limited bandwidth, and interlacing was used by computers many years ago as a standard method for high-resolution graphics; it can still be used for this. The problem is that the most popular display technologies nowadays are progressive and fixed-resolution, so interlacing makes no sense for them. However, if the display is a CRT, even a modern PC is capable of displaying interlaced video correctly.

    One popular example of an interlaced PC display is https://en.wikipedia.org/wiki/IBM_8514

    http://members.chello.at/theodor.lauppert/computer/ps2/8514a.htm
    http://books.google.nl/books?id=KjwEAAAAMBAJ&pg=PA51&redir_esc=y#v=onepage&q&f=false

    There were many interlaced displays in the past, and interlacing is not limited to the TV world.
  25. Originally Posted by jagabo View Post
    Except of course, and once again, most computer monitors are not programmed to accept interlaced video signals.
    Once again - for a CRT, if your video signal fits within the accepted H and V sync frequencies then you can display interlaced video, and the display doesn't care whether the video mode is interlaced or not - it is up to the source to create the correct temporal structure of the signal, and any modern PC graphics card should be capable of creating normal interlaced video.
  26.
    Originally Posted by pandy View Post
    Originally Posted by usually_quiet View Post
    just as there is no reason to make a computer monitor that accepts interlaced video input and displays it correctly.
    Interlacing is only one way to deal with limited bandwidth, and interlacing was used by computers many years ago as a standard method for high-resolution graphics; it can still be used for this. The problem is that the most popular display technologies nowadays are progressive and fixed-resolution, so interlacing makes no sense for them. However, if the display is a CRT, even a modern PC is capable of displaying interlaced video correctly.

    One popular example of an interlaced PC display is https://en.wikipedia.org/wiki/IBM_8514

    http://members.chello.at/theodor.lauppert/computer/ps2/8514a.htm
    http://books.google.nl/books?id=KjwEAAAAMBAJ&pg=PA51&redir_esc=y#v=onepage&q&f=false

    There were many interlaced displays in the past, and interlacing is not limited to the TV world.
    More pointless drivel. This is NOT the past, nor are your inane thoughts about what a monitor should be able to do in theory of any use to the OP. The fact is that manufacturers of today no longer make consumer-grade computer monitors that can properly handle interlaced video input, unless the monitor is actually a TV.
  27.
    Interlacing was developed because of the phosphors used on the inside of CRT screens.

    Ever see an old green radar screen in an old movie? It fades, the image fades.
    That is the persistence, or lack of it, in the phosphor coating inside the CRT.

    When TV was developed,
    they discovered that the top of the frame would fade by the time the raster (electron scan beam) got to the bottom,
    so they split the beam scan into two fields, every other scan line:
    one scan, the top field, draws all the odd lines 1-3-5-7 etc.,
    the bottom field all the even lines 2-4-6-8 etc.,
    so before the first scan starts to fade, the second field is being displayed.

    This does NOT save bandwidth; 60 fields a second uses the same amount of data as 30 frames a second.
    Last edited by theewizard; 13th Mar 2013 at 12:24.
  28. Originally Posted by usually_quiet View Post
    More pointless drivel. This is NOT the past, nor are your inane thoughts about what a monitor should be able to do in theory of any use to the OP. The fact is that manufacturers of today no longer make consumer-grade computer monitors that can properly handle interlaced video input, unless the monitor is actually a TV.
    As almost always, you are off topic - the OP never asked about practical things - CRTs are still available and you can buy them without problems.
    And my point is clear: any modern CRT can display interlaced video correctly, progressive-type displays can display interlaced video only with errors, and any modern graphics card can output interlaced video.
    If you don't know how to configure the HW and SW, then pay for professional support, but don't blame me because you have no idea how to do it.
    Last edited by pandy; 13th Mar 2013 at 12:13.
  29. Originally Posted by theewizard View Post
    Ever see an old green radar screen in an old movie? It fades, the image fades.
    That is the persistence, or lack of it, in the phosphor coating inside the CRT.
    Phosphor persistence can be very short (below a nanosecond) or very long - tens of minutes - and with special CRTs even longer (storage CRTs, as provided by Tektronix in their graphics terminals).

    Longer phosphor persistence reduces interlace flickering - older CRTs are better here than newer ones, since newer CRTs use phosphors with lower persistence, so refresh rates below 60 Hz can be unpleasant - 50 Hz flickers even for progressive.

    Originally Posted by theewizard View Post
    This does NOT save bandwidth; 60 fields a second uses the same amount of data as 30 frames a second.
    It does save bandwidth - a static picture can have full spatial resolution (reduced by the Kell factor), while for motion it gives twice the temporal resolution at the cost of vertical spatial resolution. This matches the human visual system well: full resolution is achieved for static or slowly moving objects, while for fast-moving objects the spatial resolution is reduced anyway.

    From this point of view, interlacing saves bandwidth and provides the best quality for a given bandwidth.
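
    A rough back-of-the-envelope comparison (illustrative raw luma numbers only) shows both sides of this: 576i50 moves the same number of lines per second as 576p25, yet delivers 50 motion samples per second like 576p50 at half the raw data:

    Code:
    # raw uncompressed luma line/data rates for 576-line video (illustration only)
    pixels_per_line = 720

    modes = {
        "576i50 (50 fields x 288 lines)": 288 * 50,
        "576p25 (25 frames x 576 lines)": 576 * 25,
        "576p50 (50 frames x 576 lines)": 576 * 50,
    }

    for name, lines_per_second in modes.items():
        mb_per_s = lines_per_second * pixels_per_line / 1e6   # 1 byte per luma sample
        print(f"{name}: {lines_per_second} lines/s, ~{mb_per_s:.1f} MB/s raw luma")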
  30.
    Originally Posted by pandy View Post
    Originally Posted by usually_quiet View Post
    More pointless drivel. This is NOT the past, nor are your inane thoughts about what a monitor should be able to do in theory of any use to the OP. The fact is that manufacturers of today no longer make consumer-grade computer monitors that can properly handle interlaced video input, unless the monitor is actually a TV.
    As almost always, you are off topic - the OP never asked about practical things - CRTs are still available and you can buy them without problems.
    And my point is clear: any modern CRT can display interlaced video correctly, progressive-type displays can display interlaced video only with errors, and any modern graphics card can output interlaced video.
    If you don't know how to configure the HW and SW, then pay for professional support, but don't blame me because you have no idea how to do it.
    The OP asked "Why doesn't a computer monitor have the capability to display interlaced video?" The OP also asked "How is a computer monitor different from a CRT monitor?" That sounds like the OP wants information about something current, concrete and practical regarding LCD monitors, not a theoretical discussion of what would be possible if typical consumer computer monitors were made differently than they are now.

    No, I can't get a CRT monitor without problems. Nobody sells newly made consumer CRT computer monitors here anymore. Thrift shops don't even want them. That has been the case for a few years. The only new CRT monitors for sale now are specialty items intended for manufacturers to use in cash registers, or vending kiosks, or for professional video work, and other non-consumer applications. They are not much use to anyone who wants a display for their PC. Plus, the OP eventually made it clear that he wanted to know why non-CRT computer monitors can't display interlaced video.

    I don't need professional help with my hardware or software. They are working exactly as their makers intended them to work. Trying to make them do something the manufacturer does not recommend or allow under normal circumstances would be a truly stupid waste of my time.
    Last edited by usually_quiet; 13th Mar 2013 at 17:49. Reason: brevity, clarity.


