Are there any advantages to using an analog TV/monitor compared to the newer, more technologically advanced digital monitors?
I'm not sure, but it seems to me that they (the analog ones, I suppose) are better suited for viewing lower-resolution videos.
Every TV we own in our household is analog except my computer's 22" LCD-TFT and a couple of laptops with their LCD screens, and from my observation, media at resolutions close to VGA seems to play back better on those analog sets.
I don't know why this is; maybe they use better filters, or maybe the whole scanning method works in their favor. But when I watch that kind of media on my 22" LCD-TFT, I can notice the noise and the artifacts of the upscaling.
I'd really like to hear what you have to say about all of this.
-
True interlaced display, true black (not a dark gray) and a much, much larger contrast ratio. Also, the gamma response of TVs is different from that of PC monitors; the latter are comparatively more washed out and less vibrant.
I only use a CRT display to monitor my video. I wish there were affordable HD CRT displays so that I could watch my HDV (interlaced) the way it's supposed to be seen, without scaling etc.
Having said all that(!), the reason to use a CRT video monitor (as opposed to a CRT computer monitor) is so you know what your video will look like on a regular TV. If you intend to watch your video only on a PC display, you should monitor it on a PC display. Professional monitoring still uses CRT (even for HD). There are broadcast quality LCD/plasma monitors but they are hellishly expensive.
Using a CRT TV, though, is not the same as using a proper CRT video monitor. Consumer TVs are not designed for accuracy but rather for affordability. -
I've got two Sony HD CRTs and a Sony HD LCD TV. They're consumer grade but relatively high end, which means they have some features/components similar to professional-quality monitors; at least that's my understanding.
I usually play the footage for any new project on all of them and always play the finished project so I can get a good idea of what is going out the door.
LCD's are unforgiving and harsh. A pixel's a pixel and it better be done right or you'll see it.
CRTs have a forgiving quality that makes things look very good.
Most of my audience will be watching on LCD TV's. So I need to make my stuff look good for LCD's.
My HD CRT's put me in a small minority, because most people have HD LCD's or SD CRT's - but it doesn't stop me from loving the way things look on them.
Plus I second everything JohnnyMalaria wrote. -
CRT > LCD forever.
LCD will die and something will replace it before it ever gets as good as a CRT. Been waiting 6 years for ANY LCD to prove me wrong. -
Ditto JohnnyMalaria, but I'd go further and say 480i and 576i almost always look better on a good CRT vs. LCD or plasma.
The problem for CRT is dealing with higher resolutions and screen size. CRTs maxed out around 34" and 1440x1080 resolution, but those sets became very heavy and impractical. For SD and ED, a direct-view CRT gives the best black level and best contrast. LCD needs to play backlight tricks to come close. Plasma has good blacks but less contrast.
-
Guys, thanks for verifying that my instincts were right. I thought I was going crazy for thinking that a 20-year-old cheap CRT TV made SD-resolution media look better than my LG LCD-TFT PC monitor.
But I guess that's the way it is. I also completely agree with the notion that LCDs are unforgiving and a pixel is a pixel.
Could anyone explain in a little more detail why this happens? Why do CRTs seem to mask existing noise, while on LCD monitors you notice it, especially when standing close to the screen? -
An LCD-TFT computer monitor will look worse than an LCD TV, so that is an unfair comparison in the other direction. Computer LCD monitors are optimized for progressive RGB rather than interlaced YUV, have more linear gamma, and have very poor black levels compared to an LCD TV, so at least compare a CRT to an LCD TV. A computer monitor makes a poor TV.
-
dLee - what HD CRTs are they? I really don't want to get an HD LCD TV when my 12 year old Trinitron finally croaks (I've resuscitated it a couple of times)...
-
Originally Posted by edDV
A computer monitor makes a poor TV.

So basically they're only good for running applications and playing games? They're not optimized at all to handle media playback?
-
Sony WEGA KV-36XBR800, 36" 4:3; DVI, component, S-Video and composite inputs. As edDV mentioned, it's a beast. The box said it weighed 270 lbs, but the manual says 240 lbs. It takes two guys in relatively good shape to lift it.
Sony Trinitron WEGA KV-30HS510, 30" 16:9; DVI, component, S-Video and composite inputs. When I bought it I thought this one was top of the line, but it's not. My Dish HD programming looks great. For sports in HD it's the best I've seen: no motion blur, smearing, or whatever the correct term is. That might have something to do with the (relatively) small screen size or the difference between the two display technologies.
My son got the top of the line version (XBR, I think) of the 510 in 34". He's got a good LCD TV, but he's recently fallen in love with the CRT again after taking it out of storage.
I think edDV looked them up once and thought that they would max out at 720 (fairly sure they both do i and p). They both will correctly display a 1080 feed from a Dish box via an HDMI-to-DVI adapter. -
With a good software player (e.g. PowerDVD) you get good results from DVD or computer media files but reception of live TV for display on the computer monitor is less optimal than pass through (HDMI) to an LCD or plasma HDTV that has better image processing hardware. Even if the display card can match the processing hardware of the TV, the computer monitor still lacks black level processing or dynamic contrast.
-
And even then, you will find many who say LCDs are not good for games, at least the ones where reaction time counts, like FPS-type games, because of input lag.
Also, I think it's important to say these are broad-based generalizations being discussed here. Certain models are pure crap; certain ones have very high quality. For example, you cannot put S-IPS and PVA or TN panels in the same class; they have distinct characteristics. You can't compare a $5000 Eizo to a $500 Hazro. -
Originally Posted by dLee
I think edDV looked them up once and thought that they would max out at 720 (fairly sure they both do i and p). They both will correctly display a 1080 feed from a Dish box via an HDMI-to-DVI adapter.

OK, now I'm a little confused. Aren't all CRTs supposed to be analog, thus only handling an interlaced picture and analog inputs? How can you put DVI on an analog CRT and also have progressive scan? Unless it's somehow digital as well.
Originally Posted by edDV
With a good software player (e.g. PowerDVD) you get good results from DVD or computer media files but reception of live TV for display on the computer monitor is less optimal than pass through (HDMI) to an LCD or plasma HDTV that has better image processing hardware.

I'm not interested by any means in TV reception on a PC monitor, only media files stored on the drive or optical media exclusively.
I've used lots of players. Media Player Classic and Zoom Player are my favorites, and my opinions are based on the results from those two. But if it's a monitor limitation, then I guess you'll never be satisfied no matter how well written the software is.
Right, I always wondered why EIZOs are so darn expensive. You can get a 22" LG or Samsung for 200-300 Euro, but EIZOs cost 500-700 at this screen size, and it makes me wonder why. The posted specs seem to be the same, so what gives? Hazro, on the other hand, I don't know at all... -
We also have a Sony WEGA 52" (don't know the exact model). It's very old, but it was top of the line when my father bought it some years back. It's a rear-projection type, though, and I find that it shines terribly, and tweaking it darker gives strange results. Brightness/contrast issues aside, I find it does a good job of processing the signal and displaying it on such a large screen. -
Interlaced vs. progressive has nothing to do with analog vs. digital. Traditional PC monitors (VGA) are provided with analog signals from the graphics card but can display interlaced and non-interlaced signals.
On my nVidia card, I connect the secondary output to a video monitor via S-Video. The card's output is set to 720x480i. Playing DV files on the secondary display gives a picture on the video monitor as good as if the camcorder were connected directly. -
Depends on the analog TV. The classic analog CRT TV (e.g. Sony XBR of the 90's) optimized early for NTSC and PAL with comb filters but added S-Video and better sound for Laserdisc. Pro monitors were doing more but I'll skip that. The main influence on CRT resolution came from the Trinitron/Shadowmask medical and computer monitor business.
When DVD emerged, a higher than broadcast quality source was available. Interlace 480i/576i required the same display solution as laserdisc. DVD added progressive scan and analog components to the consumer market. 24p progressive DVD (movies only then) could output from players at 59.94 frames per sec (NTSC land) or 50 frames per sec (PAL land) so monitors adapted with progressive analog component at the high end. CRT "dot pitch" was reduced with more or less direct analog display from analog component 720x480p/59.94 input and CRT H scan was switched in for 4:3 or stretched 16:9 display. This was 1998 or so at the high end.
Next HDTV pushed the limits for higher CRT resolution. Dot pitch was made finer but the action was behind the shadow mask/aperture grill. Beam scan was increased vertically to 1080i lines (540 per field). HD progressive scan to 720p was reached on a few higher end CRT models. Horizontal resolution was limited by the display. 720 H (~5.75MHz lines of resolution) was the max required for DVD, H stretched 800-960 (~8MHz) was good enough for 720p broadcast.
On the interlace side 1440x1080i (540 lines per field) was the top resolution but for 34-36" displays this was more than enough. It was black level, contrast and analog frequency rolloff that gave CRT the picture quality edge.
Later, digital processors were added to force analog and digital inputs into a digital chipset, causing deinterlace and scan conversion to a 960x540p or 640x480p framebuffer that fed the CRT. That is when CRT TV sets began to show 8-bit digital artifacts, especially in 480i to 480p processing mode. These early processor chipsets had poor performance. CRT displays still looked good at DVD 480i/576i and "1080i" scalings. Some CRT TV sets kept an analog 480p path for DVD.
-
Originally Posted by therock003
I'm not interested by any means in TV reception on a PC monitor, only media files stored on the drive or optical media. But if it's a monitor limitation, then I guess you'll never be satisfied no matter how well written the software is.

Many variables:
Source: interlace vs. progressive
Player: deinterlace or realtime IVTC
Display card: Dumb RAMDAC or good hardware assist
Monitor: limits quality but is small.
PS: Oops, I fixed an HTML error above.
-
Wait, so an analog monitor can display progressive scan? Interesting. Then why is interlaced still used? I always wondered what the use of an interlaced signal is, since it is supposedly degraded compared to progressive (at least that's what everyone keeps saying). Is it so that people with old TV sets can still receive over-the-air broadcasts? That seriously bugs me. Can someone please explain what's good about interlaced that it's still around?
Originally Posted by edDV
Many variables:
Source: interlace vs. progressive
Player: deinterlace or realtime IVTC
Display card: Dumb RAMDAC or good hardware assist
Monitor: limits quality but is small.

Thanks for the historical timeline; I'll get back to that with some questions later. For now, let's address the variables. My sources are always progressive, so there are no deinterlacing issues. They're mostly HDTV rips at resolutions of approximately 624x352, and my monitor is set to 1680x1050.
My card is an ATI HD3870, which I guess doesn't make for a poor RAMDAC (or does it?).
And what do you mean by "Monitor: limits quality but is small"?
Are there any other such variables to consider? -
Some analog monitors can display progressive, but not your average TV set.
Examples:
VGA computer monitors
Medical or scientific monitors (ex. X-Rays, Oil exploration)
"HD Ready" CRT TV (native display of DVD 720x480p/59.94 fps video)
"HDTV rips at resolutions of approximately 624x352" are well outside the conventional and are extremely low resolution; they may be from an HD source, but all the HD has been squeezed out. For comparison, SD DVD 16:9 is stored at 720x480 (852x480 expanded) or 720x576 (1024x576 expanded).
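To make those stored-vs-expanded figures concrete, here is a quick sketch of the anamorphic math (a simplified illustration that glosses over the 704-vs-720 active-width subtlety of real DVDs, which is why it gives 853 rather than the 852 quoted above):

Code:
def display_width(stored_height, dar_w, dar_h):
    # Width needed to show anamorphic video with square pixels
    # at a given display aspect ratio (DAR).
    return round(stored_height * dar_w / dar_h)

print(display_width(480, 16, 9))  # 853 ~ the 852x480 figure above
print(display_width(576, 16, 9))  # 1024 -> 1024x576
print(display_width(480, 4, 3))   # 640 -> 640x480 for 4:3 material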
An ATI HD3870 notwithstanding, expansion from storage resolution is always lower quality vs. display from native resolution or downsized from higher resolution.
-
Interlacing gets an undeservedly bad press. If analog TV broadcasts were 29.97p instead of 59.94i, there would be protests about the flickering. Interlacing came about in the 1930s as a way to make the image look fluid while fitting within the available transmission bandwidth. It works well with CRTs thanks to the persistence of the phosphor dots.
I prefer 59.94i to 29.97p, and when I travel to Europe, the TV seems to flicker very noticeably (50i). But I'm very sensitive to flickering - e.g., LED tail lights on cars that have been done on the cheap. I can even tell you the duty cycle. -
Yes, it takes me weeks to adjust to Euro PAL. Flicker is worst in peripheral vision.
This used to be a point of controversy with our UK office until I hired an engineer from there to work in California. After a few months here, he went back to the UK and couldn't stand the flicker.
BTW, interlace gets you twice the channels in the same bandwidth. Would you give up half your 29.97i channels for progressive? Or keep the same number of channels with jumpy 29.97p (no telecine solution)?
-
CRTs are still very much used in broadcast. Just look at the wall of control room monitors in any newscast...
-
Originally Posted by edDV
Some analog monitors can display progressive, but not your average TV set.

And I suppose the opposite holds true for displaying interlaced content on a digital monitor. But is having a file stored with interlaced content the same as having an interlaced feed passed through the cables, or is there a difference?

Originally Posted by edDV
"HDTV rips at resolutions of approximately 624x352" are well outside the conventional and are extremely low resolution; all the HD has been squeezed out.

Wait, though: isn't an HDTV source better than a DVD source? HDTV rips always seem to look worse on my screen than DVD rips of the same resolution. Is the HDTV content broadcast on channels worse than what comes out on DVD? (I'm talking in terms of TV shows, where DVD rips seem to look better than HDTV rips, though of course not better than the actual 720p or 1080p content they were ripped from.)

And by "downsized from higher resolution", do you mean something like HD material downsized for display on SD-resolution CRTs or other non-HD monitors?

Finally, when you say "interlace gets you twice the channels in the same bandwidth", what exactly do you mean by channels? -
Originally Posted by JohnnyMalaria
Just look at the wall of control room monitors in any newscast...

TCM?
Has anyone found a marketplace for HD CRT's?
I've given up on eBay for most things, and the HD CRT is not easily shipped. -
Originally Posted by therock003
But is having a file stored with interlaced content the same as having an interlaced feed passed through the cables, or is there a difference?

LCD/plasma monitors only display progressive. Incoming interlaced material must be deinterlaced or inverse telecined (for film material). Deinterlacing is a lossy process.
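To illustrate the film case, here is a simplified sketch of 3:2 pulldown and why inverse telecine can undo it cleanly (field pairing is glossed over; the frame letters are made up for the demonstration):

Code:
# Film frames A B C D (24 fps) are spread over 10 fields (59.94i)
# in a repeating 2:3 cadence.
film = ["A", "B", "C", "D"]
fields = []
for i, frame in enumerate(film):
    fields += [frame] * (2 if i % 2 == 0 else 3)
print(fields)  # ['A','A','B','B','B','C','C','D','D','D']

# Inverse telecine drops the repeats and recovers the original frames
# exactly. True interlaced video has no repeats to drop - every field
# is a unique moment in time - so deinterlacing it must estimate.
recovered = [f for i, f in enumerate(fields) if i == 0 or f != fields[i - 1]]
print(recovered)  # ['A', 'B', 'C', 'D']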
Originally Posted by therock003
Isn't an HDTV source better than a DVD source?

Most of the time a higher quality source will compress better. That covers resolution, signal-to-noise ratio, and the number of times the source has been recompressed. TV broadcasts often get resized and/or recompressed several times, so on that count a commercial DVD has the advantage.
Originally Posted by therock003
You mean something like HD material downsized for display on SD-resolution CRTs?

I was saying the ATI HD3870 can't expand 624x352 into an acceptable SD or HD display. Upward rescaling is just interpolation.
Originally Posted by therock003
What exactly do you mean by channels?

I meant broadcast channels, but this applies to any form of storage or transmission. Progressive video contains twice the data of interlaced at the same frame rate.
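To put rough numbers on that, a back-of-the-envelope sketch (raw uncompressed 4:2:2 rates; the figures are illustrative only):

Code:
def raw_rate_mbps(width, lines, updates_per_sec, bits_per_pixel=16):
    # 4:2:2 video carries 16 bits per pixel (8 luma + 8 shared chroma).
    return width * lines * updates_per_sec * bits_per_pixel / 1e6

interlaced = raw_rate_mbps(720, 240, 59.94)   # 60 fields/s, 240 lines each
progressive = raw_rate_mbps(720, 480, 59.94)  # 60 full frames/s

print(f"480i59.94: ~{interlaced:.0f} Mbps raw")   # ~166 Mbps
print(f"480p59.94: ~{progressive:.0f} Mbps raw")  # ~331 Mbps: twice the data,
                                                  # i.e. half the channels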
-
Originally Posted by edDV
LCD/plasma monitors only display progressive.

Then that must mean they all come with some hardware deinterlacing solution integrated, or else they would not display an interlaced image at all; they'd just show a blank screen, I guess...

Originally Posted by edDV
Deinterlacing is a lossy process.

Why is that? Isn't it as simple as joining the two halves?

Originally Posted by edDV
TV broadcasts often get resized and/or recompressed several times, so on that count a commercial DVD has the advantage.

Yes, of course, but does that mean a DVD would beat the actual 720p or 1080p broadcast signal, or just an HDTV rip like the ones I'm used to viewing?

Originally Posted by edDV
Upward rescaling is just interpolation.

You mean because the multiplier to the screen's resolution is not an integer? So if the file were at exactly half the resolution, i.e. 840x525, would that make it much better?

Originally Posted by edDV
Progressive video contains twice the data of interlaced at the same frame rate.

You mean a progressive frame is a complete frame, whereas an interlaced frame is half of a whole one? So interlacing saves you bandwidth, is that what you're saying? -
Originally Posted by therock003
Why is that? Isn't it as simple as joining the two halves?

You can do it that way, but it will look bad. A progressive frame is designed to be shown at one point in time (e.g., one every 1/30th of a second). Interlaced fields are displayed at different times (one field every 1/60th of a second). So you need to do something to estimate what a single frame would be from the two fields. This inherently destroys information.
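A minimal sketch of the problem, assuming a white block that moves to the right between the two field scans: weaving the fields straight together ("just joining the two halves") produces the classic combing artifact.

Code:
import numpy as np

h, w = 8, 16
field_t0 = np.zeros((h // 2, w), dtype=np.uint8)  # even lines, scanned first
field_t1 = np.zeros((h // 2, w), dtype=np.uint8)  # odd lines, 1/60 s later
field_t0[:, 2:6] = 255                            # block position at t0
field_t1[:, 6:10] = 255                           # block has moved by t1

# Weave: even rows from one instant, odd rows from another.
frame = np.empty((h, w), dtype=np.uint8)
frame[0::2] = field_t0
frame[1::2] = field_t1

for row in frame:
    print("".join("#" if v else "." for v in row))
# Alternating rows show the block in two positions at once (combing).
# A real deinterlacer must blend or interpolate instead, and that
# estimation is where information gets destroyed.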
Originally Posted by therock003
Yes, of course, but does that mean a DVD would beat the actual 720p or 1080p broadcast signal?

I think edDV is comparing standard-def DVD to standard-def digital transmission, such as by satellite. Both are MPEG2, but the broadcast signal has to be compressed more so that enough channels can be squeezed into the frequency space available on the satellite.
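For a rough sense of the squeeze (the 9.8 Mbps figure is the DVD-Video spec ceiling; the transponder payload and channel count are assumed, typical-looking values, not from this thread):

Code:
dvd_video_ceiling_mbps = 9.8   # DVD-Video maximum for the video stream
transponder_mbps = 38.0        # assumed payload of one satellite transponder
channels_sharing_it = 8        # assumed number of SD channels multiplexed

per_channel = transponder_mbps / channels_sharing_it
print(f"Broadcast budget per channel: ~{per_channel:.1f} Mbps")
print(f"DVD video ceiling:            ~{dvd_video_ceiling_mbps} Mbps")
# Same codec (MPEG2) but roughly half the bits per channel, so the
# broadcast copy is compressed harder and the DVD usually looks better.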
Originally Posted by therock003
So if the file were at exactly half the resolution, i.e. 840x525, would that make it much better?

LCD TVs have a fixed resolution, unlike an analog CRT monitor. Moreover, most LCD TVs have really stupid native resolutions, which means almost every video format needs to be rescaled. You could do a simple integer-type rescale, but it would look horrible. Fancier rescaling is necessary.
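A minimal sketch of why the scale factor matters, using a made-up one-dimensional row of pixels: an exact 2x nearest-neighbor scale repeats every sample evenly, a non-integer 1.7x scale repeats them unevenly, and linear interpolation trades that unevenness for blur.

Code:
import numpy as np

line = np.array([0, 255, 0, 255, 0, 255], dtype=float)  # fine 1-pixel detail

def nearest(src, factor):
    idx = (np.arange(int(len(src) * factor)) / factor).astype(int)
    return src[idx]

def linear(src, factor):
    xs = np.arange(int(len(src) * factor)) / factor
    return np.interp(xs, np.arange(len(src)), src)

print(nearest(line, 2.0))           # every sample repeated exactly twice
print(nearest(line, 1.7))           # uneven repeats -> jagged, chunky detail
print(np.round(linear(line, 1.7)))  # smooth but full of in-between grays: blur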
Originally Posted by therock003
So interlacing saves you bandwidth, is that what you're saying?

That is exactly the reason interlacing was invented. It provides an update rate of ~60 per second (50 in Europe) with the compromise of reduced resolution. Full-resolution frames at ~30/25 per second would flicker too much. Cinema film projectors employ a somewhat similar trick to stop 24fps flicker: each frame is projected twice, giving an effective flash rate of 48 per second.
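The flicker arithmetic behind that, as a quick summary sketch (nominal rates):

Code:
rates = {
    "NTSC 59.94i (one field per update)":     59.94,
    "Hypothetical 29.97p broadcast":          29.97,  # would flicker on a CRT
    "PAL 50i":                                50.0,   # the flicker many notice
    "Cinema: 24 fps, each frame shown twice": 48.0,
}
for name, hz in rates.items():
    print(f"{name}: {hz:g} flashes per second")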