VideoHelp Forum
  1. Are there any advantages to using an analog TV/monitor compared to the newer, more technologically advanced digital monitors?

    I'm not sure, but it seems to me that they're better suited for viewing lower-resolution videos (the analog ones, I mean).

    Every TV we own in our household is analog except for my computer's 22" LCD-TFT and a couple of laptops with their LCD screens, and from my observation, media with resolutions close to VGA seems to play back better on those analog sets.

    I don't know why this is, maybe they use better filters or the whole scanning method just works better, but when I watch that kind of media on my 22" LCD-TFT I notice the noise and the results of the upscaling.

    I'd really like to hear what you have to say about all of this.
  2. True interlaced display, true black (not a dark gray) and a much, much larger contrast ratio. Also, the gamma response of TVs is different from that of PC monitors. The latter are comparatively more washed out and less vibrant.

    I only use a CRT display to monitor my video. I wish there were affordable HD CRT displays so that I could watch my HDV (interlaced) as it is supposed to be seen, without scaling etc.

    Having said all that(!), the reason to use a CRT video monitor (as opposed to a CRT computer monitor) is so you know what your video will look like on a regular TV. If you intend to watch your video only on a PC display, you should monitor it on a PC display. Professional monitoring still uses CRT (even for HD). There are broadcast quality LCD/plasma monitors but they are hellishly expensive.

    Using a CRT TV, though, is not the same as using a proper CRT video monitor. Consumer TVs are not designed for accuracy but rather affordability.
    John Miller
  3. Member (Northern Pacific SW)
    I've got two Sony HD CRTs and a Sony HD LCD TV. They're consumer grade but relatively high end, which means they have some features/components similar to professional-quality monitors; at least, that's my understanding.

    I usually play the footage for any new project on all of them and always play the finished project so I can get a good idea of what is going out the door.

    LCDs are unforgiving and harsh. A pixel's a pixel, and it had better be done right or you'll see it.

    CRTs have a knack for making things look very good.

    Most of my audience will be watching on LCD TVs, so I need to make my stuff look good for LCDs.

    My HD CRTs put me in a small minority, because most people have HD LCDs or SD CRTs, but it doesn't stop me from loving the way things look on them.

    Plus I second everything JohnnyMalaria wrote.
  4. CRT > LCD forever.

    LCD will die and something will replace it before it ever gets as good as a CRT. Been waiting 6 years for ANY LCD to prove me wrong.
  5. Member edDV (Northern California, USA)
    Ditto JohnnyMalaria, but I'd go further and say 480i and 576i almost always look better on a good CRT than on an LCD or plasma.

    The problem for CRT is dealing with higher resolutions and screen sizes. CRTs maxed out around 34" and 1440x1080 resolution, but those became very heavy and impractical. For SD and ED, a direct-view CRT gives the best black level and best contrast. LCD needs to play backlight tricks to come close. Plasma has good blacks but less contrast.
  6. Guys, thanks for verifying that my instincts were right. I thought I was going crazy for thinking that a 20-year-old cheap CRT TV made SD-resolution media look better than my LG LCD-TFT PC monitor.

    But I guess that's the way it is. Also, I completely agree with the notion that LCDs are unforgiving and a pixel is a pixel.

    Could anyone explain in a little more detail why this happens? Why do CRTs seem to mask existing noise, while on LCD monitors you notice it, especially if you're standing too close to the screen?
  7. Member edDV (Northern California, USA)
    Originally Posted by therock003
    Guys, thanks for verifying that my instincts were right. I thought I was going crazy for thinking that a 20-year-old cheap CRT TV made SD-resolution media look better than my LG LCD-TFT PC monitor.
    An LCD-TFT computer monitor will look worse than an LCD TV, so that's an unfair comparison in the other direction. Computer LCD monitors are optimized for progressive RGB rather than interlaced YUV, have more linear gamma and have very poor black levels compared to an LCD TV, so at least compare a CRT to an LCD TV. A computer monitor makes a poor TV.
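    To make the black-level point concrete: video luma is stored on a 16-235 "studio" scale, while a PC display expects 0-255. Here is a minimal Python/NumPy sketch of the expansion step a good software player performs (an illustration only, not any particular player's code); skip it and video black is shown as RGB 16, the dark-gray, washed-out look mentioned earlier.
    Code:
    # Expand 8-bit studio-range luma (16-235) to full PC range (0-255).
    # Illustration only: real players also handle chroma, gamma and clipping.
    import numpy as np

    def studio_to_full(y_plane):
        y = y_plane.astype(np.float32)
        full = (y - 16.0) * 255.0 / (235.0 - 16.0)
        return np.clip(full, 0.0, 255.0).astype(np.uint8)

    luma = np.array([16, 64, 128, 200, 235], dtype=np.uint8)  # sample video levels
    print(studio_to_full(luma))  # 16 -> 0 (true black), 235 -> 255 (peak white)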
  8. dLee - what HD CRTs are they? I really don't want to get an HD LCD TV when my 12-year-old Trinitron finally croaks (I've resuscitated it a couple of times)...
  9. Originally Posted by edDV
    Originally Posted by therock003
    Guys, thanks for verifying that my instincts were right. I thought I was going crazy for thinking that a 20-year-old cheap CRT TV made SD-resolution media look better than my LG LCD-TFT PC monitor.
    An LCD-TFT computer monitor will look worse than an LCD TV, so that's an unfair comparison in the other direction. Computer LCD monitors are optimized for progressive RGB rather than interlaced YUV, have more linear gamma and have very poor black levels compared to an LCD TV, so at least compare a CRT to an LCD TV. A computer monitor makes a poor TV.
    So basically they're only good for running applications and playing games? They're not optimized at all for media playback?
  10. Originally Posted by therock003
    So basically they're only good for running applications and playing games? They're not optimized at all for media playback?
    Pretty much.
    John Miller
  11. Member (Northern Pacific SW)
    Sony WEGA KV-36XBR800: 36" 4:3, with DVI, component, S-Video and composite inputs. As edDV mentioned, it's a beast. The box said it weighed 270 lbs, but the manual says 240 lbs. It takes two guys in relatively good shape to lift it.

    Sony Trinitron WEGA KV-30HS510: 30" 16:9, with DVI, component, S-Video and composite inputs. When I bought it I thought this one was top of the line, but it's not. My Dish HD programming looks great; sports in HD are the best I've seen, with no motion blur, smearing or whatever the correct term is. That might have something to do with the (relatively) small screen size or the difference between the two display technologies.

    My son got the top of the line version (XBR, I think) of the 510 in 34". He's got a good LCD TV, but he's recently fallen in love with the CRT again after taking it out of storage.

    I think edDV looked them up once and thought that they would max out at 720 (I'm fairly sure they both do i and p). They both correctly display a 1080 feed from a Dish box via an HDMI-to-DVI adapter.
  12. Member edDV (Northern California, USA)
    Originally Posted by therock003
    Originally Posted by edDV
    Originally Posted by therock003
    Guys, thanks for verifying that my instincts were right. I thought I was going crazy for thinking that a 20-year-old cheap CRT TV made SD-resolution media look better than my LG LCD-TFT PC monitor.
    An LCD-TFT computer monitor will look worse than an LCD TV, so that's an unfair comparison in the other direction. Computer LCD monitors are optimized for progressive RGB rather than interlaced YUV, have more linear gamma and have very poor black levels compared to an LCD TV, so at least compare a CRT to an LCD TV. A computer monitor makes a poor TV.
    So basically they're only good for running applications and playing games? They're not optimized at all for media playback?
    With a good software player (e.g. PowerDVD) you get good results from DVDs or computer media files, but live TV displayed on a computer monitor is less optimal than a pass-through (HDMI) to an LCD or plasma HDTV, which has better image-processing hardware. Even if the display card can match the TV's processing hardware, the computer monitor still lacks black-level processing and dynamic contrast.
  13. Originally Posted by therock003
    So basically they're only good for running applications and playing games? They're not optimized at all for media playback?
    And even then, you will find many who say LCDs are not good for games, at least the ones where reaction time counts, like FPS-type games, because of input lag.

    Also, I think it's important to say that these are broad generalizations being discussed here. Certain models are pure crap; certain ones are very high quality. For example, you cannot put S-IPS, PVA and TN-type panels in the same class; they have distinct characteristics. You can't compare a $5000 Eizo to a $500 Hazro.
  14. I think edDV looked them up once and thought that they would max out at 720 (I'm fairly sure they both do i and p). They both correctly display a 1080 feed from a Dish box via an HDMI-to-DVI adapter.
    OK, now I'm a little confused. Aren't all CRTs supposed to be analog, and thus only handle interlaced pictures and analog inputs? How can you have a DVI input and progressive scan on an analog CRT, unless it's somehow digital as well?

    With a good software player (e.g. PowerDVD) you get good results from DVDs or computer media files, but live TV displayed on a computer monitor is less optimal than a pass-through (HDMI) to an LCD or plasma HDTV, which has better image-processing hardware. Even if the display card can match the TV's processing hardware, the computer monitor still lacks black-level processing and dynamic contrast.
    I'm not interested at all in TV reception on a PC monitor, only in media files stored on the drive or on optical media.

    I've used lots of players. Media Player Classic and Zoom Player are my favorites, and my opinions are based on the results from those two. But if it is a monitor limitation, then I guess you'll never be satisfied no matter how well written the software is.

    And even then, you will find many who say LCDs are not good for games, at least the ones where reaction time counts, like FPS-type games, because of input lag.

    Also, I think it's important to say that these are broad generalizations being discussed here. Certain models are pure crap; certain ones are very high quality. For example, you cannot put S-IPS, PVA and TN-type panels in the same class; they have distinct characteristics. You can't compare a $5000 Eizo to a $500 Hazro.
    Right, I've always wondered why Eizos are so darn expensive. You can get a 22" LG or Samsung for 200-300 euros, but Eizos cost 500-700 for the same screen size, and it makes me wonder why; the posted specs seem to be the same, so what gives? Hazro, on the other hand, I don't know at all...

    We also have a Sony WEGA 52" (I don't know the exact model). It's very old, but it was top of the line when my father bought it some years back. It's a rear-projection type, though, and I find that it's terribly bright, and tweaking it to darken the picture gives strange results. Brightness/contrast issues aside, I find that it does a good job of processing the signal and displaying it on such a large screen.
  15. Interlaced vs. progressive has nothing to do with analog vs. digital. Traditional PC monitors (VGA) are provided with analog signals from the graphics card but can display interlaced and non-interlaced signals.

    On my nVidia card, I connect the secondary output to a video monitor via S-video. The card's output is set to 720 x 480i. Playing DV files on the secondary display gives a picture on the video monitor as good as if the camcorder was connected directly.
    John Miller
  16. Member edDV (Northern California, USA)
    Originally Posted by therock003
    I think edDV looked them up once and thought that they would max out at 720 (I'm fairly sure they both do i and p). They both correctly display a 1080 feed from a Dish box via an HDMI-to-DVI adapter.
    OK, now I'm a little confused. Aren't all CRTs supposed to be analog, and thus only handle interlaced pictures and analog inputs? How can you have a DVI input and progressive scan on an analog CRT, unless it's somehow digital as well?
    It depends on the analog TV. The classic analog CRT TV (e.g. the Sony XBR of the '90s) optimized early for NTSC and PAL with comb filters, then added S-Video and better sound for LaserDisc. Pro monitors were doing more, but I'll skip that. The main influence on CRT resolution came from the Trinitron/shadow-mask medical and computer monitor business.

    When DVD emerged, a higher-than-broadcast-quality source was available. Interlaced 480i/576i required the same display solution as LaserDisc, but DVD added progressive scan and analog component outputs to the consumer market. Progressive 24p DVDs (movies only, back then) could be output from players at 59.94 frames per second (NTSC land) or 50 frames per second (PAL land), so monitors adapted with progressive analog component inputs at the high end. CRT dot pitch was reduced, with more or less direct analog display of the analog component 720x480p/59.94 input, and the CRT's horizontal scan was switched for 4:3 or stretched 16:9 display. This was 1998 or so at the high end.

    Next, HDTV pushed the limits for higher CRT resolution. Dot pitch was made finer, but the action was behind the shadow mask/aperture grille. Beam scan was increased vertically to 1080i lines (540 per field), and HD progressive scan up to 720p was reached on a few higher-end CRT models. Horizontal resolution was limited by the display: 720 (~5.75 MHz lines of resolution) was the max required for DVD, and a horizontally stretched 800-960 (~8 MHz) was good enough for 720p broadcast.

    On the interlaced side, 1440x1080i (540 lines per field) was the top resolution, but for 34-36" displays that was more than enough. It was black level, contrast and analog frequency rolloff that gave CRT the picture-quality edge.

    Later, digital processors were added to force analog and digital inputs into a digital chipset, causing deinterlacing and scan conversion to a 960x540p or 640x480p framebuffer that fed the CRT. That is when CRT TV sets began to show 8-bit digital artifacts, especially in 480i-to-480p processing mode. Those early processor chipsets had poor performance; CRT displays still looked good with DVD 480i/576i and "1080i" scalings. Some CRT TV sets kept an analog 480p path for DVD.
  17. Member edDV (Northern California, USA)
    With a good software player (e.g. PowerDVD) you get good results from DVDs or computer media files, but live TV displayed on a computer monitor is less optimal than a pass-through (HDMI) to an LCD or plasma HDTV, which has better image-processing hardware. Even if the display card can match the TV's processing hardware, the computer monitor still lacks black-level processing and dynamic contrast.

    I'm not interested at all in TV reception on a PC monitor, only in media files stored on the drive or on optical media.

    I've used lots of players. Media Player Classic and Zoom Player are my favorites, and my opinions are based on the results from those two. But if it is a monitor limitation, then I guess you'll never be satisfied no matter how well written the software is.
    Many variables.

    Source: interlace vs. progressive
    Player: deinterlace or realtime IVTC
    Display card: Dumb RAMDAC or good hardware assist
    Monitor: limits quality but is small.


    PS: Oops, I fixed an HTML error above.
  18. Originally Posted by JohnnyMalaria
    Interlaced vs. progressive has nothing to do with analog vs. digital. Traditional PC monitors (VGA) are provided with analog signals from the graphics card but can display interlaced and non-interlaced signals.
    Wait, so an analog monitor can display progressive scan? Interesting. Then why is interlacing still used? I've always wondered what the use of an interlaced signal is, since it's undoubtedly degraded compared to progressive (at least that's what everyone keeps saying). Is it so that people with old TV sets can still receive the signal broadcast over the air? That seriously bugs me; can someone please explain what's good about interlacing that it's still around?

    Many variables.

    Source: interlace vs. progressive
    Player: deinterlace or realtime IVTC
    Display card: Dumb RAMDAC or good hardware assist
    Monitor: limits quality but is small.
    Thanks for the historical timeline; I will get back to that with some questions later. For the moment, let's address the variables. My sources are always progressive, so there are no deinterlacing issues; they're mostly HDTV rips with resolutions around 624x352, and my monitor is set to 1680x1050.

    My card is an ATI HD3870, which I guess doesn't make for a poor RAMDAC (or does it?).

    limits quality but is small
    What do you mean by that?

    Are there any other such variables for consideration?
  19. Member edDV (Northern California, USA)
    Originally Posted by therock003
    Originally Posted by JohnnyMalaria
    Interlaced vs. progressive has nothing to do with analog vs. digital. Traditional PC monitors (VGA) are provided with analog signals from the graphics card but can display interlaced and non-interlaced signals.
    Wait, so an analog monitor can display progressive scan? Interesting. Then why is interlacing still used? I've always wondered what the use of an interlaced signal is, since it's undoubtedly degraded compared to progressive (at least that's what everyone keeps saying). Is it so that people with old TV sets can still receive the signal broadcast over the air? That seriously bugs me; can someone please explain what's good about interlacing that it's still around?
    Some analog monitors can display progressive scan, but not your average TV set.

    Examples:
    VGA computer monitors
    Medical or scientific monitors (e.g. X-rays, oil exploration)
    "HD Ready" CRT TV (native display of DVD 720x480p/59.94 fps video)


    Many variables.

    Source: interlace vs. progressive
    Player: deinterlace or realtime IVTC
    Display card: Dumb RAMDAC or good hardware assist
    Monitor: limits quality but is small.
    Originally Posted by therock003
    Thanks for the historical timeline; I will get back to that with some questions later. For the moment, let's address the variables. My sources are always progressive, so there are no deinterlacing issues; they're mostly HDTV rips with resolutions around 624x352, and my monitor is set to 1680x1050.

    My card is an ATI HD3870, which I guess doesn't make for a poor RAMDAC (or does it?).

    Are there any other such variables for consideration?
    "HDTV-rips with resolutions 624x352 approx." are well outside the conventional and are extremely low resolution maybe from HD source but all the HD has been squeezed out. For comparison, SD DVD 16:9 is stored 720x480 (852x480 expanded) or 720x576 (1024x576 expanded).

    An ATI HD3870 notwithstanding, expansion from the storage resolution is always lower quality than display at the native resolution or downsizing from a higher resolution.
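    To see what that interpolation looks like, here is a minimal Pillow sketch (frame_624x352.png is just a hypothetical frame grab): blow a 624x352 frame up to fit a 1680x1050 desktop and put it next to a native HD grab of the same scene; no detail comes back, the scaler only invents in-between pixels.
    Code:
    # Upscale a low-resolution frame grab to desktop size with Pillow.
    # "frame_624x352.png" is a hypothetical screenshot of one decoded frame.
    from PIL import Image

    frame = Image.open("frame_624x352.png")              # 624x352 source
    upscaled = frame.resize((1680, 945), Image.LANCZOS)   # 16:9 fit for a 1680x1050 desktop
    upscaled.save("frame_upscaled.png")
    # Viewed beside a native 1280x720 grab of the same scene, the upscale is
    # visibly soft: interpolation can only spread the detail that is already there.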
  20. Interlacing receives an unjustified bad press. If analog TV broadcasts were 29.97p instead of 59.94i, there would be protests against the flickering. Interlacing came about in the 1930s as a way to make the image look fluid while still fitting within the available transmission bandwidth. It works well with CRTs due to the persistence of the phosphor dots.

    I prefer 59.94i to 29.97p, and when I travel to Europe the TVs seem to flicker very noticeably (50i). But I'm very sensitive to flickering, e.g. LED tail lights on cars that have been done on the cheap; I can even tell you the duty cycle.
    John Miller
  21. Member edDV (Northern California, USA)
    Yes, it takes me weeks to adjust to Euro PAL. Flicker is worst in peripheral vision.

    This used to be a point of controversy with our UK office until I hired an engineer from there to work in California. After a few months here, he went back to the UK and couldn't stand the flicker.


    BTW interlace gets you twice the channels in the same bandwidth. Would you give up half your 29.97i channels for progressive? Or keep the same number of channels with jumpy 29.97p (no telecine solution)?
  22. I'd give up all my channels for just one worth watching
  23. Member olyteddy (United States)
    CRTs are still very much used in broadcast. Just look at the wall of control-room monitors in any newscast...
  24. Some analog monitors can display progressive scan, but not your average TV set.
    And I suppose the opposite holds true: displaying interlaced content on a digital monitor. But is it the same having a file stored with interlace information as having an interlaced feed passed through cables, or is there a difference?



    "HDTV-rips with resolutions 624x352 approx." are well outside the conventional and are extremely low resolution maybe from HD source but all the HD has been squeezed out. For comparison, SD DVD 16:9 is stored 720x480 (852x480 expanded) or 720x576 (1024x576 expanded).

    An ATI HD3870 nonwithstanding, expansion from storage resolution is always lower quality vs. display from native resolution or downsized from higher resolution.
    Wait, though: isn't an HDTV source better than a DVD source? Because HDTV rips displayed on my screen always seem to look worse than DVD rips of the same resolution. Is the HDTV content broadcast on channels worse than the DVDs that come out? (I'm talking about TV shows, where DVD rips seem to look better than HDTV rips, though of course not better than the actual 720p or 1080p content they were ripped from.)

    or downsizing from a higher resolution.
    You mean like HD material downsized for display on SD-resolution CRTs or other non-HD monitors?

    BTW interlace gets you twice the channels in the same bandwidth.
    What do you mean by channels, exactly?
  25. Originally Posted by edDV
    BTW interlace gets you twice the channels in the same bandwidth. Would you give up half your 29.97i channels for progressive?
    Definitely, if I could pick the channels.
  26. Member (Northern Pacific SW)
    Originally Posted by JohnnyMalaria
    I'd give up all my channels for just one worth watching
    TCM?


    Has anyone found a marketplace for HD CRTs?

    I've given up on eBay for most things, and the HD CRT is not easily shipped.
  27. Actually, TCM is probably top of the list!
  28. Member edDV (Northern California, USA)
    Originally Posted by therock003
    Some analog monitors can display progressive scan, but not your average TV set.
    And I suppose the opposite holds true: displaying interlaced content on a digital monitor. But is it the same having a file stored with interlace information as having an interlaced feed passed through cables, or is there a difference?
    LCD/plasma monitors only display progressive. Incoming interlace material must be deinterlaced or inverse telecined (for film material). Deinterlace is a lossy process.


    Originally Posted by therock003
    "HDTV-rips with resolutions 624x352 approx." are well outside the conventional and are extremely low resolution maybe from HD source but all the HD has been squeezed out. For comparison, SD DVD 16:9 is stored 720x480 (852x480 expanded) or 720x576 (1024x576 expanded).

    An ATI HD3870 nonwithstanding, expansion from storage resolution is always lower quality vs. display from native resolution or downsized from higher resolution.
    Wait though, isnt HDTV source better than DVD source? Cause it seems that hdtv-rips displayed on my screen look always worse than DVD-Rips of same resolution. Is the HDTV content broadcasted on channels worse than the DVD coming out (I'm talking in termsof TV Shows, which seems that DVD-Rips look better than HDTV-rips, but of course not better than the actual 720p or 1080p content they were actually ripped from)?
    Most of the time a higher quality source will compress better. This would include resolution, signal to noise and the number of times the source has been recompressed. TV broadcasts often get resized and/or recompressed several times so for that factor, a commercial DVD has the advantage.
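    The recompression point is easy to demonstrate yourself. A rough sketch, assuming ffmpeg is on your PATH and using a hypothetical clip_in.mp4: re-encode the same clip a few generations in a row, the way a broadcast chain does, and compare the first and last outputs.
    Code:
    # Simulate generation loss: re-encode the same clip several times in a row.
    # Requires ffmpeg on the PATH; "clip_in.mp4" is a hypothetical source clip.
    import subprocess

    src = "clip_in.mp4"
    for generation in range(1, 6):
        dst = f"clip_gen{generation}.mp4"
        subprocess.run(
            ["ffmpeg", "-y", "-i", src,
             "-c:v", "libx264", "-crf", "23", "-an", dst],
            check=True,
        )
        src = dst  # each pass encodes the previous pass's artifacts too
    # Compare clip_gen1.mp4 with clip_gen5.mp4 (by eye or with a PSNR/SSIM tool):
    # the once-compressed copy wins, which is the DVD-vs-rebroadcast situation.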


    Originally Posted by therock003
    or downsizing from a higher resolution.
    You mean like HD material downsized for display on SD-resolution CRTs or other non-HD monitors?
    I was saying the ATI HD3870 can't expand 624x352 for an acceptable SD or HD display. Upward rescale is just an interpolation.


    Originally Posted by therock003
    BTW interlace gets you twice the channels in the same bandwidth.
    What do you mean by channels, exactly?
    I meant broadcast channels but this applies to any form of storage or transmission. Progressive video contains twice the data vs interlace at the same frame rate.
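    The arithmetic behind "twice the data", as a quick sketch (uncompressed 8-bit 4:2:2 numbers, just for illustration):
    Code:
    # Raw data rate: 59.94 full progressive frames vs. 59.94 half-height fields.
    width, height, rate = 720, 480, 59.94   # NTSC-style raster
    bits_per_pixel = 16                     # 8-bit 4:2:2 YUV

    progressive = width * height * rate * bits_per_pixel          # 480p59.94
    interlaced = width * (height // 2) * rate * bits_per_pixel    # 480i59.94 (fields)

    print(f"480p59.94 ~ {progressive / 1e6:.0f} Mbit/s uncompressed")
    print(f"480i59.94 ~ {interlaced / 1e6:.0f} Mbit/s uncompressed")
    # ~331 vs ~166 Mbit/s: same motion rate, half the raw data, hence
    # twice the channels in the same bandwidth.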
  29. Originally Posted by edDV
    LCD/plasma monitors only display progressive. Incoming interlace material must be deinterlaced or inverse telecined (for film material).
    Then that must mean that they all come with some hardware deinterlacing solution integrated, or else they wouldn't display an interlaced image at all; they'd just show a blank screen, I guess...

    Deinterlace is a lossy process.
    Why is that? Isn't it as simple as joining the two halves?


    Most of the time a higher quality source will compress better. This would include resolution, signal to noise and the number of times the source has been recompressed. TV broadcasts often get resized and/or recompressed several times so for that factor, a commercial DVD has the advantage.
    Yes, of course, but does that mean a DVD would beat the actual broadcast 720p or 1080p signal, or just an HDTV rip like the ones I'm accustomed to viewing?

    I was saying the ATI HD3870 can't expand 624x352 for an acceptable SD or HD display. Upward rescale is just an interpolation.
    You mean because the multiplier to the screen's resolution is not an integer? So if the file were at exactly half the resolution, i.e. 840x525, would that make it much better?


    I meant broadcast channels but this applies to any form of storage or transmission. Progressive video contains twice the data vs interlace at the same frame rate.
    You mean because a progressive frame is a complete frame, while an interlaced frame is only half of one? So interlacing saves you bandwidth; is that what you're saying?
  30. Originally Posted by therock003
    Originally Posted by edDV
    LCD/plasma monitors only display progressive. Incoming interlace material must be deinterlaced or inverse telecined (for film material).
    Then that must mean that they all come with some hardware deinterlacing solution integrated, or else they wouldn't display an interlaced image at all; they'd just show a blank screen, I guess...

    Deinterlace is a lossy process.
    Why is that? Isn't it as simple as joining the two halves?
    You can do it that way, but it will look bad. A progressive frame is designed to be shown at one point in time (e.g., one every 1/30th of a second). Interlaced fields are displayed at different times (one field every 1/60th of a second). So you need to do something to estimate what a single frame would be from the two fields, and that inherently destroys information.
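    A toy NumPy sketch of why "just joining the two halves" fails: the fields were captured 1/60th of a second apart, so weaving them straight together combs on anything that moved.
    Code:
    # Weave two fields that were captured at different times and watch the comb.
    import numpy as np

    h, w = 8, 8
    frame_t0 = np.zeros((h, w), dtype=np.uint8)
    frame_t1 = np.zeros((h, w), dtype=np.uint8)
    frame_t0[:, 2] = 255   # a white vertical bar at time t0...
    frame_t1[:, 5] = 255   # ...that has moved right by the next field time

    top_field = frame_t0[0::2, :]      # even lines, sampled at t0
    bottom_field = frame_t1[1::2, :]   # odd lines, sampled 1/60 s later

    weave = np.zeros((h, w), dtype=np.uint8)
    weave[0::2, :] = top_field
    weave[1::2, :] = bottom_field
    print(weave)   # the bar shows up in two places at once: comb artifacts
    # A real deinterlacer has to estimate one instant from the two fields,
    # and that estimate throws information away.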

    Most of the time a higher quality source will compress better. This would include resolution, signal to noise and the number of times the source has been recompressed. TV broadcasts often get resized and/or recompressed several times so for that factor, a commercial DVD has the advantage.
    Yes, of course, but does that mean a DVD would beat the actual broadcast 720p or 1080p signal, or just an HDTV rip like the ones I'm accustomed to viewing?
    I think edDV is comparing standard-definition DVD to standard-definition digital transmission, such as by satellite. Both are MPEG-2, but the broadcast signal has to be compressed more so that enough channels can be squeezed into the frequency space available to the satellite.

    I was saying the ATI HD3870 can't expand 624x352 for an acceptable SD or HD display. Upward rescale is just an interpolation.
    You mean because the multiplier to the screen's resolution is not an integer? So if the file were at exactly half the resolution, i.e. 840x525, would that make it much better?
    LCD TVs have a fixed resolution - unlike an analog CRT monitor. Moreover, most LCD TVs have really stupid resolutions that mean almost every video format needs to be rescaled. You could do a simple integer-type rescale but it would look horrible. More fancy rescaling is necessary.

    I meant broadcast channels but this applies to any form of storage or transmission. Progressive video contains twice the data vs interlace at the same frame rate.
    You mean because a progressive frame is a complete frame, while an interlaced frame is only half of one? So interlacing saves you bandwidth; is that what you're saying?
    That is exactly the reason interlacing was invented. It provides an update rate of ~60 per second (50 in Europe) with the compromise of reduced resolution. Full-resolution frames at ~30/25 per second have too much flicker. Cinema film projectors employ a somewhat similar trick to stop 24 fps flicker: each frame is projected twice to give an effective flash rate of 48 per second.
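    The flicker arithmetic, spelled out as a quick sketch (the ~50 updates/s threshold is a rough rule of thumb, not a hard spec):
    Code:
    # What the eye sees is the light-update rate, not the picture-content rate.
    updates_per_second = {
        "NTSC interlaced (59.94i)": 59.94,        # one field sweep per update
        "Hypothetical 29.97p broadcast": 29.97,
        "PAL interlaced (50i)": 50.0,
        "Cinema 24 fps, double-shuttered": 24 * 2,  # each frame flashed twice
    }
    for name, rate in updates_per_second.items():
        print(f"{name:33s} {rate:6.2f} updates/s")
    # Rates much below roughly 50 updates/s are where flicker starts to bother
    # most viewers on a bright display, which is why 29.97p was never attractive.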
    John Miller


