Actually, the Dell is fine. Always has been. Came out of the box fine. It is the Magnavox that is whacked. It is a really good analog NTSC monitor, but connected to the RGB output of my Nvidia card it has always been WAY too contrasty. I'm talking about losing the first 20-50 steps (guesstimated). And the contrast control on the Magnavox does not have anywhere near enough range to correct it. I have been using the controls in the Nvidia panel to make it look right (and they work), but they do not always load after a boot or a resume. And like you said, they just seem to be getting in the way. Yesterday I could not get Adobe Gamma (the only thing I have that can modify a profile) to work to save my life. But this evening I suddenly got it to work, and the profile it created is loading just fine with the Nvidia controls disabled. Interestingly, the gamma target in the Adobe Gamma window did not agree with the subjective results just looking at a photo on the monitor, so I went with my eyes and saved the profile and all was fine. Go figure. It isn't perfect but it is real close.
Again, the Dell is fine re: off-center picture. It is the Magnavox that gets it wrong most of the time. If I force an auto adjust by selecting TV and then the PC again, it always gets it right. But if I reboot, it almost always gets it wrong. The Magnavox has a couple of software quirks and this is only one of them. It is a shining example of the new product cycle philosophy.... design it, build it, ship it, and sell out before anyone is the wiser. By the time a customer identifies an issue, the production run is over and no one at Magnavox cares.
Someone (you?) mentioned a free calibration application somewhere above.... gonna look for it. I tried opening a calibration file in Notepad but it is not ASCII readable, so I can't easily modify it by hand to get the Magnavox calibrated the rest of the way.
Paul
-
ICM and ICC profiles are binary files. Even if you knew what numbers to enter, you'd just be guessing. The software gets those numbers from (a) the answers/selections you make running by-the-eye programs like Adobe Gamma, or (b) test patches analyzed with a color probe.
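(To illustrate why Notepad gets you nowhere: a profile starts with a fixed 128-byte binary header defined by the ICC spec. A minimal Python sketch of peeking at one -- field offsets are from the ICC.1 specification; the file path is whatever .icm/.icc file you have on hand:)

```python
import struct

def read_icc_header(path):
    """Read the fixed 128-byte header of an ICC/ICM profile.

    Per the ICC.1 spec: profile size is at byte 0 (big-endian uint32),
    device class at byte 12 (e.g. 'mntr' for monitors), data color
    space at byte 16 (e.g. 'RGB '), and the 'acsp' file signature
    at byte 36. Everything after the header is binary tag data.
    """
    with open(path, "rb") as f:
        header = f.read(128)
    if len(header) < 128 or header[36:40] != b"acsp":
        raise ValueError("not an ICC/ICM profile")
    size = struct.unpack(">I", header[0:4])[0]
    device_class = header[12:16].decode("ascii")
    color_space = header[16:20].decode("ascii")
    return {"size": size, "class": device_class, "space": color_space}
```

Everything meaningful (the tone curves, the matrix) lives in binary tags after that header, which is why hand-editing in a text editor isn't practical.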
The "free" program I mentioned is HCFR, but no calibration software can work without a color probe (colorimeter). HCFR is run from a PC; most people install it on a laptop, plug the colorimeter into the laptop, and take readings off the TV screen. You can't use HCFR to calibrate a PC monitor -- mm...wait a minute. You can use it on a PC monitor to see what your RGB controls are doing. However, because you have no direct control over the slope or gain of your RGB curve -- you can only choose "more" or "less" of a color across the line, not in specific ranges -- you can't replicate what XRite or Spyder software does. In any case, that won't help your Maggie. However, with a probe you could use HCFR to actually measure what the Maggie's doing.
This Magnavox, how old is it? It's a standard definition CRT, right?
What model Dell are you using?
I last used an nVidia graphics card in 1998 (!). I haven't bought a new graphics card since 2004 (ATI 9600XT AIW), but it does have 2 monitor outputs and s-video/composite (forget the latter outputs. Horrible). It works with 4:3 and 16:9 monitors and tv's. I hooked it up way back when. No problems, but never repeated the test.
Your graphics card likely works best when both connections have the same aspect ratio. If you have a 4:3 TV and a wide-screen PC monitor, that can trip up many cards. But, then, I can't say: I've never been a fan of nVidia, and my experience with them is ancient. BTW, you describe your TV as an "analog NTSC" monitor. NTSC can mean analog SD, analog HD (yes, there is such a thing), digital SD, or digital HD -- all of those technologies are defined by NTSC in North America.
You're trying to make a CRT look like an LCD, and vice versa. Can't be done. Many people still use Adobe Gamma (I used it for years, and it drove me nuts). Adobe Gamma was designed for CRT's. Its ultimate goal in Photoshop was to match color and gamma curves for CRT-to-printer, not to other displays (You'll note that the file ending on your Adobe profiles is ICM, not ICC). The two types of monitors are alike in many ways, but dissimilar in many others. You can run the nVidia with a profile designed for your CRT, but the same profile or overlay won't work for both machines; they are too dissimilar for things to work that way.
If your Magnavox is anything like the typical CRT of a few years back, it is oversaturated (usually with pushed reds, a favorite with Sharp TV and a staple with Mitsubishi) and has very high gamma. You might be unaware that Magnavox didn't "make" this TV, they only put their name on it. In fact Magnavox never made tv's; they were a furniture factory. Years ago a Maggie living room console was really a Zenith wearing a Magnavox name tag. Remember that many CRT's had no talent whatever for displaying blacks below RGB 15, and many cut out at RGB 30. An LCD can display RGB 0 to RGB 255, but many CRT TV's can't be pushed that far; it's not just their electron guns that won't do it, it's protective electronic circuitry that won't allow it.
It's extremely difficult to match opposing technologies. Not that it can't be done at all, but you'll have to compromise at many points.
-
You're probably using the correct adjustment control on the Maggie, but in case you aren't:
You adjust black levels with the "brightness" control, not the "contrast" control. Contrast refers to brights, brightness refers to darks. Yeah, I know. Confusing. Some monitors don't even use the terms brightness or contrast. They use "picture" or "intensity" or some such. Bring up the customize dialog in PowerDVD and play with their contrast and brightness sliders. You'll never figure out what they're doing.
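(A rough mental model of the classic CRT-style controls -- purely an illustration, since as noted above any given LCD may wire these controls to something else entirely: contrast acts like a gain on the signal and brightness like a flat offset.)

```python
def apply_picture_controls(level, contrast=1.0, brightness=0.0):
    """Toy model of CRT-style picture controls on an 8-bit level (0-255).

    contrast multiplies the signal, so it mostly stretches the brights;
    brightness adds a flat offset, so it lifts (or crushes) the blacks.
    Real monitors vary -- this is only the textbook behavior.
    """
    out = level * contrast + brightness
    return max(0, min(255, round(out)))

# Brightness +20 lifts black from 0 to 20 (and shifts everything up);
# contrast x1.2 stretches 200 to 240 but leaves black untouched at 0.
```

That's why, in the textbook model, brightness is the black-level control: a pure gain can never move a zero-level black, only an offset can.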
-
The Magnavox is a three-year-old LCD ATSC 16:9 (actually 1440 x 900) television that includes a PC input (standard VGA). By "analog" I refer to the analog composite NTSC video input. THAT looks terrific. But video on the VGA input is horribly contrasty, and the brightness and contrast controls do not have sufficient range to correct it. The OS (Win7) detects it as a "Generic PnP monitor". The Magnavox brightness and contrast controls seem to do roughly the same thing: each seems to affect the entire range, though if I watch the whites I can see that turning contrast up too far burns them out to white. But NOTHING affects the black level, which is WAY too low.
I don't recall the Dell model number... it is about two years old, 23" 16:9, monitor only, with HDMI input.
I had some success yesterday in getting Adobe Gamma to change the gamma for the Magnavox and it looked pretty good. Then I made the mistake of trying some downloaded calibration software and when it wrote the calibration file, it affected BOTH monitors. I think I recovered from that with a system restore. But afterward, I could not get Adobe Gamma to work. It would affect the correct monitor while the program was running but as soon as I saved the calibration, it reverted to the previous contrasty display. I did get the calibration feature in Win7 to work by simply watching a photograph as I moved the gamma slider... the gamma targets did not agree with the actual display.
Something is very wrong here and I have not yet figured out what. The Dell behaves as expected when calibrating but the Magnavox does not. It is as though there is another calibration somewhere that is in series with what I am doing.
The Nvidia card is a GT210, and I think it might be time to read up on it and see if there is anything that explains the huge difference between HDMI and VGA outputs.
Again, the issue seems to be that I can't find a way to affect the black level on the Magnavox plugged into the VGA output of the GT210. And only because the correction done in applications like Adobe Gamma doesn't "stick".
I can fix this in an instant by reverting to using the GT210 controls... that ALWAYS works, but I'm trying to simplify and the GT210 sometimes forgets to apply the corrections after boot.
Paul -
Watch out for system restore. The best way to refresh your color management profile is to change it in your desktop Settings menu under the advanced adapter/monitor tabs. You will likely have to reboot every time you do this. You should always be able to set the default sRGB profile or your monitor's OEM profile (Dell calls it a "driver"). Sometimes when you set sRGB as default, Windows gives an error message. Set it and reboot anyway; it's a bug in the Windows registry.
Thanks for the info on the monitors, I was way off track. Still, it's tough to make two different monitors look exactly alike, much less two different types.
Much also depends on what you're using to view your test patches. There are two applications that hook into your monitor profile: the Windows desktop, and Firefox. If you're using a picture editor of some kind to view patches for calibrating, note that most picture editing programs don't hook into your ICM or ICC profiles. You can show test patches on your desktop by making the background color a dark gray (or even black) and centering the test patch as unstretched, unmodified wallpaper on the desktop window.
PC graphics cards are notoriously iffy via a TV's PC or VGA input. It depends on the card. The gaming sites usually review cards that are optimized for TV display. Not every graphics card is tuned for TV. Many cards that advertise how well they work for TV are either lying or require special setup procedures. And then there's this: some TV's just aren't very talented at working with a PC.
-
(Ed.: Oops, I forgot. When you're using Adobe Gamma, Adobe places an icon in your Startup area that loads the profile you selected in Adobe Gamma's dialog. You can stop it from loading on startup by doing this:
Go to START -> PROGRAMS and look for the STARTUP section in the program list. Left-click that entry to get a popup menu, where you'll see Adobe Gamma listed. RIGHT-click on "Adobe Gamma", hold down the RIGHT button, drag the cursor and the icon to your desktop, then let go. This moves Adobe Gamma out of the startup area. The nominally safer way to do it is to go to START, then the "Run" command, type "msconfig" in the RUN input panel, and press ENTER. On the Startup tab of msconfig, find Adobe Gamma in the list and un-check the box to the left of that line.)
-
I did some reading yesterday and discovered that indeed Adobe Gamma was starting when I booted so I got rid of it. Unfortunately it did not address the very dark Magnavox monitor display. I have a couple other things to try but I was interested by your response above that TVs don't always work well as monitors.
But the at least temporary bottom line is that.... I gave up on using profiles and went back to using the Nvidia control panel, which allowed me to tune both monitors perfectly. That also means that all applications should display the same? I think the main issue is that on both monitors, contrast seems to work (but has limited range on the Magnavox) but brightness seems to only affect the backlight. It does nothing whatsoever to the black level... the point at which the pixels are OFF.
This has been a fascinating, frustrating discussion!
Paul -
There's no single answer for the way Brightness and Contrast controls work on LCDs. A while back I scoured the 'Net and forums for answers to this because of a problem I had trying to repair a customer's PC (it turned out to be a graphics card defect, not the monitor. But don't take that to mean your nVidia's at fault). Some maintain that an LCD's Brightness control is really a backlight control, which would affect blacks, whites, and gamma alike. But others maintain those are decoder controls and have no backlight effect. LCD's have Brightness and Contrast controls that mimic the same controls on CRT's, but CRT's had no backlights, so those controls manipulated voltage and decoder circuits. Every article I found concluded that different LCD's work Brightness and Contrast in different ways. If the backlight is adjustable, it's usually done with the Brightness control -- but not always (!!!).
My SONY 32" LCD TV has Brightness, Contrast, Backlight, and Gamma controls. My Samsung plasma has the same controls. It's anyone's guess just what those controls adjust, but I assume "Backlight" adjusts the backlight. That leads to the question: if Backlight adjusts the backlight, what do Brightness and Gamma adjust? Using a colorimeter and HCFR, I noted that Brightness adjustments affected the RGB curve by raising/lowering colors at the dark end on the SONY, but the same control had no RGB effects on the plasma.
I haven't tried that with my PC monitors. The LG PC monitor has Brightness, Contrast, and Gamma, but no backlight function. The HP monitor doesn't have a gamma control. Both monitors use the same e-IPS display panel, but one has an LED backlight and the other a CCFL (the HP's CCFL has more distinct low blacks, which is not unusual). It's my understanding that PC monitors that use IPS or e-IPS panels don't have adjustable backlights or edgelights, unless they're pro-level sets.
Dell made a number of 23" PC monitors. The newer Ultrasharps use LG-made e-IPS panels (earlier versions used 8-bit S-IPS panels and cost about $100 more); the non-Ultrasharps have Samsung TN panels. My LG LED monitor has visible but mild bleed in the lower corners; adjusting brightness has no effect on it, so I'd assume it verifies that the LED edgelights aren't adjustable. Dell Ultrasharps aren't yet made with LED's, but I'm distressed to hear that Dell plans to go that way soon. What a shame.
To say that LCD's are all-digital devices isn't true. The front display manipulates LCD pixels by applying voltage variations, which is an analog operation. LCD's can control every aspect of the image by manipulating LCD cells alone without affecting the backlight. LCD panels emit light waves, which are analog phenomena. A digital device (encoder/decoder, etc.) might manage things, but light waves themselves are analog. Your eyeballs and eardrums can't do anything with digits.
All that rigmarole aside, there's no firm answer to just what Brightness and Contrast are doing on any particular LCD monitor. I've seen monitors that don't show really dark blacks as distinct; turning up brightness made the blacks brighter, but didn't make them any more distinct. It's for sure that if backlights are adjusted, they affect the whole spectrum -- but that doesn't mean they always affect "definition". Consider this: if you use your nVidia image controls to brighten the picture, it's for sure it can't do it by adjusting the monitor's backlight. Only the monitor itself can do that.
-
I think this thread has veered off into a hardware topic. Anyone who wants to move it to another area is welcome to do so.
pgoelz: you're trying to do with eyeball and test patches what many users do with instruments and software. You can make definite improvements in your monitors using those methods. But it's really, really difficult. And some adjustments aren't possible with the controls at hand.
Below I've posted examples of what a colorimeter and a calibration package can accomplish with consumer-level gear. I didn't save test images of everything I did, but these are images from HCFR that I happened to keep.
Above: HCFR's chart of RGB grayscale measurements from an LG IPS226V PC monitor. This is the result of using the Display2 calibration package's first step -- manual adjustment with the monitor's RGB controls. The software shows you three columns of small panels and a white test patch. The idea is to match the red, green, and blue column panels vertically, exactly in the middle of the small window in the interface. You likely can't align the colored panels exactly, but you do the best you can.
With manual adjustments, the colors ain't quite even but they sure beat what they looked like out of the box (Red was lined up somewhat above green and blue). The darks are on the left, brights on the right. You can see at the right-hand side that bright whites looked a little yellow (red + green, with blue lowered). The overall color balance actually looked a little red; that's because red was oversaturated (too bright) out of the box. Once you complete this manual step, the software takes over and works with your graphics card to create an ICC profile.
The PURPLE line at the bottom shows the level of color errors. An error of 0 is at the bottom of the scale, 9 at the top. An error of 2 is seen by humans as a little off-color. By 4, you definitely know that the color you asked for isn't "correct". By 9, the color errors are painfully obvious.
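(Those error figures are delta-E style color differences. As a sketch of the idea -- assuming the classic CIE76 formula, which is just Euclidean distance in L*a*b* space; HCFR may use a newer variant:)

```python
import math

def delta_e_1976(lab1, lab2):
    """CIE76 color difference: Euclidean distance between two
    L*a*b* colors. Rough rule of thumb on this scale: ~1 is barely
    perceptible, 2-3 is noticeable side by side, high single digits
    are obviously wrong -- much like the 0-9 error chart described
    above.
    """
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Two grays differing by 2 in lightness only:
# delta_e_1976((50, 0, 0), (52, 0, 0)) -> 2.0
```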
Above: the left-hand chart is the gamma response. Right-hand chart is the luminance response. These charts are the results of manual adjustments made using lagom gamma and brightness/contrast images.
Gamma (left-hand chart): A scale of gamma values is on the left border. The line with WHITE dots and squares is the target gamma -- in this case, 2.2. Later, I changed that target to 1.8, as many photogs do, mainly to more closely match the way my TVs show blacks. The BLUE (CYAN) line is the average gamma that was achieved with these settings (2.4, which is too high and obscures the darkest blacks). The YELLOW curve is the combined gamma response of all three colors, Red, Green, and Blue. From the midtones upward, gamma is too high and throws those tones out of balance (brighter) relative to the darks. Bright detail tends to get burned out or appear over-emphasized.
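(For the curious, here's roughly how a package derives a gamma figure from a probe reading, assuming a simple power-law display, Y = V**gamma. This is a back-of-envelope sketch, not HCFR's actual code:)

```python
import math

def estimate_gamma(stimulus, luminance, black=0.0, white=1.0):
    """Estimate display gamma from one grayscale measurement.

    stimulus: input video level as a fraction (e.g. 0.5 for 50% gray).
    luminance: measured output; black/white are the measured endpoints
    (with a probe you'd normalize cd/m2 readings the same way).
    Solves Y = V**gamma for gamma.
    """
    y = (luminance - black) / (white - black)
    return math.log(y) / math.log(stimulus)

# A display with true 2.2 gamma outputs 0.5**2.2 of full luminance
# at 50% stimulus, so the estimate recovers 2.2.
```

A real package averages this over many grayscale steps, which is why the charts show a curve rather than a single number.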
Luminance (right-hand chart): Blue follows the ideal, but Green and Red don't. This can make it look as if you have a yellow (green+red) color cast, even if your RGB scale is balanced. The WHITE dots and squares are the ideal curve. YELLOW is the average achieved. Red, Green, and Blue are the actual RGB measurements. The idea is to get all three colors to follow the ideal (WHITE) luminance curve as closely as possible. Unless you have a pro-level monitor, you have no way of making those adjustments manually. The farther the RGB colors move away from the curve, the harder it is to adjust brightness/contrast for an equal visual balance between darks, midtones, brights, and individual colors.
Above: after XRite/Display2 runs its automated steps and creates an ICC profile, it has corrected much of the RGB misalignment. The left-hand edge does have a slightly elevated blue; later calibration runs corrected this, for a much flatter response in the darks (sorry, I didn't save the chart and it's too much trouble to make one right now). With the RGB controls you get in most PC monitors, it isn't possible to achieve this. You can, however, with the more complex RGB controls on a few HDTV's. You'll note that the color errors are 2 or lower (PURPLE line), except at the left, which was corrected later.
The calibration run also corrects gamma and luminance levels.
Gamma (left) shows the results for a target gamma of 1.8 (I set that target manually). WHITE is the target, BLUE (CYAN) is the average level achieved. The YELLOW is a composite of the individual gamma responses of Red Blue and Green. It isn't as level as I'd like, but it's close and pretty good for a consumer-level monitor.
Luminance (right): There are actually 5 lines in this curve, White (target ideal), Yellow (average), and Red Blue Green for individual color luminance. All 5 lines are on the curve, right on top of each other. Couldn't ask for more. The target luminance level I requested was 140 cd/m2, which is a little brighter than CRTs (100) or LCDs (120). Because video on a TV generally looks brighter than on a PC monitor, I requested a slightly brighter response. Out of the box, this monitor had a brightness of 210, which is way too bright.
Above: A CIE colorspace chart showing the saturation and color purity levels for the primaries (Red, Green, Blue, at the corners) and Yellow, Magenta, Cyan (along the edges). This CIE chart is for the sRGB colorspace. The dark gray-lined triangle is the target sRGB, the white-lined triangle is the response for this calibration. Most consumer monitors won't match sRGB exactly, but what you see here is quite satisfactory. Red and Blue are a tad undersaturated, Green is a bit over. I've seen results from two cheap Acer monitors (but Acer makes better ones) that didn't even come close to this pattern.
Trying to get these results manually, with limited image controls, is a hard row to hoe.
-
Oh, I'm convinced the problem is NOT the graphics card. It seems to lie squarely with the monitor's lack of black level control. Neither monitor seems to be able to affect black level with its built in controls. The Nvidia controls can, and very easily.
All that rigmarole aside, there's no firm answer to just what Brightness and Contrast are doing on any particular LCD. I've seen monitors that don't show really dark blacks as distinct; turning up brightness made the blacks brighter, but didn't make them any more distinct. It's for sure that if backlights are adjusted, they affect the whole spectrum -- but that doesn't mean they always affect "definition". Consider this: if you use your nVidia image controls to brighten the picture, it's an internal adjustment in the card itself. It's for sure it can't do it by adjusting the monitor's backlight. Only the monitor itself can do that.
I'd suggest that if you want more certainty, you need a colorimeter and matching software.
I'd agree on the colorimeter and probe but I am not convinced that the Magnavox will obey a color profile. Gotta get that sorted first.
But for now they both look fine and very close to each other. Back to capturing videos. This has been a fascinating side trip!
Oh, and the Dell is an ST2410 (not the ultrasharp version). It is connected to the GT210 via HDMI. Interestingly, on the banding test pattern, it shows very faint fine vertical bars. The Magnavox shows more pronounced vertical bars with subtle color shading. A slightly smaller Dell at work on a Dell PC using the internal graphics adapter and XP looks perfect. Go figure.
Paul -
PC calibration packages won't work with TVs, but the probe can be used with other software (like the free HCFR) for that. Unfortunately, many TV's don't have enough controls, but some do (LG, Samsung, Pioneer). Some TV's have decent RGB controls (SONY, Panasonic, some Toshibas). I see that Vizio has a basic RGB setup.
If the banding appears on both monitors connected to this card, but not on monitors connected to different cards, it's the card. The Dell ST2410 uses a 24" TN-TFT panel made by either LG or Samsung.
There's no substitute for a decent probe and software. You wouldn't believe the time and angst that mine has saved since I finally bit the bullet and bought it.
Meanwhile, I have a huge video project as well, and it's yelling at me right now.