I am really confused about which monitor inputs are preferable. Every monitor seems to come with a VGA input, but some monitors also come with DVI-D, some come with HDMI, and some come with both. I have read that DVI-D and HDMI carry the same video signal (http://hdmivsdvi.com/). I have also read that you cannot go from a PC's DVI-D output to a monitor's HDMI input even with an adapter (http://www.newegg.com/Product/Product.aspx?Item=N82E16812270286). I have also read that you can. Which is it? I have also read that the VGA connection supports really high resolutions and is not an issue at all!
Basically, I have an Acer 23" monitor with two HDMI and one D-Sub (VGA) inputs. I just bought a DVI-to-HDMI adapter at Newegg to run from my Nvidia Quadro FX4600's DVI port to the Acer's HDMI port. I hope it works. I assume it will give me a better connection than running the Quadro to the Acer's VGA input, but I am not sure that it will. I also need to buy a second monitor. Should I buy one with HDMI or DVI-D if I cannot get a monitor with both?
Thread: Monitor Inputs - DVI-D vs HDMI
Last edited by videobread; 15th Apr 2012 at 11:18.
Generally, HDMI is for televisions and DVI is for computers. You can use a DVI-to-HDMI adapter to go from a DVI-D or DVI-I port to an HDMI port (you will get video only, no audio). A possible exception is software that requires HDCP (Blu-ray players, for example) when the DVI port doesn't support HDCP. Some HDMI inputs may not accept as wide a variety of frame sizes and refresh rates.
Strange, I'm seeing more monitors with DisplayPort instead of HDMI. Most monitors these days should have at least one DVI input, though, and DVI is still pretty much a standard output on video cards. From what I've read, it's best to find a video card with DisplayPort outputs, since DP can be converted into HDMI, DVI, or VGA with the right adapter, so it offers the best flexibility of display options.
I have never really understood what people are paying the big bucks for with monitors. I need to buy another and would like to upgrade to something nicer. What specs are important other than getting all the input options (VGA, DVI, and HDMI)? I just don't get why people are paying over $160 for a 23". What am I missing?
I do recommend LED-backlit displays for a number of reasons. They use a lot less power than CCFLs, which also means less heat. LEDs also come on almost instantly and have quite a large range of brightness (which, depending on the LCD panel, helps increase gamut and get better color representation). Now that their prices have come down to about even with their CCFL predecessors, there's not much reason to go with the older CCFL-backlit LCDs unless you're only interested in the immediate cost (as the CCFL will cost you more in energy over its lifetime).
I just picked up three Dell U2412Ms for home, as I often have multiple windows open for design/layout work and don't always like having to alt-tab between windows just to read something. I also thought it would be fun to try gaming at 5760x1200. There aren't many 16:10 LED displays on the market, and those that exist are expensive, but these Dells can be found for right around $300. That extra 10% of screen space is useful, and the added pixel density over a 16:9 24" is better for closer viewing. The next step up is 2560x1600 resolution, but those are 27" and larger and more than double the cost of my 24" monitors, so that's about where I draw the line. I'd rather have the spanned displays than one large one for the price.
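A quick back-of-the-envelope check of those 16:10 vs 16:9 numbers (a 24" diagonal is assumed for both panels; the function is just for illustration):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# 24" 16:10 (1920x1200, e.g. a U2412M-class panel) vs 24" 16:9 (1920x1080)
print(round(ppi(1920, 1200, 24), 1))  # ~94.3 PPI
print(round(ppi(1920, 1080, 24), 1))  # ~91.8 PPI
print(round(1200 / 1080, 2))          # ~1.11, i.e. ~11% more pixels
```

So the "extra 10% of screen space" claim checks out: same width, 11% more pixels, and a slightly higher pixel density.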
Short summary of differences:
*HDMI is primarily for interfacing with CONSUMER ELECTRONICS (e.g. TVs, Blu-ray players, receivers, etc.). It is a digital signal, has VERY HIGH data rates, supports HDCP (content protection/encryption) and audio, and can also support other features such as Deep Color, 3D, Ethernet, etc.
*DVI is primarily for interfacing with computer monitors. It can be analog (DVI-A), digital (DVI-D), or both (DVI-I), but is mainly digital these days. It can have quite high data rates (though possibly not as high as HDMI or DisplayPort), and can OPTIONALLY/OCCASIONALLY support HDCP. It doesn't support audio or those other features, and 3D is only supported in "2D-compatible" modes.
*DisplayPort is a competitor to DVI (and somewhat also to HDMI) for interfacing with computer monitors. It is a digital signal and can have VERY HIGH data rates. It also supports HDCP and audio.
*VGA is for interfacing with computer monitors & some TVs. It is an analog signal, with medium/high bandwidth, and doesn't support any of those other features.
**Digital vs. analog: if you go analog, you lose the highest levels of resolution & sharpness/clarity/quality and run the risk of certain kinds of artifacts (moire patterns, etc.). This is particularly true with direct-pixel-mapped monitors (LCDs).
Your monitor manual should list the supported resolutions for the various ports. Computer monitors generally lack support for interlaced video or frame rates other than 59.94/60 Hz.
DVI backward compatibility is mandatory for HDMI - there is no difference between HDMI and DVI video except for a few, sometimes very important, details.
HDMI may support fewer of the video modes that are typical on PCs; of the VGA-era modes, only 640x480p60 is mandatory.
HDMI does not guarantee exact one-to-one pixel representation on the display; it may assume incoming video can be slightly cropped and enlarged (content near the borders can be lost). In other words, overscan-free support for PC graphics is NOT mandatory for HDMI.
HDMI may prefer YCbCr mode instead of RGB.
HDMI may prefer the 16-235 level range instead of 0-255.
Some aspects of HDMI are optional, which explains why some TVs behave differently when PC video arrives on an HDMI input versus a DVI/VGA input (assuming the TV has both HDMI and a separate DVI-D or VGA interface). Usually this comes down not to the hardware or the interface but to the manufacturer's implementation.
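That level-range point is the usual cause of "washed out" or crushed PC pictures over HDMI. A minimal Python sketch of the two mappings (function names are mine, not from any library):

```python
def full_to_limited(v):
    """Map a full-range (0-255) value into video/limited range (16-235)."""
    return round(16 + v * 219 / 255)

def limited_to_full(v):
    """Expand a limited-range (16-235) value back to 0-255, clamped."""
    return max(0, min(255, round((v - 16) * 255 / 219)))

print(full_to_limited(0), full_to_limited(255))    # 16 235
print(limited_to_full(16), limited_to_full(235))   # 0 255
```

If one side sends limited range and the other side expects full range (so no expansion happens), black is displayed as level 16 and white as 235: grayish blacks and dim whites. The opposite mismatch crushes shadows and clips highlights.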
I would go with displayport.
Q. Is DisplayPort Better For Home Theatre Installations?
A. DisplayPort is ideal for in-room and room-to-room home theatre installations due to its native support for fiber optic cables, support for standard cables up to 15 meters in length, and its Blu-ray Disc-ready support for surround sound audio, HDCP, and bi-directional control. DisplayPort also features optional latches on cables to prevent plugs from being pulled out of home theatre equipment unintentionally.
Originally Posted by jagabo
The OP has an Acer computer monitor with HDMI in and a display card with DVI-D out. DisplayPort isn't an option in this case.
A typical computer monitor implementation of HDMI is essentially identical to DVI-D in features. That means RGB, progressive video only, and usually one-to-one pixel mapping.
An LCD TV can add many HDMI features, including interlaced and YCbCr video support, bi-directional audio/control, Ethernet, 3D, etc. A downside is that LCD TV HDMI inputs usually default to video overscan, but they may have a one-to-one pixel mapping mode in settings.
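As a rough illustration of what that overscan default costs, assuming a typical symmetric crop of about 5% per axis (the exact amount varies by TV; the function is just a sketch):

```python
def overscan_visible(width, height, pct=5.0):
    """Visible resolution after a symmetric overscan crop of pct% per axis."""
    f = 1 - pct / 100
    return round(width * f), round(height * f)

# A 1920x1080 desktop with 5% overscan: edges (taskbar, menu bars) are cut off,
# and the remaining image is scaled back up, softening everything.
print(overscan_visible(1920, 1080, 5))  # (1824, 1026)
```

That scaling step is why text looks blurry on a TV until you find the "Just Scan" / "1:1" / "Full Pixel" setting.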