Hey guys,
I have the ATI AIW 9800 Pro and I need to get a DVI cable to hook it up to my TV. I see that there's DVI-D, DVI-M1 and a few more, so which one do I need? DVI-D?
I have this TV here, http://www.ncix.com/products/index.php?sku=21146
I'd suggest VGA at 1366x768, the native resolution of the LCD panel.
A DVI-D connection @ 1280x720p would require rescaling at the TV to 1366x768 and would show overscan.
For PC purposes it's a monitor, not a TV. If you use DVI-D your PC should detect it and set its resolution to 1366x768. If there are drivers made available for that monitor from Acer then you may want to install those before attaching the LCD. If you're playing SDTV, like from Media Center or something, then your PC will scale the video to fit the TV's resolution (while maintaining the source aspect ratio).
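To illustrate that aspect-ratio-preserving fit, here's a rough Python sketch of my own (not from this thread; the 4:3 source aspect is just an example):

Code:
# Fit a 4:3 SD picture into a 1366x768 panel while keeping its aspect ratio.
src_aspect = 4 / 3            # display aspect ratio of the SD source
panel_w, panel_h = 1366, 768  # native resolution of the LCD panel

# Scale the picture to the full panel height, then check how wide it gets.
fit_w = round(panel_h * src_aspect)   # 1024
print(fit_w, panel_h)                 # 1024 x 768 picture area
print(panel_w - fit_w)                # ~342 columns left over as black bars

Run it and you can see why a 4:3 source ends up pillarboxed rather than stretched.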
DVI cables can be expensive when you're purchasing one from a home theater place; try to shop for one at a computer outlet.
Originally Posted by rallynavvie
Nothing on the Acer site about drivers. No manual either. The driver would be ATI Catalyst for the 9800Pro.
TV sets don't usually have manufacturer-supplied drivers. The HDMI port is standardized for 480i, 576p, 720p and 1080i (sometimes also 1080p, but not likely for that set). The TV will accept one of those and scale or deinterlace as necessary to the native 1366x768. 1280x720p is the closest resolution, so the TV would then need to scale 1280x720 to 1366x768.
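To put rough numbers on that rescale (a quick back-of-the-envelope Python sketch of my own, not anything from the TV's manual):

Code:
# Scale factors the TV would apply going from a 720p input to the panel.
src_w, src_h = 1280, 720      # 720p signal over DVI/HDMI
panel_w, panel_h = 1366, 768  # native WXGA panel

print(panel_w / src_w)  # ~1.067 horizontal scale factor
print(panel_h / src_h)  # ~1.067 vertical scale factor
# Neither factor is an integer, so every output pixel has to be
# interpolated from neighbouring source pixels instead of mapped 1:1.

That non-integer interpolation is where the softness comes from.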
VGA could be output directly at 1366x768 via the 9800Pro RAMDAC without rescaling in the TV. The quality should be better, there would be no TV-generated overscan, and the cable would be cheaper, but it should be kept reasonably short. In this case deinterlacing would be performed in the 9800Pro or the software video player.
The 9800Pro doesn't support 1080i over DVI. The only way to get 1080i is via the analog component adapter. 1080i isn't necessary and actually would be an inferior connection for that set since it would force an unnecessary destructive deinterlace and downscale.
The HDMI ports are better used for a TV tuner or DVD player.
Originally Posted by rallynavvie
So edDV,
You're saying I should use VGA instead of DVI? People told me that DVI would give better resolution, but I suppose if it has to overscan whereas VGA outputs directly at the max resolution, then you have a point.
I have a question though: my card has this connector,
whereas my TV has this one,
would the one my TV has work if I connect it straight to the ATI card, or would I need one end of the cable to be one type and the other end the other type?
Also, one more thing: keep in mind that the ATI card has a native DVI output, so in order to get VGA (which is what I'm using right now) I have to use the adapter that came with the card. Would that lower the resolution since it's already going from DVI to VGA? Should I just connect it to the VGA port of my TV? Since I've got both ports it might just save the headache.
Hmm, I just noticed that my video card came with an adapter that goes from S-Video to HD component; ah, I'll probably save the HD component for something that's actually HD.
The Acer TV spec said you have two HDMI connectors and no DVI-D. Is this true?
The connector on your 9800Pro is called DVI-I. It has pins for both DVI-D and VGA. Most ATI 9xxx cards come with a DVI-I to VGA adapter which taps the VGA pins. Then you use a standard VGA cable. If this cable needs to be >10ft, use a quality cable.
If your TV has a DVI-D connector and you have the manual, tell us what the manual says about accepted resolutions. Most LCD TVs of that level want digital video (480i, 480p, 720p, 1080i) on the DVI port and computer VESA resolutions (including 1366x768) on the VGA port.
Quality is best if you can feed that TV at its native resolution* without all the conversion.
YPbPr will not improve quality vs VGA. It is another alternative but the TV will probably overscan that input.
* This assumes of course that 1366x768 is an accepted resolution on the VGA port. Check the manual.
Why does my friend's Samsung LCD TV support 1366x768 via DVI then? It was plug-and-play via the DVI cable and it automatically set the resolution to 1366x768. Does it have something to do with HDCP denying any signal that isn't an HD standard?
The DVI to VGA adapter that came with your video card should work just fine; you're not really getting any signal loss from it, though as edDV said you will want to keep your VGA cable shorter than 2m.
Originally Posted by rallynavvie
The DVI/HDMI port will always overscan ~5-10%. So even if it connects at 1366x768, you are not getting a 1:1 match with the 1366x768 display. It would be upscaling to something 5-10% larger than 1366x768 and thus blurring the cropped LCD display at 1366x768. This is why they suggest computers be connected via VGA.
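Just to show what that means in pixels (my own arithmetic in Python; the 5% overscan figure is an assumption, real sets vary):

Code:
# Effect of ~5% overscan on a 1366x768 signal fed to a 1366x768 panel.
native_w, native_h = 1366, 768
overscan = 0.05               # assumed 5%; sets vary roughly 2-10%

# The TV blows the picture up by the overscan factor...
scaled_w = round(native_w * (1 + overscan))   # ~1434
scaled_h = round(native_h * (1 + overscan))   # ~806
# ...then crops it back to the panel size, throwing away the edges.
print(scaled_w - native_w)    # ~68 columns lost off the sides
print(scaled_h - native_h)    # ~38 rows lost off the top and bottom

So even at a "matching" resolution the desktop edges get cropped and everything else gets softened by the rescale.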
VGA can go out to 25 ft or more if quality cables (with extra shielding) are used. I'm talking about the $20-40 cables, not the extreme stuff. 1366x768 is WXGA.
http://en.wikipedia.org/wiki/Wide_XGA
http://www.av-cables.net/VGA-component-video/vga-cables-2.html
PS: Keep in mind that this discussion relates to LCD TVs and some projectors. They have constrained input resolutions and don't "multisync". They also overscan. They are different from computer monitors, which accept square-pixel VESA resolutions on DVI-D.
Oops, sorry, I forgot about the A in my TV's model; it's actually the 3720A. I forgot that mattered, but when you brought up the 2 HDMI ports I knew we were talking about 2 different TVs. My TV has 1 HDMI, 1 VGA, 1 DVI, 2 HD component and 2 regular component inputs, plus S-Video. It only lists 1 resolution in the whole manual and that's 1366x768, although there are modes to make it 4:3, 16:9, Panorama or Letterbox. So you're saying that even though I'm going from the original source, which is DVI, through the VGA adapter, it won't have much effect on the quality?
I'm not sure where I want to place the computer yet, but I'm planning to buy long cables just in case I want it far away so it's not in the way when I look at the TV.
I was looking at something like this
http://cgi.ebay.com/10FT-SVGA-Super-VGA-M-M-Monitor-Cable-w-AUDIO-10_W0QQitemZ22010070...QQcmdZViewItem
It's an SVGA cable... would it work for my VGA connection?
Originally Posted by StoneColdWhat
VGA outputs from a Frame Buffer + RAMDAC that can exactly match the WXGA 1366x768 (16:9) screen size of your TV. Most LCD-TV sets don't overscan on the VGA "computer" input. The other output standards would all need additional scaling at the TV and they would overscan since this is a TV.
http://en.wikipedia.org/wiki/Overscan
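For what it's worth, 1366x768 is only approximately 16:9; a quick Python check of my own (simple arithmetic, not from the post above):

Code:
# Compare the WXGA panel aspect to true 16:9.
w, h = 1366, 768
print(w / h)    # ~1.7786
print(16 / 9)   # ~1.7778
# Close enough that the set treats it as 16:9. Driven at exactly
# 1366x768 over VGA there is no scaling step, so each pixel from the
# frame buffer lands on exactly one panel pixel.

That 1:1 mapping is what the RAMDAC route gets you.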
Originally Posted by StoneColdWhat
http://www.av-cables.net/VGA-component-video/vga-cables-16752.html
Are all VGA cables about the same unless they're double shielded? Also, I was just looking at the connectors on the adapter and on the TV; they both have the holes for the screws on the side, so I take it that I need the male connectors, right?