OK, I have a system with a display adapter built into the motherboard. I have since installed an ATI All-In-Wonder Radeon 8500, mainly for capturing video. I would like to run a piece of software that cannot function with this card (according to its troubleshooting guide, the card is too advanced for the software to work with).
Now, going into my Device Manager, I have the option of disabling my video card, but no other display adapters are listed. My question is: what will happen if I disable my ATI card? I assume it's functioning as the primary display adapter, so will turning it off make the system fall back to the original onboard adapter? Or will the computer think there's nothing there and be unable to give me a display?
(Sorry if this is an idiot question; I just don't know.)
Thanks.
I would have thought that if you delete it, reboot, and then don't re-install the drivers, you'll end up with the default drivers.
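If you want to double-check which display adapters Windows actually detects after the reboot, here's a quick sketch that just shells out to wmic (this assumes a Windows machine with Python installed; wmic ships with XP and later):

    import subprocess

    # Ask WMI for every video controller Windows currently detects,
    # along with its status. Both the onboard adapter and the ATI card
    # should appear here if Windows can see them.
    result = subprocess.run(
        ["wmic", "path", "win32_videocontroller", "get", "name,status"],
        capture_output=True,
        text=True,
    )
    print(result.stdout)

If the onboard adapter doesn't show up in that list, Windows can't see it at the moment, which is worth checking before you disable the ATI card.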