Hello friends, I am seriously confused and hope someone can assist. I used a Sony 32" 720p HDTV as my computer monitor, and videos played smoothly: YouTube, VLC, and Debut Video Capture all worked well, capturing at 29 fps.
Same computer, same video card, but I changed monitors to an AOC 24" 1080p, and now videos play poorly at 1080p: jerky, freezing, super pixelated, especially 1080p files, when they play at all; 1080p movies won't play in VLC, it just freezes up. So I reduced the resolution to 1280x720; still the same thing. I updated the graphics driver; same thing. Debut Video Capture now only captures at 1 to 2 fps, way down from 29 fps.
Put the older Sony 32" TV back on the computer, and it works fine again. Anyone have any idea why an older TV (VGA) works better than a new 1080p monitor (HDMI)? The 1080p quality is awesome for desktop, browsing, etc., but I have all kinds of issues with videos.
My video card is an NVIDIA GeForce 210.
Is it possible my video card cannot display 1080p properly, even though 1080p is listed in the NVIDIA Control Panel? Thanx for the assistance.
Thread: Baffling monitor quality
Your video card was released in 2009 and should be able to display video on a 1920x1080 display without producing the symptoms you describe. The problem may lie with your settings in the affected software, or with your CPU.
Because VLC uses software decoding, playback will stutter and freeze when the CPU isn't up to the task of decoding the video you are playing. Have you turned on "Accelerated video output (overlay)" on VLC's "Video Settings" page, which allows VLC to use GPU hardware acceleration to assist the CPU with decoding? (To find out, click on "Tools" on the menu bar, then click "Preferences" and select "Video".)
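Separate from the overlay output setting, recent VLC builds also expose a GPU hardware-decoding option (under Tools > Preferences > Input & Codecs). As a sketch, and assuming a VLC version new enough to have the `avcodec-hw` option (older builds used a "Use GPU accelerated decoding" checkbox instead), it can also be set in VLC's `vlcrc` config file:

```ini
# In vlcrc (%APPDATA%\vlc\vlcrc on Windows, ~/.config/vlc/vlcrc elsewhere):
# let VLC pick whatever hardware decoder is available (DXVA2 on Windows);
# set it to "none" to force pure software decoding for comparison.
avcodec-hw=any
```

The per-launch equivalent is `vlc --avcodec-hw=any video.mp4`; if playback smooths out with it on and stutters with `--avcodec-hw=none`, the bottleneck is CPU decoding.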
The fact that Debut Video Capture doesn't work properly could also mean the CPU is being used to decode your video instead of the video card's hardware decoders. Debut Video Capture supports GPU hardware acceleration, but it may be necessary to turn it off in order for its screen-capture functions to successfully record a video overlay. Is hardware acceleration turned on?
first i'd try re-installing the latest nvidia driver with the monitor attached. make sure the resolution gets set to the monitor default. if that doesn't help the video card is kinda old and very low performance.--
"a lot of people are better dead" - prisoner KSC2-303
you're correct - yours is about a 150 on the g3d scale and his about 180. iirc the 210 has no fan and might overheat if pushed. he could try vacuuming the heatsink.--
"a lot of people are better dead" - prisoner KSC2-303
Thanx for the helpful answers/information. OK, now with the 1080p monitor running solo, I reduced it to 1280x720, reinstalled the drivers, and it works fine! The card seems to have difficulty with dual monitors. Previously I also had an HP 25" on the same card, via the DVI output. The card has difficulty with dual monitors both set to 1080p. Yes, VLC accelerated video is on.
I don't think the card can handle dual 1080p over HDMI/DVI output; it worked fine with VGA/DVI output on the Sony TV and the 25" HP at 1280x720. All the problems begin when I try to run dual 1080p.
My CPU is an AMD dual-core at 2.7 GHz.
Just checked System Information under Conflicts/Sharing. I don't know if it has anything to do with my issues, but here's the info:
Memory Address 0xC0000000-0xDDFFFFFF PCI standard PCI to PCI bridge
Memory Address 0xC0000000-0xDDFFFFFF NVIDIA GeForce 210
Memory Address 0xA0000-0xBFFFF PCI bus
Memory Address 0xA0000-0xBFFFF PCI standard PCI to PCI bridge
Memory Address 0xA0000-0xBFFFF NVIDIA GeForce 210
I/O Port 0x000003B0-0x000003BB PCI standard PCI to PCI bridge
I/O Port 0x000003B0-0x000003BB NVIDIA GeForce 210
Graphics cards with only three video connections (DVI, HDMI, and VGA) usually don't work well with the two digital connections in use at the same time because, as you suspected, the digital connections share resources. Mine also requires that one of the two monitors in a dual-monitor configuration use VGA, or it won't work correctly.
Last edited by usually_quiet; 15th Mar 2013 at 22:15.
If you enable it, the GPU should basically do all the decoding rather than just assist the CPU, depending on the video type and what your player and video card support.
I think you'd find that if your video card's outputs looked like this, you'd be able to use both HDMI and DVI at the same time, or HDMI and VGA with the use of a DVI-to-VGA adaptor.
Thanx once again to all for the helpful info.
Hello_Hello, I believe we may have found the problem. It's a cable issue. Earlier this evening I tried to watch a vid on the 1080p HDMI monitor and had issues; I moved the player to the 1080p DVI monitor and it worked. So on Sunday I'll try switching the HDMI cable to VGA and see what happens. Amazing how much of a difference a cable makes. Sometimes we believe better-quality products are better. Guess my system is not ready for HDMI video display.
I've not used a video card with a combination of HDMI, DVI and VGA, so given it was mentioned earlier that the digital connections share resources, I assumed that was correct. Maybe they don't, though, as long as you only use two outputs at a time. My video card's older than yours and it has DVI, VGA and TV outs (one of the round DIN-plug type connections for TV out), but I can't say I've ever used the TV out at all, so I haven't tried running all three at the same time.
Nvidia lists the specs for your card as being the same as mine (8600GT) in terms of resolution, so 1080p should be a walk in the park.
Maximum Digital Resolution 2560x1600, Maximum VGA Resolution 2048x1536
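To put those numbers in perspective, here's a quick back-of-the-envelope comparison (a sketch only; 60 Hz refresh is assumed and blanking intervals are ignored) of the scan-out load for dual 1080p versus the card's stated maximum digital resolution:

```python
# Rough pixel-throughput comparison: dual 1080p60 versus the card's
# stated maximum digital resolution (2560x1600), assumed at 60 Hz.

def pixels_per_second(width, height, refresh_hz=60):
    """Pixels the GPU must scan out each second for one display."""
    return width * height * refresh_hz

dual_1080p = 2 * pixels_per_second(1920, 1080)   # two 1080p monitors
max_single = pixels_per_second(2560, 1600)       # one max-resolution monitor

print(dual_1080p)               # 248832000 pixels/s
print(max_single)               # 245760000 pixels/s
print(dual_1080p / max_single)  # 1.0125, essentially the same load
```

Each output is driven independently, so this only compares total scan-out load, but it suggests dual 1080p isn't beyond the card's display limits and points back at decoding, drivers, or cabling instead.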
If you do happen to work out whether it's the cable, and whether you can or can't use HDMI and DVI at the same time, could you report back? I'm kind of curious, because Nvidia doesn't mention any such restriction, so I'm starting to suspect maybe you should be able to.
HDMI cables basically either work or they don't; there's generally no in-between. I've lost track, but I assume you've confirmed the "problem" monitor works fine when connected at 1080p on its own (single-monitor mode)? Or have you tried swapping the cables around and using DVI instead of HDMI? Although I'd guess you might need an HDMI-to-DVI cable for that, depending on the types of connectors the monitors have. I'm just wondering, if it turns out not to be the cable, could it be the monitor itself? Does Windows display okay, or does it not display correctly at 1080p either?
Did you sort out the hardware decoding thing? If you happen to be using XP, MPC-HC supports DXVA/hardware decoding when running on it. Not that I think it has anything to do with the problem....
Hello friend, I believe I had a faulty cable, or the card couldn't handle HDMI in a dual setup. I bought a matching monitor (on a VGA cable, the other on DVI), and all is well now. But I do notice a slight difference in color between the two identical dual monitors, even with both monitors on equal settings. I was going to upgrade to a newer card, but my system doesn't have a 500 W power supply.