One of my monitors is a Dell U2711 (2560x1440 resolution).
It runs fine with a DVI cable, but I can't get it to work with an HDMI cable. Would upgrading my GPU solve this? (My current GPU is 14 years old!)
You've mentioned your monitor, which has HDMI 1.3, but identifying your graphics card - beyond stating its age - would also be useful.
By the way, have you checked whether your HDMI cable is OK?
Also, make sure the HDMI port on your video card is actually working.
Sorry, I should have said this. It's a Sapphire ATI Radeon HD 4670 (750 MHz).
I bought the cable back in 2010 and it worked fine with a 1080p TV. But I don't currently own any other HDMI devices so it's not easy for me to check.
Thanks, how might I do that?
Your monitor's product page on Dell's website says it can accept up to 12-bit color input. (If that is true, the monitor has to reduce that 12-bit input for display, because it uses an 8-bit + FRC panel - a cheaper approximation of a true 10-bit panel.)
The monitor has an HDMI 1.3 input port, and so does the ATI Radeon HD 4670. HDMI 1.3 has a maximum video data rate of about 8.16 Gbit/s (a 340 MHz TMDS clock) and can carry 2560 x 1440 at 60Hz with standard 24-bit color; 36 bits per pixel (12-bit per channel) at that resolution would exceed the limit. So, an HD 4670 should be good enough for use with your monitor at normal color depth.
Maybe the HDMI cable is an older design that is fine for full HD but can't handle something that is closer to the maximum data rate supported by HDMI 1.3.
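The bandwidth reasoning above can be checked with a little arithmetic. This is a rough sketch only: it assumes CVT reduced-blanking timings (roughly a 241.5 MHz pixel clock for 2560x1440 at 60Hz); the monitor's actual EDID timings may differ slightly.

```python
# Rough HDMI 1.3 feasibility check for 2560x1440 @ 60 Hz.
# Assumption: CVT reduced-blanking pixel clock of ~241.5 MHz.
# HDMI 1.3 single-link limit: 340 MHz TMDS clock (~10.2 Gbit/s total,
# ~8.16 Gbit/s of video data after 8b/10b encoding).

TMDS_CLOCK_LIMIT_MHZ = 340.0  # HDMI 1.3 maximum

def tmds_clock_mhz(pixel_clock_mhz, bits_per_channel):
    """TMDS clock scales with color depth: 8 bits/channel runs at 1x
    the pixel clock, 12 bits/channel at 1.5x."""
    return pixel_clock_mhz * bits_per_channel / 8.0

pixel_clock = 241.5  # MHz, 2560x1440@60 with reduced blanking (assumed)

for bpc in (8, 10, 12):
    clk = tmds_clock_mhz(pixel_clock, bpc)
    verdict = "OK" if clk <= TMDS_CLOCK_LIMIT_MHZ else "exceeds HDMI 1.3 limit"
    print(f"{bpc}-bit/channel: {clk:.1f} MHz TMDS clock -> {verdict}")
```

Running this shows 8- and 10-bit color fit within the HDMI 1.3 clock limit at 1440p60, while 12-bit does not - which is why the card should still drive the monitor fine at normal (24-bit) color depth.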
Lol, you remind me of myself.
I only use a VGA output for my primary monitor, and my HDMI cable is just lying around somewhere.
My primary card is an HD 5450 from 2011, as far as I remember, and it's still pretty good for W10. If you want high-resolution video output, switch to HDMI directly.
Identify your GPU with GPU-Z or CPU-Z, run driver updates, and you're good to go.
The newest displays may no longer work with outdated GPUs.