I hope this is the correct category - there are so many!
My Samsung monitor (which I use as a TV) was driven by a Win7 laptop via the blue RGB monitor cable (resolution 1360x768).
Then I swapped the laptop out and installed an Asus Veriton N260g - a much tidier and more elegant solution, I thought.
Aha, the Veriton has HDMI, says I - that will mean a better picture and even fewer wires dangling out of the back, etc. etc.
I duly bought an HDMI cable, but it sent the resolution haywire and the picture was much blockier. The 1360x768 resolution was no longer available.
It set itself to another resolution, saying this was the native resolution of the monitor. Any other resolution option I tried resulted in a small desktop picture in the middle of a large screen, none of which was satisfactory.
When I was forced to revert to the original RGB cable, everything went back to normal without any input from me. The native (recommended) resolution was back at 1360x768.
Just one of the many things that puzzles me is: how can a monitor be perceived as having different 'native' resolutions?
Surely HDMI should be a better method than the old-school RGB cable?
Can anyone shed light / offer a solution / advice?
However, only resolutions supported by both the GPU and the display will be available to select as output. It is possible that the Veriton N260g doesn't support 1360x768 output via HDMI, or that your monitor supports 1360x768 input via VGA but not via HDMI. Check the monitor's manual to see which resolutions are supported via HDMI.
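The "both ends must agree" point can be sketched in a few lines of Python. The mode lists here are hypothetical, purely for illustration - a real driver builds them from the GPU's capabilities and the display's EDID:

```python
# Hypothetical mode lists (illustrative only, not read from real hardware).
gpu_modes = {(1920, 1080), (1360, 768), (1280, 720), (1024, 768)}
# Modes the monitor might advertise over each input:
vga_modes  = {(1360, 768), (1280, 720), (1024, 768)}
hdmi_modes = {(1920, 1080), (1280, 720), (1024, 768)}  # note: no 1360x768

def selectable(gpu, display):
    """Only modes supported by BOTH ends can be offered for selection."""
    return sorted(gpu & display, reverse=True)

print(selectable(gpu_modes, vga_modes))   # 1360x768 appears over VGA
print(selectable(gpu_modes, hdmi_modes))  # ...but not over HDMI
```

If the HDMI mode list from the monitor omits 1360x768, the driver simply cannot offer it, no matter what the panel's physical resolution is.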
I did some research on the "Asus" Veriton N260g. First, it turns out that "Veriton" is an Acer product line, not an Asus product line. The Veriton N260g was released in 2009 and has an Intel Atom N280 / 1.66 GHz CPU with graphics provided by an Intel GN40 chipset.
The GN40 chipset seems inadequate for playing back some types of video that are common in 2019. For example, apparently it is OK for decoding 720p video but not great at decoding 1080p Blu-ray. Maybe you need something a little newer to use as an HTPC.
For what it is worth, I only had AMD's 785G integrated graphics chipset or Intel HD Graphics 4200 and their standard driver interface plus Windows 7 to work with and didn't need to do anything unusual to get my TV and the GPU on either PC working together at that resolution.
You can add a custom resolution - you may be able to rely on the driver's capabilities - but without knowing what the EDID provided by the HDMI sink looks like, you can't do much.
I agree - the driver can limit HDMI, and the sink's EDID can limit HDMI. HDMI provides three different ways to select the resolution in use.
Thanks for input folks.
I did a quick search on 'Video ID Code' & 'Detailed Timing Descriptor', but this all seems too technical. It would take weeks just to understand the backstory on the jargon used in the articles I found.
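For what it's worth, the "Detailed Timing Descriptor" jargon boils down to an 18-byte record inside the monitor's EDID that spells out a mode the display accepts. A minimal Python sketch of pulling the resolution out of one such descriptor - the byte layout follows the VESA EDID format, and the example descriptor bytes are hand-built for illustration, not read from a real monitor:

```python
def parse_dtd(dtd: bytes):
    """Extract pixel clock and active resolution from an 18-byte
    EDID Detailed Timing Descriptor (layout per the VESA EDID spec)."""
    if len(dtd) != 18 or (dtd[0] == 0 and dtd[1] == 0):
        raise ValueError("not a detailed timing descriptor")
    # Bytes 0-1: pixel clock in units of 10 kHz, little-endian.
    pixel_clock_hz = int.from_bytes(dtd[0:2], "little") * 10_000
    # Active pixels are split: lower 8 bits in one byte, upper 4 in a nibble.
    h_active = ((dtd[4] >> 4) << 8) | dtd[2]
    v_active = ((dtd[7] >> 4) << 8) | dtd[5]
    return h_active, v_active, pixel_clock_hz

# Hand-built example descriptor for a 1360x768 mode (illustrative values):
dtd = bytes([0x66, 0x21, 0x50, 0xB0, 0x51, 0x00, 0x1B, 0x30,
             0, 0, 0, 0, 0, 0, 0, 0, 0, 0])
print(parse_dtd(dtd))  # (1360, 768, 85500000)
```

If a monitor's HDMI EDID contains no descriptor (and no Video ID Code) for 1360x768, the graphics driver won't list it - which would explain exactly the behaviour described in the first post.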
The Veriton is NOT very powerful (I only use it for Netflix) but is much better since I changed the graphics control panel power setting to 'max performance' - CPU usage is now down to 60% from 99% previously. RAM is still at about 40%, so it's big enough - faster RAM may help.
So for now, I am back to the original question - should HDMI produce a better monitor picture than the old-school VGA / RGB blue cable (with the 'D'-shaped plug on the ends), which I am back to using since the HDMI experiment failed?
Part of the problem with the HDMI cable was that the desktop was pushed over the edges of the screen - I guess this was the resolution over-filling the physical screen?
I do have options to set a custom resolution. There are options in Windows (with warnings that meddling may cause the system to detonate, etc.) - thanks for the link to the article.
Also, the graphics control panel (opened from the taskbar) seems to give other options for custom resolutions.
But if HDMI would not be expected to produce SIGNIFICANTLY better results than the blue-end RGB / VGA cable, then all the mucking about would just be a waste of effort?
@herder Reviewing your initial post, it is not clear whether you have tried setting a custom resolution or have only made selections from the list of resolutions provided by Windows. If you haven't tried setting a custom resolution, here are instructions for a PC running Windows 7. https://www.techwalla.com/articles/how-to-have-custom-screen-resolution-on-windows-7
I agree with jagabo regarding the picture. Having used both HDMI and VGA to connect one of my computers to my TV at the TV's native resolution, I found VGA produces a slightly softer picture. The other advantage of HDMI is that it can also supply audio for the TV speakers if the computer's audio card/drivers support it.
Last edited by usually_quiet; 17th Aug 2019 at 21:30.
I have a Mac Mini where HDMI adversely affects system color. It's fine via DVI or VGA. The HDMI is the problem. Not the wire, maybe not the port, but something internal.
HDMI stole your sanity?
Well, for several weeks about a year ago, I was there myself.
Another thing - I did not pay huge money for this HDMI cable.
Then I saw an ad for "super high speed HDMI" cables at three times the price I paid. Are there different grades / speed ratings for HDMI cables, or was that just sales hype?
Would a faster / more expensive cable give better performance?
Make sure you disable the overscan feature of your TV too. Otherwise you won't get pixel-for-pixel mapping even if you set the HDMI resolution to the TV's native resolution (the TV will zoom in and crop off the edges).
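As a rough illustration of why overscan matters, here's a small Python sketch. The 5% figure is an assumed typical TV default, not taken from any particular set - real TVs vary:

```python
def visible_after_overscan(width, height, overscan_pct=5.0):
    """Pixels that survive a symmetric overscan crop.
    overscan_pct is the total percentage cropped (assumed 5%, a
    common TV default; real sets vary)."""
    keep = 1 - overscan_pct / 100
    return round(width * keep), round(height * keep)

# At 1360x768, a 5% overscan hides roughly 68 columns and 38 rows:
print(visible_after_overscan(1360, 768))  # (1292, 730)
```

That cropping is exactly the "desktop pushed over the edges of the screen" symptom described earlier in the thread, and it happens even when the signal resolution matches the panel.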
And the digital-to-analog-to-digital conversion using an analog cable may result in increased (or decreased) contrast/saturation which may be interpreted as better image quality, even though it is less accurate.
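One concrete mechanism behind such contrast shifts is a full-range versus limited/video-range levels mismatch (0-255 vs 16-235), which commonly crops up when switching between analog and HDMI paths. A minimal sketch, assuming the standard linear scaling between the two ranges:

```python
def full_to_limited(v: int) -> int:
    """Map a full-range (0-255) value into limited/video range (16-235)."""
    return round(16 + v * 219 / 255)

def limited_to_full(v: int) -> int:
    """Expand a limited-range value back to full range, clipping overshoot."""
    return min(255, max(0, round((v - 16) * 255 / 219)))

# Correct round trip: black stays black, white stays white.
print(limited_to_full(full_to_limited(0)),
      limited_to_full(full_to_limited(255)))  # 0 255

# Mismatch: a limited-range signal displayed as if it were full range
# lifts blacks (16 instead of 0) and dims whites (235 instead of 255) -
# a washed-out, lower-contrast picture. The opposite mismatch crushes
# shadows and clips highlights, which can read as "punchier" contrast.
print(full_to_limited(0), full_to_limited(255))  # 16 235
```

Either mismatch changes apparent contrast without changing accuracy for the better, which is the point made above.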