Hi
I hope this is the correct category - there are so many!
My Samsung monitor (which I use as a TV) was driven by a Win7 laptop via the blue RGB monitor cable (resolution 1360x768).
Then I swapped the laptop out and installed an Asus Veriton N260g - a much tidier & more elegant solution, I thought.
Aha, the Veriton has HDMI, says I - it will give a better picture and even fewer wires dangling out of the back, etc. etc.
I duly bought an HDMI cable, but it sent the resolution haywire and the picture was much blockier. The 1360x768 resolution was no longer available.
It set itself to another resolution, claiming that this was the native resolution of the monitor. Any other resolution option I tried resulted in a small desktop picture in the middle of a large screen, none of which were satisfactory.
When I was forced to revert to the original RGB cable, everything was back to normal without any input from me. The native (recommended) resolution was back at 1360x768.
Just one of the many things that puzzles me is: how can a monitor be perceived as having different 'native' resolutions?
Surely HDMI should be a better method than the old-school RGB cable?
Can anyone shed light / offer a solution / advice?
Steve
-
1360x768 is not a standard HDMI resolution. Try to force your HDMI link into DVI mode (you will lose audio) but you gain the possibility of using the native resolution. Alternatively, try to force your graphics card to output a non-standard HDMI resolution (I assume currently you can only select either 1280x720 or 1920x1080). Both proposals assume your source can be configured - this depends highly on your graphics card, and each graphics vendor is different.
-
HDMI itself readily supports 1360x768 output. I have had 2 different Windows PCs connected to a 2011 720p TV with 1360x768 native resolution using HDMI at the TV's native resolution. One PC dates from 2009, and the other dates from 2014.
However, only resolutions supported by both the GPU and the display will be available to select as output. It is possible that the Veriton N260g doesn't support 1360x768 output via HDMI, or that your monitor supports 1360x768 input for VGA but not for HDMI. Check the monitor's manual to see which resolutions are supported via HDMI. -
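To make the "supported by both" point concrete, here is a minimal sketch. The mode lists below are hypothetical examples, not the actual capabilities of the Veriton or the Samsung set; the idea is simply that the OS can only offer modes that appear on both sides of the link.

```python
# Hypothetical mode lists: the resolutions Windows offers are the
# intersection of what the GPU can output and what the display
# advertises for that particular input.
gpu_modes = {(1920, 1080), (1360, 768), (1280, 720), (1024, 768), (640, 480)}
tv_hdmi_modes = {(1920, 1080), (1280, 720), (720, 480), (640, 480)}   # no 1360x768 over HDMI
tv_vga_modes = {(1360, 768), (1280, 720), (1024, 768), (640, 480)}    # 1360x768 over VGA

print(sorted(gpu_modes & tv_hdmi_modes))  # 1360x768 missing from the HDMI list
print(sorted(gpu_modes & tv_vga_modes))   # 1360x768 present via VGA
```

This would explain exactly the symptom described: the same monitor appears to have a different "native" resolution depending on which input advertises what.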
I did some research on the "Asus" Veriton N260g. First, it turns out that "Veriton" is an Acer product line, not an Asus product line. The Veriton N260g was released in 2009 and has an Intel Atom N280 / 1.66 GHz CPU with graphics provided by an Intel GN40 chipset.
The GN40 chipset seems inadequate for playing back some types of video that are common in 2019. For example, apparently it is OK for decoding 720p video but not great at decoding 1080p Blu-ray. Maybe you need something a little newer to use as an HTPC. -
-
Regardless, using HDMI isn't necessarily the sole cause of the problem. Plus, given that the graphics are provided by an Intel GN40 chipset, I'm somewhat doubtful that much in the way of advanced settings is available. Also, given the form factor of the computer, there is no way to install a video card and upgrade the graphics.
For what it's worth, I only had AMD's 785G integrated graphics chipset or Intel HD Graphics 4200 with their standard driver interface, plus Windows 7, to work with, and I didn't need to do anything unusual to get my TV and the GPU on either PC working together at that resolution. -
@herder Reviewing your initial post, it is not clear whether you have tried setting a custom resolution or have only made selections from the list of resolutions provided by Windows. If you haven't tried setting a custom resolution, here are instructions for a PC running Windows 7. https://www.techwalla.com/articles/how-to-have-custom-screen-resolution-on-windows-7
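When entering a custom resolution by hand, the key derived number is the pixel clock the mode requires: total pixels per frame (active plus blanking) times the refresh rate. The blanking figures below are illustrative placeholders on roughly a reduced-blanking scale, not values taken from any spec table.

```python
def pixel_clock_hz(hactive, vactive, hblank, vblank, refresh_hz):
    """Dot clock a custom mode needs: total pixels per frame x frames per second."""
    return (hactive + hblank) * (vactive + vblank) * refresh_hz

# 1360x768 @ 60 Hz with illustrative blanking values (not exact CVT figures):
clk = pixel_clock_hz(1360, 768, 160, 27, 60)
print(clk / 1e6, "MHz")
```

If a custom-resolution dialog asks for horizontal/vertical totals and a refresh rate, this is essentially the arithmetic it performs behind the scenes.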
-
HDMI has limitations: some video modes are mandatory (they must be supported) and the rest are optional. The mandatory HDMI mode is 640x480p60 plus one of the basic SD TV modes (progressive, not interlaced) - so as a substitute for 640x480, a 720x480p60 or 720x576p50 mode can be used. One of the HD modes should be supported too (1280x720p60/p50 or 1920x1080i30/i25). That's all - an HDMI sink is allowed to NOT support anything beyond two video modes, and it may ignore the HDMI source's capabilities and request its (preferred) one of those two video modes.
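For context, HDMI's TV-style modes are signalled by CEA/CTA-861 Video ID Codes (VICs). A small subset is sketched below (progressive entries only, for brevity); the key observation is that 1360x768 is a VESA/PC mode with no VIC, so a TV that only honours the CEA list can legitimately refuse it over HDMI while accepting it over VGA.

```python
# A few CEA-861 Video ID Codes (VICs) - (width, height, scan, refresh Hz).
# Subset only; the full table is much longer.
CEA_VICS = {
    1:  (640, 480, "p", 60),
    2:  (720, 480, "p", 60),
    4:  (1280, 720, "p", 60),
    16: (1920, 1080, "p", 60),
    17: (720, 576, "p", 50),
    19: (1280, 720, "p", 50),
}

def has_vic(width, height):
    """Does this resolution appear anywhere in the (subset) VIC table?"""
    return any((w, h) == (width, height) for w, h, _, _ in CEA_VICS.values())

print(has_vic(1280, 720))   # a standard HDMI/CEA mode
print(has_vic(1360, 768))   # not a CEA mode - PC/VESA territory
```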
You can add a custom resolution, and you may rely on the driver's capabilities, but without knowing what the EDID provided by the HDMI sink looks like, you can't do much.
I agree - the driver can limit HDMI, and the sink's EDID can limit HDMI. HDMI provides three different ways to select the resolution used. -
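The EDID mentioned above is a small binary block the display hands to the source; its preferred resolution lives in an 18-byte Detailed Timing Descriptor. A minimal sketch of pulling the resolution out of one such descriptor (the example bytes are hand-built for a hypothetical 1360x768 mode, not read from a real monitor):

```python
def parse_dtd(dtd: bytes):
    """Extract resolution and pixel clock from an 18-byte EDID
    Detailed Timing Descriptor.

    EDID 1.3/1.4 layout: bytes 0-1 = pixel clock in 10 kHz units
    (little-endian); byte 2 + high nibble of byte 4 = horizontal active;
    byte 5 + high nibble of byte 7 = vertical active.
    """
    pclk_10khz = dtd[0] | (dtd[1] << 8)
    hactive = dtd[2] | ((dtd[4] & 0xF0) << 4)
    vactive = dtd[5] | ((dtd[7] & 0xF0) << 4)
    return hactive, vactive, pclk_10khz * 10_000

# Hand-built DTD for a hypothetical 1360x768 mode (only the fields above filled in):
dtd = bytes([0x51, 0x1C,        # 0x1C51 = 7249 -> 72.49 MHz pixel clock
             0x50, 0xA0, 0x50,  # hactive 0x550 = 1360, hblank 160
             0x00, 0x1B, 0x30,  # vactive 0x300 = 768, vblank 27
             0, 0, 0, 0, 0, 0, 0, 0, 0, 0])
print(parse_dtd(dtd))
```

If the sink's EDID carries no descriptor or short video descriptor for 1360x768 on the HDMI input, the driver has nothing to offer at that resolution, which matches the behaviour described in this thread.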
Hi
Thanks for input folks.
I did a quick search on 'Video ID Code' & 'Detailed Timing Descriptor', but this all seems too technical. It would take weeks just to understand the backstory on the jargon used in the articles I found.
The Veriton is NOT very powerful (I only use it for Netflix) but is much better since I changed the graphics control panel power setting to 'max performance' - CPU usage is now down to 60% from 99% previously. RAM is still at about 40%, so it's big enough - faster RAM may help.
So for now, I am back to the original question: should HDMI produce a better monitor picture than the old-school VGA / RGB blue cable (with the 'D'-shaped plug on the ends), which I am back to using since the HDMI experiment failed?
Part of the problem with the HDMI cable was that the desktop was pushed over the edges of the screen - I guess this was the resolution over-filling the physical screen?
I do have options to set a custom resolution. There are settings in Windows (with warnings that meddling may cause the system to detonate, etc.) - thanks for the link to the article.
Also the graphics control panel (up from taskbar) seems to give other options for custom res.
But if HDMI would not be expected to produce SIGNIFICANTLY better results than the blue-end RGB / VGA cable, then all the mucking about would just be a waste of effort?
Steve -
-
I agree with jagabo regarding the picture. Having used both HDMI and VGA to connect one of my computers to my TV at the TV's native resolution, I found VGA produces a slightly softer picture. Another advantage of HDMI is that it can also supply audio for the TV speakers, if the computer's audio card/drivers support it.
Last edited by usually_quiet; 17th Aug 2019 at 21:30.
-
I have a Mac Mini where HDMI adversely affects system color. It's fine via DVI or VGA. The HDMI is the problem. Not the wire, maybe not the port, but something internal.
HDMI stole your sanity?
Well, for several weeks about a year ago, I was there myself. -
Contrary to some definitive statements above, I would say: HDMI provides the average consumer an easy way to connect equipment (HD video + HD audio), but it doesn't necessarily provide higher subjective quality than an analog component connection. Many factors are involved: HDMI should deliver objectively better quality, and usually it will, but in some cases the perceived quality can be worse.
-
Another thing: I did not pay huge money for this HDMI cable.
Then I saw an ad saying "super high speed HDMI" at prices 3 times what I paid. Are there different grades / speed ratings for HDMI cables, or was that just sales hype?
Would a faster / more expensive cable give better performance? -
There are differences in HDMI cables, but at the low resolutions you're looking at, any working HDMI cable will do.
https://en.wikipedia.org/wiki/HDMI#Versions -
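To put numbers on why any working cable will do here: HDMI cable grades are certified by TMDS clock rate, with Category 1 ("Standard") cables tested at 74.25 MHz and Category 2 ("High Speed") at 340 MHz. At 24-bit colour the TMDS clock equals the pixel clock, so a rough check is whether the mode's pixel clock fits under the cable's rating. A small sketch (the 72.5 MHz figure is an approximate reduced-blanking pixel clock for 1360x768@60, not a measured value):

```python
# Certified TMDS clock limits per HDMI cable category, in MHz.
CABLE_LIMIT_MHZ = {"standard": 74.25, "high speed": 340.0}

def cable_ok(pixel_clock_mhz, cable="standard"):
    """At 24 bpp the TMDS clock equals the pixel clock, so just compare."""
    return pixel_clock_mhz <= CABLE_LIMIT_MHZ[cable]

print(cable_ok(72.5))                 # ~1360x768@60 squeezes under even a Standard cable
print(cable_ok(148.5))                # 1080p60 (148.5 MHz) exceeds a Standard cable's rating
print(cable_ok(148.5, "high speed"))  # but is easy for a High Speed cable
```

So the expensive cable is only needed for higher-bandwidth modes (1080p60 and up); for this monitor it would buy nothing.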
pandy was telling you that you have to decide for yourself.
If everything is set up correctly and working properly, HDMI can deliver a better picture when evaluated objectively (based on facts/calculations/measurements/unbiased observation). Subjectively (per individual preference), some people still like a softer VGA picture better, possibly because they have come to expect a good picture to look that way. If you asked my elderly mother, who has poor eyesight and hearing, which she prefers, she'd say HDMI because she likes having fewer "wires" connecting devices to the TV, plus CEC works for her DVD player.
However, unless you were able to get HDMI to work using a custom resolution setting, it appears that you don't have HDMI as an option with your current gear. -
Thanks for your contributions,
I forgot to mention: it was suggested I consult the manual, so I duly downloaded it (thanks, Samsung) - apparently it's not a monitor but actually a TV. Further, HDMI needs to go in the other port, and I CAN configure a custom resolution.
The PC is much better since I enlarged the pagefile massively and increased the BIOS memory cache. Also, turning off the Win7 Aero features helped.
So there's virtually no buffering now. I've reverted to my own HTML homepage for frequently used links (the homepage is a .htm file living on my HDD). This allowed me to remove open tabs from the browser, reducing contention for bandwidth too.
Thanks jagabo: I will check the versions / dates of my hardware to see if an HDMI 2 cable is supported. Some of the OLED screens I see in Costco are VERY crisp; I would probably prefer a sharp image to a fuzzy one. In any case, I am a lot further forward in enabling a decision / comparison.
So thanks all
Last edited by herder; 21st Aug 2019 at 13:13.
-
Make sure you disable the overscan feature of your TV too. Otherwise you won't get pixel-for-pixel mapping even if you set the HDMI resolution to the TV's native resolution (the TV will zoom in and crop off the edges).
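To illustrate what overscan costs: TVs historically zoom the picture by a few percent and crop the edges. The 5% figure below is a typical ballpark, not a value from any particular Samsung model.

```python
# With ~5% overscan the TV scales up and crops, so edge pixels of the
# desktop (taskbar, window borders) fall off the visible panel.
def visible_area(width, height, overscan_pct=5):
    """Return the portion of the source frame still visible after cropping."""
    crop_w = round(width * overscan_pct / 100)
    crop_h = round(height * overscan_pct / 100)
    return width - crop_w, height - crop_h

print(visible_area(1360, 768))  # roughly 1292x730 of a 1360x768 desktop remains
```

This matches the earlier symptom of the desktop being "pushed over the edges of the screen": the resolution was right, but the TV was zooming and cropping.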
-
pandy was saying that an analog signal, with the natural noise present in an analog cable, resampled by the analog-to-digital converter in your display, may for example show reduced banding compared with a perfect HDMI signal... also, some adjustments can be made in the analog domain, keeping the quantization range intact (not possible in the digital domain without additional processing).
But all depends on personal preferences. -
And the digital-to-analog-to-digital conversion using an analog cable may result in increased (or decreased) contrast/saturation which may be interpreted as better image quality, even though it is less accurate.