VideoHelp Forum
  1. Member (Join Date: Mar 2014, Location: England)
    Hi

    I hope this is the correct category - there are so many!

    My Samsung monitor (which I use as a TV) was driven by a Win7 laptop via a blue RGB monitor cable (resolution 1360x768).
    Then I swapped the laptop out and installed an Asus Veriton N260g - a much tidier and more elegant solution, I thought.

    Aha, the Veriton has HDMI, says I - the picture will be better, and there will be even fewer wires dangling out of the back, etc.

    I duly bought an HDMI cable, but it sent the resolution haywire and the picture was much blockier. The 1360x768 resolution was no longer available.

    It set itself to another resolution, saying this was the native resolution of the monitor. Every other resolution option I tried resulted in a small desktop picture in the middle of a large screen, none of which was satisfactory.

    When I was forced to revert to the original RGB cable, everything was back to normal without any input from me. The native (recommended) resolution was back at 1360x768.

    Just one of the many things that puzzles me: how can a monitor be perceived as having different 'native' resolutions?

    Surely HDMI should be a better method than an old-school RGB cable?

    Can anyone shed light / offer a solution / advice?

    Steve
  2. Originally Posted by herder

    Can anyone shed light / offer a solution / advice?
    1360x768 is not a standard HDMI resolution. Try forcing your HDMI link into DVI mode (you will lose audio, but you gain the ability to use the native resolution). Alternatively, try forcing your graphics card to use a non-standard HDMI resolution (I assume you can currently select only 1280x720 or 1920x1080). Both proposals assume your source can be configured - this depends heavily on your graphics card, and every graphics vendor is different.
  3. Member (Join Date: Aug 2006, Location: United States)
    Originally Posted by herder
    I duly bought an HDMI cable, but it sent the resolution haywire and the picture was much blockier. The 1360x768 resolution was no longer available.

    ...snip...

    Surely HDMI should be a better method than an old-school RGB cable?
    HDMI itself readily supports 1360x768 output. I have had two different Windows PCs connected via HDMI to a 2011 720p TV with a native resolution of 1360x768, running at that native resolution. One PC dates from 2009, the other from 2014.

    However, only resolutions supported by both the GPU and the display will be available to select as output. It is possible that the Veriton N260g doesn't support 1360x768 output via HDMI, or that your monitor accepts 1360x768 input over VGA but not over HDMI. Check the monitor's manual to see which resolutions are supported via HDMI.
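    One way to see exactly which modes that intersection contains is to enumerate them. Below is a minimal sketch (an illustration of mine, not from this thread) using Python's ctypes on Windows; the partial DEVMODEW layout is hand-padded but kept byte-compatible with the Win32 definition. If 1360x768 never appears in the output while the HDMI cable is connected, the driver/EDID combination is what is blocking it.
    Code:
    # Sketch: list every display mode the GPU driver + monitor EDID pair
    # exposes on Windows, via EnumDisplaySettingsW (a real Win32 API).
    import ctypes

    user32 = ctypes.windll.user32

    class DEVMODEW(ctypes.Structure):
        # Partial DEVMODEW: fields we never read are folded into padding so
        # the offsets and the total size (220 bytes) still match Win32.
        _fields_ = [
            ("dmDeviceName",       ctypes.c_wchar * 32),
            ("dmSpecVersion",      ctypes.c_ushort),
            ("dmDriverVersion",    ctypes.c_ushort),
            ("dmSize",             ctypes.c_ushort),
            ("dmDriverExtra",      ctypes.c_ushort),
            ("dmFields",           ctypes.c_uint32),
            ("_pad1",              ctypes.c_byte * 92),  # union, printer fields, dmFormName, dmLogPixels
            ("dmBitsPerPel",       ctypes.c_uint32),
            ("dmPelsWidth",        ctypes.c_uint32),
            ("dmPelsHeight",       ctypes.c_uint32),
            ("dmDisplayFlags",     ctypes.c_uint32),
            ("dmDisplayFrequency", ctypes.c_uint32),
            ("_pad2",              ctypes.c_byte * 32),  # ICM and panning fields
        ]

    dm = DEVMODEW()
    dm.dmSize = ctypes.sizeof(DEVMODEW)   # 220 bytes, as Windows expects
    modes, i = set(), 0
    while user32.EnumDisplaySettingsW(None, i, ctypes.byref(dm)):
        modes.add((dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency))
        i += 1
    for w, h, hz in sorted(modes):
        print(f"{w}x{h} @ {hz} Hz")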
  4. Member (Join Date: Aug 2006, Location: United States)
    I did some research on the "Asus" Veriton N260g. First, it turns out that "Veriton" is an Acer product line, not an Asus product line. The Veriton N260g was released in 2009 and has an Intel Atom N280 / 1.66 GHz CPU with graphics provided by an Intel GN40 chipset.

    The GN40 chipset seems inadequate for playing back some types of video that are common in 2019. For example, apparently it is OK for decoding 720p video but not great at decoding 1080p Blu-ray. Maybe you need something a little newer to use as an HTPC.
  5. Originally Posted by usually_quiet
    HDMI itself readily supports 1360x768 output.
    Yes, but not as a CEA 'Video ID Code' - only through a 'Detailed Timing Descriptor'.
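    (For context: an EDID can advertise modes in two ways - as short one-byte Video ID Codes taken from the CEA-861 table, or as full 18-byte Detailed Timing Descriptors that spell out the complete timing. A few of the well-known CEA codes are listed below as an illustration, not from this thread; 1360x768 is a VESA/PC mode with no entry in that table, which is the point being made.)
    Code:
    # A few well-known CEA-861 Video ID Codes (the short-form modes an HDMI
    # display's EDID can advertise). 1360x768 has no code in this table, so
    # a TV can only offer it via a full Detailed Timing Descriptor.
    CEA_VICS = {
        1:  "640x480p @ 60 Hz",
        2:  "720x480p @ 60 Hz",
        4:  "1280x720p @ 60 Hz",
        5:  "1920x1080i @ 60 Hz",
        16: "1920x1080p @ 60 Hz",
        17: "720x576p @ 50 Hz",
        19: "1280x720p @ 50 Hz",
        20: "1920x1080i @ 50 Hz",
    }
    print(any(mode.startswith("1360x768") for mode in CEA_VICS.values()))  # False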
  6. Member (Join Date: Aug 2006, Location: United States)
    Originally Posted by pandy
    Originally Posted by usually_quiet
    HDMI itself readily supports 1360x768 output.
    Yes, but not as a CEA 'Video ID Code' - only through a 'Detailed Timing Descriptor'.
    Regardless, using HDMI still isn't exclusively the cause of the problem. Plus, given that the graphics are provided by an Intel GN40 chipset, I'm somewhat doubtful that much in the way of advanced settings is available. Also, given the form factor of the computer, there is no way to install a video card and upgrade the graphics.

    For what it's worth, I only had AMD's 785G integrated graphics chipset or Intel HD Graphics 4200, their standard driver interfaces, and Windows 7 to work with, and I didn't need to do anything unusual to get my TV and the GPU on either PC working together at that resolution.
  7. Member (Join Date: Aug 2006, Location: United States)
    @herder: Reviewing your initial post, it is not clear whether you have tried setting a custom resolution or have only made selections from the list of resolutions provided by Windows. If you haven't tried setting a custom resolution, here are instructions for a PC running Windows 7 (and see the sketch below for a quick way to test the mode): https://www.techwalla.com/articles/how-to-have-custom-screen-resolution-on-windows-7
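    As a quick sanity check before following the article, the sketch below (mine, not from the article) asks Windows whether it will accept 1360x768 at all. Note that ChangeDisplaySettingsW can only select modes the graphics driver already exposes; a genuinely new timing has to be added first through the driver's own control panel, as the article describes.
    Code:
    # Sketch: test a 1360x768 @ 60 Hz mode with CDS_TEST, and apply it only
    # if the driver accepts it (both are real Win32 calls in user32).
    import ctypes

    user32 = ctypes.windll.user32

    class DEVMODEW(ctypes.Structure):
        # Hand-padded but byte-compatible partial DEVMODEW (220 bytes total).
        _fields_ = [
            ("dmDeviceName",       ctypes.c_wchar * 32),
            ("dmSpecVersion",      ctypes.c_ushort),
            ("dmDriverVersion",    ctypes.c_ushort),
            ("dmSize",             ctypes.c_ushort),
            ("dmDriverExtra",      ctypes.c_ushort),
            ("dmFields",           ctypes.c_uint32),
            ("_pad1",              ctypes.c_byte * 92),
            ("dmBitsPerPel",       ctypes.c_uint32),
            ("dmPelsWidth",        ctypes.c_uint32),
            ("dmPelsHeight",       ctypes.c_uint32),
            ("dmDisplayFlags",     ctypes.c_uint32),
            ("dmDisplayFrequency", ctypes.c_uint32),
            ("_pad2",              ctypes.c_byte * 32),
        ]

    DM_BITSPERPEL, DM_PELSWIDTH = 0x00040000, 0x00080000
    DM_PELSHEIGHT, DM_DISPLAYFREQUENCY = 0x00100000, 0x00400000
    CDS_TEST, DISP_CHANGE_SUCCESSFUL = 0x00000002, 0

    dm = DEVMODEW()
    dm.dmSize = ctypes.sizeof(DEVMODEW)
    dm.dmPelsWidth, dm.dmPelsHeight = 1360, 768
    dm.dmBitsPerPel, dm.dmDisplayFrequency = 32, 60
    dm.dmFields = (DM_PELSWIDTH | DM_PELSHEIGHT |
                   DM_BITSPERPEL | DM_DISPLAYFREQUENCY)

    if user32.ChangeDisplaySettingsW(ctypes.byref(dm), CDS_TEST) == DISP_CHANGE_SUCCESSFUL:
        user32.ChangeDisplaySettingsW(ctypes.byref(dm), 0)  # apply for this session
        print("1360x768 accepted and applied")
    else:
        print("driver rejected 1360x768 on this output")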
  8. Originally Posted by usually_quiet
    Regardless, using HDMI still isn't exclusively the cause of the problem.
    ...snip...
    HDMI can be a limitation, as only some video modes are mandatory (they must be supported) and the rest are optional. The mandatory mode is 640x480p60 plus one of the basic SD TV modes (progressive, not interlaced), so 720x480p60 or 720x576p50 can serve as a substitute for 640x480 - that's all. One of the HD modes should be supported too (1280x720p60/p50 or 1920x1080i30/i25). An HDMI sink is allowed to support nothing more than two video modes, and it may ignore the HDMI source's capabilities and request one of those two (preferred) video modes.
    You can add a custom resolution and rely on the driver's capabilities, but without knowing what the EDID provided by the HDMI sink looks like, you can't do much.
    I agree - the driver can limit HDMI, and the sink's EDID can limit HDMI. HDMI provides three different ways to select the resolution used.
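    If you do want to see what the EDID says, the native mode lives in the first Detailed Timing Descriptor of the 128-byte base block. Below is a minimal decoding sketch (an illustration, not from this thread). On Windows the raw block can be dumped from the registry under HKLM\SYSTEM\CurrentControlSet\Enum\DISPLAY\...\Device Parameters\EDID or with a tool such as Monitor Asset Manager; the "edid.bin" file name here is hypothetical.
    Code:
    # Sketch: decode the preferred (native) timing from a raw EDID dump.
    # DTD layout per the EDID 1.3/1.4 spec: pixel clock in 10 kHz units,
    # then active/blanking pixel counts split across low bytes and nibbles.
    def preferred_timing(edid: bytes):
        dtd = edid[54:72]                  # first DTD = preferred mode
        clock = int.from_bytes(dtd[0:2], "little") * 10_000   # Hz
        if clock == 0:
            return None                    # descriptor holds no timing
        h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)
        h_blank  = dtd[3] | ((dtd[4] & 0x0F) << 8)
        v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)
        v_blank  = dtd[6] | ((dtd[7] & 0x0F) << 8)
        refresh = clock / ((h_active + h_blank) * (v_active + v_blank))
        return h_active, v_active, round(refresh, 2)

    with open("edid.bin", "rb") as f:      # hypothetical dump file
        print(preferred_timing(f.read(128)))   # e.g. (1360, 768, 60.02)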
  9. Member (Join Date: Mar 2014, Location: England)
    Hi

    Thanks for the input, folks.

    I did a quick search on 'Video ID Code' and 'Detailed Timing Descriptor', but this all seems too technical. It would take weeks just to understand the backstory of the jargon used in the articles I found.

    The Veriton is NOT very powerful (I only use it for Netflix) but is much better since I changed the graphics control panel power setting to 'max performance' - CPU usage is now down to 60% from 99% previously. RAM is still at about 40%, so it's big enough; faster RAM may help.

    So for now, I am back to the original question: should HDMI produce a better monitor picture than the old-school VGA / RGB blue cable (with the 'D'-shaped plug on the ends), which I am back to using since the HDMI experiment failed? Part of the problem with the HDMI cable was that the desktop was pushed over the edges of the screen - I guess this was the resolution over-filling the physical screen?

    I do have possibilities to set a custom resolution. There are options in Windows (with warnings that meddling may cause the system to detonate, etc.) - thanks for the link to the article.

    The graphics control panel (up from the taskbar) also seems to give other options for a custom resolution.

    But if HDMI would not be expected to produce SIGNIFICANTLY better results than the blue-end RGB / VGA cable, then all the mucking about would just be a waste of effort?

    Steve
  10. Originally Posted by herder
    So for now, I am back to the original question: should HDMI produce a better monitor picture than the old-school VGA...
    When set up properly on a fixed-pixel monitor (i.e. LCD, plasma, OLED, etc.), yes.
  11. Member (Join Date: Aug 2006, Location: United States)
    I agree with jagabo regarding the picture. Having used both HDMI and VGA to connect one of my computers to my TV at the TV's native resolution, I found VGA produces a slightly softer picture. The other advantage of HDMI is that it can also supply audio for the TV speakers if the computer's audio card/drivers support it.
    Last edited by usually_quiet; 17th Aug 2019 at 22:30.
  12. lordsmurf, Video Restorer (Join Date: Jun 2003, Location: dFAQ.us/lordsmurf)
    I have a Mac Mini where HDMI adversely affects the system color. It's fine via DVI or VGA. The HDMI is the problem - not the wire, maybe not the port, but something internal.

    HDMI stole your sanity?
    Well, for several weeks about a year ago, I was there myself.
  13. Originally Posted by herder
    So for now, I am back to the original question: should HDMI produce a better monitor picture than the old-school VGA / RGB blue cable (with the 'D'-shaped plug on the ends)?

    ...snip...

    But if HDMI would not be expected to produce SIGNIFICANTLY better results than the blue-end RGB / VGA cable, then all the mucking about would just be a waste of effort?
    Contrary to some definite statements above, I would say: HDMI provides the average consumer with an easy way to connect equipment (HD video + HD audio), but it does not necessarily provide higher subjective quality than an analog component connection. There are many factors involved; HDMI should deliver objectively better quality, and usually it will, but in some cases the perceived quality can be worse.
  14. Member (Join Date: Mar 2014, Location: England)
    Originally Posted by pandy
    HDMI provides the average consumer with an easy way to connect equipment (HD video + HD audio).
    So in your opinion, what combination of my options would provide the BEST perceived quality?
  15. Member (Join Date: Mar 2014, Location: England)
    Originally Posted by lordsmurf
    for several weeks about a year ago, I was there myself.
    Yes, I remember (though more than a year ago, methinks) - your health was not the best?
    Glad you're back now!

    Steve
  16. Member (Join Date: Mar 2014, Location: England)
    Another thing: I did not pay huge money for this HDMI cable.

    Then I saw an ad for "super high speed HDMI" at three times the price I paid. Are there different grades / speed ratings for HDMI cables, or was that just sales hype?

    Would a faster / more expensive cable give better performance?
  17. There are differences between HDMI cables, but at the low resolutions you're looking at, any working HDMI cable will do (see the rough calculation below the link).

    https://en.wikipedia.org/wiki/HDMI#Versions
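    A rough calculation backs this up (my numbers, using the standard VESA DMT timing for 1360x768 @ 60 Hz): the required pixel clock is about half of what even the original HDMI 1.0 link carries, so a "super high speed" cable buys nothing at this resolution.
    Code:
    # Pixel clock needed by 1360x768 @ 60 Hz (VESA DMT timing: 1792 x 795
    # total, including blanking) vs the 165 MHz TMDS limit of HDMI 1.0-1.2.
    # HDMI 1.3+ and High Speed cables go to 340 MHz - irrelevant here.
    h_total, v_total, refresh = 1792, 795, 60
    clock_mhz = h_total * v_total * refresh / 1e6
    print(f"needed: {clock_mhz:.1f} MHz "
          f"({clock_mhz / 165:.0%} of HDMI 1.0's 165 MHz limit)")
    # -> needed: 85.5 MHz (52% of HDMI 1.0's 165 MHz limit)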
  18. Member (Join Date: Aug 2006, Location: United States)
    Originally Posted by herder
    So in your opinion, what combination of my options would provide the BEST perceived quality?
    pandy was telling you that you have to decide for yourself.

    If everything is set up correctly and working properly, HDMI can deliver a better picture when evaluated objectively (based on facts/calculations/measurements/unbiased observation). Subjectively (per individual preference), some people still like a softer VGA picture better, possibly because they have come to expect a good picture to look that way. If you asked my elderly mother, who has poor eyesight and hearing, which she prefers, she'd say HDMI because she likes having fewer "wires" connecting devices to the TV, plus CEC works for her DVD player.

    However, unless you were able to get HDMI to work using a custom resolution setting, it appears that you don't have HDMI as an option with your current gear.
  19. Member (Join Date: Mar 2014, Location: England)
    Originally Posted by usually_quiet

    However, unless you were able to get HDMI to work using a custom resolution setting, it appears that you don't have HDMI as an option with your current gear.
    Thanks for your contributions.

    I forgot to mention: it was suggested I consult the manual. I duly downloaded the manual (thanks, Samsung) and apparently it's not a monitor but actually a TV. Further, the HDMI needs to go into the other port, and I CAN configure a custom resolution.

    The PC is much better since I enlarged the pagefile space massively and increased the BIOS memory cache. Turning the Win7 Aero features off helped too.

    So there's virtually no buffering now. I've reverted to my own HTML homepage for frequently used links (the homepage is a .htm file living on my HDD). This allowed me to remove open tabs from the browser, reducing contention for bandwidth too.

    Thanks, jagabo: I will check the versions / dates of my hardware to see if an HDMI 2 cable is supported. Some of the OLED screens I see in Costco are VERY crisp, and I would prefer a sharp image to a fuzzy one. In any case, I am a lot further forward in enabling a decision / comparison.

    So thanks, all.
    Last edited by herder; 21st Aug 2019 at 14:13.
  20. Make sure you disable the overscan feature of your TV too. Otherwise you won't get pixel-for-pixel mapping even if you set the HDMI resolution to the TV's native resolution (the TV will zoom in and crop off the edges).
  21. Originally Posted by usually_quiet
    pandy was telling you that you have to decide for yourself.

    ...snip...

    However, unless you were able to get HDMI to work using a custom resolution setting, it appears that you don't have HDMI as an option with your current gear.
    pandy was saying that an analog signal (with the natural noise present in the analog cable, resampled by the analog-to-digital converter in your display) may, for example, show reduced banding compared with a bit-perfect HDMI signal. Also, some adjustments can be made in the analog domain, keeping the quantization range intact (which is not possible in the digital domain without additional processing).
    But it all depends on personal preferences.
  22. And the digital-to-analog-to-digital conversion involved in using an analog cable may result in increased (or decreased) contrast/saturation, which may be interpreted as better image quality even though it is less accurate.