VideoHelp Forum

  1. Member
    Join Date
    Mar 2014
    Location
    England
    Search Comp PM
Hi

    I hope this is the correct category - there are so many !

    My Samsung monitor (which I use as a TV) was driven by a Win7 laptop via the blue RGB monitor cable (resolution 1360x768).
    Then I swapped the laptop out and installed an Asus Veriton N260g - a much tidier and more elegant solution, I thought.

    Aha, the Veriton has HDMI, says I - it will be a better picture and even fewer wires dangling out of the back etc etc.

    Duly bought an HDMI cable, but it sent the resolution haywire and the picture was much more blocky. The 1360x768 res was no longer available.

    It set itself to another res, saying this was the native res of the monitor. Any other res options I tried resulted in a small desktop picture in the middle of a large screen, none of which were satisfactory.

    When forced to revert to the original RGB cable, everything was back to normal without input from me. The native (recommended) resolution was back at 1360x768.

    Just one of the many things that puzzles me is - how can a monitor be perceived as having different 'native' resolutions ?

    Surely HDMI should be a better method than old-school RGB cable ?

    Can anyone shed light / offer solution / advice ?

    Steve
  2. Originally Posted by herder View Post

    Can anyone shed light / offer solution / advice ?
    1360x768 is not a standard HDMI resolution. Try forcing your HDMI link into DVI mode (you will lose audio), which should let you use the native resolution. Alternatively, try forcing your graphics card to output a non-standard HDMI resolution (I assume you can currently select either 1280x720 or 1920x1080). Both proposals assume your source can be configured - this depends heavily on your graphics card, and every graphics vendor is different.
  3. Member
    Join Date
    Aug 2006
    Location
    United States
    Search Comp PM
    Originally Posted by herder View Post

    Surely HDMI should be a better method than old-school RGB cable ?

    Can anyone shed light / offer solution / advice ?
    HDMI itself readily supports 1360x768 output. I have had 2 different Windows PCs connected to a 2011 720p TV with 1360x768 native resolution using HDMI at the TV's native resolution. One PC dates from 2009, and the other dates from 2014.

    However, only resolutions supported by both the GPU and the display will be available to select as output. It is possible that the Veriton N260g doesn't support 1360x768 output via HDMI or that your monitor supports 1360x768 resolution input for VGA but not for HDMI. Check the monitor's manual to see which resolutions are supported via HDMI.
  4. Member
    Join Date
    Aug 2006
    Location
    United States
    Search Comp PM
    I did some research on the "Asus" Veriton N260g. First, it turns out that "Veriton" is an Acer product line, not an Asus product line. The Veriton N260g was released in 2009 and has an Intel Atom N280 / 1.66 GHz CPU with graphics provided by an Intel GN40 chipset.

    The GN40 chipset seems inadequate for playing back some types of video that are common in 2019. For example, apparently it is OK for decoding 720p video but not great at decoding 1080p Blu-ray. Maybe you need something a little newer to use as an HTPC.
  5. Originally Posted by usually_quiet View Post
    HDMI itself readily supports 1360x768 output.
    Yes, but not as a 'Video ID Code' - only through a 'Detailed Timing Descriptor'.
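    For anyone curious what that means in practice: a Detailed Timing Descriptor is an 18-byte block inside the monitor's EDID that spells out an arbitrary timing such as 1360x768, rather than referring to a predefined code. A minimal Python sketch of decoding its resolution fields - the sample bytes below are a hand-built 1360x768@60 timing for illustration, not a dump from a real monitor:

    ```python
    def decode_dtd(dtd: bytes) -> dict:
        """Decode the resolution/timing fields of an 18-byte EDID
        Detailed Timing Descriptor (VESA EDID 1.4 byte layout)."""
        pclk_hz = int.from_bytes(dtd[0:2], "little") * 10_000  # stored in 10 kHz units
        hactive = dtd[2] | ((dtd[4] & 0xF0) << 4)   # horizontal addressable pixels
        hblank  = dtd[3] | ((dtd[4] & 0x0F) << 8)   # horizontal blanking pixels
        vactive = dtd[5] | ((dtd[7] & 0xF0) << 4)   # vertical addressable lines
        vblank  = dtd[6] | ((dtd[7] & 0x0F) << 8)   # vertical blanking lines
        refresh = pclk_hz / ((hactive + hblank) * (vactive + vblank))
        return {"hactive": hactive, "vactive": vactive, "refresh_hz": round(refresh, 2)}

    # Hand-built sample: 85.50 MHz pixel clock, 1360x768 active, 432/27 blanking
    sample = bytes([0x66, 0x21, 0x50, 0xB0, 0x51, 0x00, 0x1B, 0x30,
                    0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00])
    print(decode_dtd(sample))  # hactive 1360, vactive 768, ~60 Hz
    ```

    So a monitor with no 1360x768 'Video ID Code' can still advertise it - but only if its EDID carries a descriptor like this, and only if the source reads and honours it.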
  6. Member
    Join Date
    Aug 2006
    Location
    United States
    Search Comp PM
    Originally Posted by pandy View Post
    Originally Posted by usually_quiet View Post
    HDMI itself readily supports 1360x768 output.
    Yes, but not as a 'Video ID Code' - only through a 'Detailed Timing Descriptor'.
    Regardless, using HDMI still isn't exclusively the cause of the problem. Plus, given that the graphics are provided by an Intel GN40 chipset, I'm somewhat doubtful that much in the way of advanced settings is available. Also, given the form factor of the computer, there is no way to install a video card and upgrade the graphics.

    For what it is worth, I only had AMD's 785G integrated graphics chipset or Intel HD Graphics 4200 and their standard driver interface plus Windows 7 to work with and didn't need to do anything unusual to get my TV and the GPU on either PC working together at that resolution.
  7. Originally Posted by usually_quiet View Post
    Regardless, using HDMI still isn't exclusively the cause of the problem.
    HDMI has limitations of its own: some video modes are mandatory (they must be supported) and the rest are optional. The mandatory HDMI mode is 640x480p60 plus one of the basic SD TV modes (progressive, not interlaced), so 720x480p60 or 720x576p50 can substitute for 640x480 - that's all. One of the HD modes should be supported too (1280x720p60/p50 or 1920x1080i30/i25). An HDMI sink is allowed to NOT support anything beyond two video modes, and it may ignore the HDMI source's capabilities and request one of those two as its preferred mode.
    You can add a custom resolution if the driver supports it, but without knowing what the EDID provided by the HDMI sink looks like, you can't do much.
    I agree - the driver can limit HDMI, and the sink's EDID can limit HDMI. HDMI provides 3 different ways to select the resolution used.
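    To illustrate that fallback rule, here is a hypothetical Python sketch of a source picking an output mode: it takes the sink's preferred (EDID-advertised) mode if it can, and otherwise drops to a mandatory mode. The mode lists are illustrative, not a full CEA-861 table.

    ```python
    # Modes every HDMI sink must accept (per the summary above);
    # everything else is optional and advertised via the sink's EDID.
    MANDATORY = ["640x480p60", "720x480p60", "720x576p50"]

    def pick_mode(source_modes, sink_modes):
        """Return the best mode both ends support, falling back to a
        mandatory mode when the sink advertises nothing else usable."""
        for mode in sink_modes:            # sink lists its modes preferred-first
            if mode in source_modes:
                return mode
        for mode in MANDATORY:             # guaranteed common ground
            if mode in source_modes:
                return mode
        raise ValueError("no common mode")

    # A source that cannot do 1360x768 ends up at a mandatory mode:
    print(pick_mode(["1920x1080p60", "1280x720p60", "640x480p60"],
                    ["1360x768p60"]))  # → 640x480p60
    ```

    This is roughly what Steve saw: when the GN40 driver and the monitor's HDMI EDID had no common 1360x768 entry, the link settled on some other mode and called it "native".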
  8. Member
    Join Date
    Mar 2014
    Location
    England
    Search Comp PM
    Hi

    Thanks for input folks.

    I did a quick search on 'Video ID Code' & 'Detailed Timing Descriptor', but this all seems too technical. It would take weeks just to understand the backstory on the jargon used in the articles I found.

    The Veriton is NOT very powerful (I only use it for Netflix) but is much better since I changed the graphics control panel power setting to 'max performance' - CPU usage is now down to 60% from 99% previously. RAM is still at about 40%, so it's big enough - faster RAM may help.

    So for now, I am back to the original question - should HDMI produce a better monitor picture than the old-school VGA / RGB blue cable (with the 'D' shaped spigot on the ends), which I am back to using since the HDMI experiment failed?
    Part of the problem with the HDMI cable was that the desktop was pushed over the edges of the screen - I guess this was the resolution over-filling the physical screen?

    I do have the possibility to set a custom res. There are options in Windows (with warnings that meddling may cause the system to detonate etc) - thanks for the link to the article.

    Also, the graphics control panel (up from the taskbar) seems to give other options for custom res.

    But if HDMI would not be expected to produce SIGNIFICANTLY better results than the blue-end RGB / VGA cable, then all the mucking about would just be a waste of effort?

    Steve
  9. Member
    Join Date
    Aug 2006
    Location
    United States
    Search Comp PM
    @herder Reviewing your initial post, it is not clear whether you have tried setting a custom resolution or have only made selections from the list of resolutions provided by Windows. If you haven't tried setting a custom resolution, here are instructions for a PC running Windows 7. https://www.techwalla.com/articles/how-to-have-custom-screen-resolution-on-windows-7
  10. Member
    Join Date
    Aug 2006
    Location
    United States
    Search Comp PM
    I agree with jagabo regarding the picture. Having used both HDMI and VGA to connect one of my computers to my TV at the TV's native resolution, I found VGA produces a slightly softer picture. Another advantage of HDMI is that it can also supply audio for the TV speakers, if the computer's audio card/drivers support it.
    Last edited by usually_quiet; 17th Aug 2019 at 21:30.
  11. Video Restorer lordsmurf's Avatar
    Join Date
    Jun 2003
    Location
    dFAQ.us/lordsmurf
    Search Comp PM
    I have a Mac Mini where HDMI adversely affects system color. It's fine via DVI or VGA. The HDMI is the problem. Not the wire, maybe not the port, but something internal.

    HDMI stole your sanity?
    Well, for several weeks about a year ago, I was there myself.
  12. Member
    Join Date
    Mar 2014
    Location
    England
    Search Comp PM
    Originally Posted by lordsmurf View Post
    for several weeks about a year ago, I was there myself.
    Yes, I remember (but more than a year ago, methinks) - your health was not the best?
    Glad you're back now !

    Steve
  13. Member
    Join Date
    Mar 2014
    Location
    England
    Search Comp PM
    Another thing, I did not pay huge money for this HDMI cable.

    Then I saw an ad saying "super high speed HDMI" at prices 3 times what I paid - are there different grades / speed ratings for HDMI cables, or was that just sales hype?

    Would a faster / more expensive cable give better performance?
  14. There are differences in HDMI cables, but at the low resolutions you're looking at, any working HDMI cable will do.

    https://en.wikipedia.org/wiki/HDMI#Versions
  15. Make sure you disable the overscan feature of your TV too. Otherwise you won't get pixel-for-pixel mapping even if you set the HDMI resolution to the TV's native resolution (the TV will zoom in and crop off the edges).
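    A little arithmetic makes that concrete - and it matches Steve's report of the desktop being "pushed over the edges". With overscan the TV zooms the picture and crops the edges. A Python sketch, assuming an illustrative 5% total overscan per axis (a common ballpark, not your TV's exact figure):

    ```python
    def visible_region(width, height, overscan_pct=5.0):
        """Pixels of the source image still on screen after the TV zooms
        by overscan_pct and crops that many pixels off the edges in total."""
        crop_x = round(width  * overscan_pct / 100)
        crop_y = round(height * overscan_pct / 100)
        return width - crop_x, height - crop_y

    # Of a 1360x768 desktop, only this much survives a 5% overscan:
    print(visible_region(1360, 768))  # → (1292, 730)
    ```

    Everything outside that region - taskbar edges, window borders - falls off the panel, and the remainder is rescaled, which also explains the extra blockiness.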
  16. And the digital-to-analog-to-digital conversion when using an analog cable may result in increased (or decreased) contrast/saturation, which may be interpreted as better image quality even though it is less accurate.
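    One common source of that contrast difference is video levels: HDMI video is often sent as limited-range luma (16-235), while a PC desktop is full range (0-255). If either end mishandles the conversion, blacks and whites get stretched or crushed. A sketch of the two 8-bit scalings:

    ```python
    def limited_to_full(y):
        """Expand limited-range luma (16-235) to full range (0-255)."""
        return max(0, min(255, round((y - 16) * 255 / 219)))

    def full_to_limited(y):
        """Compress full-range (0-255) into limited range (16-235)."""
        return round(y * 219 / 255 + 16)

    # The extremes move; apply either mapping when the other end
    # doesn't expect it and you get crushed or washed-out levels:
    print(limited_to_full(16), limited_to_full(235))   # 0 255
    print(full_to_limited(0), full_to_limited(255))    # 16 235
    ```

    Intel's graphics control panels often have a quantization-range setting for exactly this; whether the GN40-era driver exposes it is another matter.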