VideoHelp Forum

  1. Hi, wasn't sure where else to put this.

    I have an Nvidia 9800GT that I'm trying to connect to a Samsung LED TV via a DVI-to-HDMI cable. The Nvidia Control Panel does NOT let me select what type of connector I'm using, and lists VGA for both my TV and the monitor I'm currently using (an old LCD on a DVI-to-VGA adapter). It may be as simple as finding drivers that let me change this, but I've tried the most recent version, 266.58, and ones from 2010 (when this was my main computer), and none would allow it.

    The TV works fine during booting, even while loading Windows, but as soon as it should go to the user account screen, the TV loses signal. It also works fully when booted in safe mode, although there I believe it won't let me use the full 1366x768 resolution.

    Apologies if I left out any info; I'll be happy to add more. Thanks.
  2. Member DB83
    Just a thought.

    Are you using the correct adapter cable? There are various types of DVI: DVI-A is analogue, DVI-D is digital.

    Seems to me that the TV sees an analogue signal rather than a digital one.
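
    If you want to see what the TV is actually reporting to the card, one option is to read its EDID, which Windows caches in the registry; in a standard EDID, bit 7 of byte 20 says whether the display declared a digital or analogue input. A rough Python sketch, assuming the usual registry location (treat it as untested):

    Code:
        # Dump the EDIDs Windows has cached and report whether each
        # display declared itself digital or analogue (EDID byte 20, bit 7).
        import winreg

        ROOT = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"

        def walk_edids():
            with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, ROOT) as display:
                for i in range(winreg.QueryInfoKey(display)[0]):
                    model = winreg.EnumKey(display, i)
                    with winreg.OpenKey(display, model) as model_key:
                        for j in range(winreg.QueryInfoKey(model_key)[0]):
                            instance = winreg.EnumKey(model_key, j)
                            try:
                                with winreg.OpenKey(
                                        model_key,
                                        instance + r"\Device Parameters") as p:
                                    edid, _ = winreg.QueryValueEx(p, "EDID")
                            except OSError:
                                continue  # no EDID stored for this instance
                            yield model, bytes(edid)

        for model, edid in walk_edids():
            if len(edid) >= 21:
                kind = "digital" if edid[20] & 0x80 else "analogue"
                print(model, "-", kind, "input declared in EDID")

    If the TV's EDID says digital but the control panel still shows VGA, the cable is probably passing data fine and it's the driver misreading the connection.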
  3. It's a DVI-D single-link to HDMI cable.

    I just had to restart mid driver update (I've been trying different versions). The TV worked fine, but the resolution was capped at 1280x800 without the video card drivers and the second monitor wouldn't work. Once installation finished and I restarted again, the second monitor popped up at the user screen and the TV went off again, so it seems driver/TV related and not a dead port/cable/card.

    After the driver was installed and the TV was not working again, I tried putting it in the same 1280x800 resolution and it still didn't work. I should note that the computer recognizes another display, as I can drag my mouse off the screen.
  4. Member DB83
    If I now read you correctly, you are having driver issues. These have to be solved BEFORE anything else.

    Now I do wonder if you are confusing the display specs of the TV under VGA with its HD capabilities - 1366*768 is not part of the HD spec AFAIK.

    HD is 1280*720 or 1920*1080, so I would have thought that your card has to output one of these to your TV.

    Also check your TV's manual. There may be a setting under HDMI specifically for a PC, or even a dedicated HDMI socket for a PC.

    But do sort those drivers out first, as nothing will work correctly without them.

    Also read the forum, as there are many topics about connecting TVs to PCs.
  5. The TV supports both 1280x720 and 1366x768. It runs the BIOS screen etc. in 1366x768, and the manual also shows that as supported. The only other thing the manual says is to use the first HDMI port, which is HDMI/DVI, and that's the one I've been using.

    No luck so far with the drivers. I wonder if the problem is with a specific driver version or something more. I've tried 6 versions so far spanning 2 years... I would really prefer not to buy another monitor.
  6. Also, that did not work. The TV still works fine without drivers at 1280x720, and with drivers it will still boot at 1366x768 until the user login screen, then loses signal again.
  7. Member DB83
    Like I said earlier, 1366*768 is NOT part of the HD spec. I assume your TV has a standard D-sub VGA connector. Plug that in and then see if you get a picture at 1366*768.
    My video card's an Nvidia 8600GT and my TV's a Samsung. It seems the Nvidia cards still work the same way (my card has 1x VGA output and 1x DVI/VGA output). If monitors are connected to both outputs on the video card, the DVI output becomes the default monitor. When the PC boots it'll display the BIOS stuff, which means it displays on the TV instead of the PC monitor. It's a little annoying, as the TV needs to be on to see the BIOS messages. If your card has dual DVI outputs, try swapping them around.

    Normally the TV is monitor 2 and the PC monitor is monitor 1. In order to put the desktop on the PC monitor I need to tell Windows it's the primary monitor and enable the secondary monitor (TV) manually. That setup works for me. The BIOS displays on the TV along with the Windows boot screen, then when Windows has loaded it effectively swaps the displays around and switches the desktop to the PC monitor and the TV becomes monitor 2.

    As I said though, if you have dual DVI outputs you mightn't need to do any of that (I only have one DVI output so I can't swap them). Connecting the PC monitor to the first output on the card should get the BIOS messages to display on the PC monitor and it'll be the default display in Windows.
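
    If you want to check from Windows itself which output it treats as attached and which as primary, the Win32 EnumDisplayDevices call will list them. A quick Python/ctypes sketch (untested, but the structure and flags are the documented ones):

    Code:
        # List the display devices Windows knows about and flag which one
        # is attached to the desktop and which is primary.
        import ctypes
        from ctypes import wintypes

        ATTACHED = 0x01  # DISPLAY_DEVICE_ATTACHED_TO_DESKTOP
        PRIMARY = 0x04   # DISPLAY_DEVICE_PRIMARY_DEVICE

        class DISPLAY_DEVICE(ctypes.Structure):
            _fields_ = [("cb", wintypes.DWORD),
                        ("DeviceName", wintypes.WCHAR * 32),
                        ("DeviceString", wintypes.WCHAR * 128),
                        ("StateFlags", wintypes.DWORD),
                        ("DeviceID", wintypes.WCHAR * 128),
                        ("DeviceKey", wintypes.WCHAR * 128)]

        user32 = ctypes.windll.user32
        dev = DISPLAY_DEVICE()
        dev.cb = ctypes.sizeof(dev)
        i = 0
        while user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
            tags = []
            if dev.StateFlags & ATTACHED:
                tags.append("attached")
            if dev.StateFlags & PRIMARY:
                tags.append("PRIMARY")
            print(dev.DeviceName, dev.DeviceString, " ".join(tags))
            i += 1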

    Normally I have this PC connected to my TV via a DVI-HDMI cable, but I moved some stuff around yesterday and used a VGA cable instead. I'll have to shut down and reboot to use HDMI again (which I'll do shortly), so once I've done that I'll report back regarding whether Windows sees the TV as VGA or HDMI. Regardless of what Windows may say though, if the PC is connected to the TV via HDMI it can't be anything but digital. I suspect DB83 is correct and the TV doesn't support 1366*768 via HDMI, so it's shutting itself off when you try to connect at that resolution. When I swap the VGA cable for an HDMI cable, I'll check to see if 1366*768 is still listed as a supported resolution.
    Last edited by hello_hello; 17th May 2013 at 06:05.
  9. I didn't mess around too much, but it seems the video card connects to the TV using the TV's native resolution and default refresh rate while the PC is booting (1920x1080 @ 60Hz in my case). I guess that's between the video card and the TV. Once Windows has loaded though, it'll switch the TV to whatever resolution/refresh rate you've specified in Display Properties.

    1360*768 is a resolution supported by my TV over HDMI (you wrote 1366*768 tloft, was that a typo?). I tried it and the TV happily connected at 1360*768, and the Nvidia Control Panel sees the TV as having an HDMI/HDTV connection.
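
    If you want to see exactly which modes Windows will offer the TV, EnumDisplaySettings will enumerate them. Another rough ctypes sketch: the truncated DEVMODE below only goes as far as the fields we read, which the API allows as long as dmSize is set, and the \\.\DISPLAY2 device name is just a guess for the TV here (use whatever EnumDisplayDevices reports on your system):

    Code:
        # Enumerate the modes Windows offers on one output. The DEVMODE is
        # truncated to the fields we need; dmSize tells the API how much
        # space we allocated.
        import ctypes
        from ctypes import wintypes

        class DEVMODE(ctypes.Structure):
            _fields_ = [("dmDeviceName", wintypes.WCHAR * 32),
                        ("dmSpecVersion", wintypes.WORD),
                        ("dmDriverVersion", wintypes.WORD),
                        ("dmSize", wintypes.WORD),
                        ("dmDriverExtra", wintypes.WORD),
                        ("dmFields", wintypes.DWORD),
                        ("dmPositionX", wintypes.LONG),
                        ("dmPositionY", wintypes.LONG),
                        ("dmDisplayOrientation", wintypes.DWORD),
                        ("dmDisplayFixedOutput", wintypes.DWORD),
                        ("dmColor", ctypes.c_short),
                        ("dmDuplex", ctypes.c_short),
                        ("dmYResolution", ctypes.c_short),
                        ("dmTTOption", ctypes.c_short),
                        ("dmCollate", ctypes.c_short),
                        ("dmFormName", wintypes.WCHAR * 32),
                        ("dmLogPixels", wintypes.WORD),
                        ("dmBitsPerPel", wintypes.DWORD),
                        ("dmPelsWidth", wintypes.DWORD),
                        ("dmPelsHeight", wintypes.DWORD),
                        ("dmDisplayFlags", wintypes.DWORD),
                        ("dmDisplayFrequency", wintypes.DWORD)]

        def modes(device):
            dm = DEVMODE()
            dm.dmSize = ctypes.sizeof(dm)
            seen, i = set(), 0
            while ctypes.windll.user32.EnumDisplaySettingsW(
                    device, i, ctypes.byref(dm)):
                seen.add((dm.dmPelsWidth, dm.dmPelsHeight,
                          dm.dmDisplayFrequency))
                i += 1
            return sorted(seen)

        for w, h, hz in modes(r"\\.\DISPLAY2"):  # guessed device name
            print(f"{w}x{h} @ {hz}Hz")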

    I think you need to start by swapping the outputs on the video card if you can. Even if you don't though, once you have Windows set to use the PC monitor as the primary monitor and the TV as the secondary monitor, and they're both working at the resolution/refresh rate you prefer, Windows should return them to the correct state each time you boot. As for the TV shutting down when Windows loads... if you're using a supported resolution/refresh rate for the TV and the secondary monitor is enabled in Windows, I can't think of a reason why it would.
    Last edited by hello_hello; 17th May 2013 at 06:03.
  10. No, it was not a typo: 1366x768.

    My card has 2 DVI connectors; my TV has 2 HDMI inputs and NO VGA or DVI connectors. The only other input is component, but I don't have component cables to try.

    I've tried both DVI ports, and had each set as primary at different times, and still no luck. 1366x768 is what it loads the BIOS in, what the computer recognizes as its native resolution, and is listed as supported in the TV's manual. Even if I try it at 1280x720, nothing changes when the drivers are running, so I don't think it's something that simple.

    What version drivers do you use?
  11. Member DB83
    It may help us to help you if you mentioned the model number of this TV.

    Odd not to have a D-sub socket on it.
  12. Member
    Originally Posted by DB83
    It may help us to help you if you mentioned the model number of this TV.

    Odd not to have a D-sub socket on it.
    I recently shopped for a new TV that could also be used as a monitor. D-sub connections and the associated PC audio connection are now often missing from new smallish and less expensive TVs, even in mainstream brands like Samsung.
  13. http://www.samsung.com/us/video/tvs/UN32D4003BDXZA-specs

    Video
    Screen Size: 31.5" measured diagonally
    Native Resolution: 1,366 x 768
    Resolution: 720p
    Clear Motion Rate: 60

    Inputs & Outputs
    HDMI: 2
    USB 2.0: 1
    Component: 1
    Composite (AV): 1
  14. Member DB83
    And the manual clearly states that the optimum resolution is 1360 * 768. In fact there is no 1366 * 768 resolution quoted.

    I believe the latter is merely there to provide the 720p support.
  15. I quoted the above directly from the Samsung site. I don't see the hang-up on 1360x768 vs 1366x768, but yes, both are listed as supported resolutions.

    AFAIK the TV would NEVER show a picture in an unsupported resolution, yet it will show 1366x768@60Hz when it starts.

    It really seems like a driver issue, but I don't know what else to try. I've done everything suggested, and used a driver sweeper program as well; nothing changes in that section. Still open to trying anything else. I'll probably wait another month before I just buy a DVI computer monitor, but obviously I'd rather not have to do that.
  16. Originally Posted by tloft
    What version drivers do you use?
    Version 6.14.12.9573 according to Device Manager. I'm not sure what that equates to in GeForce driver version numbers, but they're dated Feb 2012. I try not to upgrade stuff unless I need to. Obviously I updated those drivers over a year ago, but I've stuck with that version. Here's something to look out for... if you happen to use the video card (Nvidia Control Panel) to expand video to PC levels, the newer drivers kindly forget the setting between reboots for one of the monitors. In fact the latest drivers seem to forget the setting for both monitors between reboots. At least for me.

    I have the TV input set to expect PC levels rather than TV levels, as then Windows itself displays correctly and the TV and monitor are using the same levels. The PC/HDMI input defaults to TV levels on my TV. The input level setting isn't named particularly intuitively: being a TV, you'd expect the "Normal" setting to be TV levels, but it's actually PC levels, and the default "Low" is TV levels. Just thought I'd throw that in for information...
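
    For anyone unfamiliar with the distinction: "TV levels" is the limited 16-235 range and "PC levels" is the full 0-255 range, and the mapping between them is simple arithmetic. A small illustrative sketch (standard video-levels math, nothing Nvidia-specific):

    Code:
        # The standard limited (16-235) <-> full (0-255) range mapping.
        # If the PC sends one range and the TV expects the other, blacks
        # and whites get crushed or washed out by exactly this stretch.
        def full_to_limited(v):  # 0-255 -> 16-235
            return round(16 + v * 219 / 255)

        def limited_to_full(v):  # 16-235 -> 0-255
            return round((v - 16) * 255 / 219)

        print(full_to_limited(0), full_to_limited(255))   # 16 235
        print(limited_to_full(16), limited_to_full(235))  # 0 255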

    Originally Posted by tloft
    I quoted the above directly from the Samsung site. I don't see the hang-up on 1360x768 vs 1366x768, but yes, both are listed as supported resolutions.
    I did a "1366" search in the manual and found no references. "Optimal resolution is 1360x768@60Hz" according to the manual I looked at (page 9). Are you positive the video card is initially connecting at 1366x768 before Windows loads? It seems like an odd resolution. And what about after Windows loads? Will the TV display 1366x768?

    Originally Posted by tloft
    AFAIK the TV would NEVER show a picture in an unsupported resolution, yet it will show 1366x768@60Hz when it starts.
    My TV at least tries. For instance, 1600x900 is listed as a supported resolution in Windows, but if I select it the TV overrides it and uses something else.
    I still think it's worth considering the 1360x768 vs 1366x768 issue, as there's a good chance that's the problem. Why the TV initially connects at 1366x768, I'm not sure, but how did you select 1366x768 in Windows? Did you create a custom display resolution in the Nvidia Control Panel? 1366x768 isn't a standard resolution for Windows/HDMI. Or is Windows connecting at 1360x768?

    If you select 1280x720 as the TV resolution in Windows and reboot, does the TV switch to 1280x720 when Windows loads, as it should? I'm not clear on whether that was working correctly. If it isn't, then the problem's possibly not the 1366x768 resolution.

    Something else to consider... you might want to use the "Screen Fit" option when connecting a PC to the TV (no overscanning), but according to the manual, Screen Fit is only available when the following inputs/resolutions are used (my TV is the same):
    Component (1080i, 1080p)
    HDMI (720p, 1080i, 1080p)

    You'd assume the native resolution of 1360x768 wouldn't overscan, but it's not listed as a "Screen Fit" resolution, so it might be something to keep in mind.
  17. Member
    Originally Posted by tloft
    I quoted the above directly from the Samsung site. I don't see the hang-up on 1360x768 vs 1366x768, but yes, both are listed as supported resolutions.

    AFAIK the TV would NEVER show a picture in an unsupported resolution, yet it will show 1366x768@60Hz when it starts.

    It really seems like a driver issue, but I don't know what else to try. I've done everything suggested, and used a driver sweeper program as well; nothing changes in that section. Still open to trying anything else. I'll probably wait another month before I just buy a DVI computer monitor, but obviously I'd rather not have to do that.
    1366x768 would be close to a perfect 16:9 aspect ratio, but a widescreen TV's actual resolution may be a bit different. Like your Samsung TV, the actual resolution of my LG 32LK33 is 1360x768 @60Hz.
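
    A bit of arithmetic shows where the two numbers come from: 768 x 16/9 is 1365.33, so 1366 is the closest width to true 16:9, but it isn't divisible by 8, which many scalers and drivers expect, so panels commonly advertise 1360x768 (or 1368x768) instead. A quick illustrative check:

    Code:
        # Why "1366x768" panels often advertise 1360x768 instead.
        for w in (1360, 1366, 1368):
            print(f"{w}x768  ratio={w / 768:.4f}  "
                  f"(16:9 = {16 / 9:.4f})  "
                  f"divisible by 8: {w % 8 == 0}")
        # 768 * 16 / 9 = 1365.33..., so 1366 is the nearest integer
        # width, but only 1360 and 1368 are multiples of 8.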

    The manual for your TV recommends:

    1. Set your video card to use 1360x768 @60Hz.
    2. Edit the name (Menu > Input > Edit Name) for the HDMI port to use the correct setting for the PC:

    When connecting a PC to the HDMI In 1(DVI) port with HDMI cable, you should set the TV to PC mode under Edit Name.

    When connecting a PC to the HDMI In 1(DVI) port with HDMI to DVI cable, you should set the TV to DVI PC mode under Edit Name.

    The manual doesn't say anything about this as far as I could tell, but if you get a picture and part of the PC screen is not visible, try using the P.Size button on the remote to set the picture mode to "Screen Fit" to see if it corrects the problem.
  18. I am 100% sure of the 1366; it's listed in NCP as the native resolution without me changing or creating it. 1360x768 does not work either.

    I have rebooted from safe mode/driverless with the last resolution used being 1280x720, and it still defaults to 1366x768 in the BIOS, then goes blank at user login.

    I'll look for drivers from that era. I usually don't update either, but since I was still using ones from 2009 it seemed prudent.

    I'm not too worried about fixing everything at once, and if I understand correctly, overscan will still show a working picture. I'd tackle that later if I get that far.
  19. Just saw your reply. I changed the name to "DVI PC" and tried 1366x768, 1360x768, and 1280x720, all to no avail.

    Is there any reason a reboot would be necessary between attempts? I can't think of one personally.

    I also don't recall seeing any of that in the manual it was provided with.
  20. Member
    Originally Posted by tloft
    Just saw your reply. I changed the name to "DVI PC" and tried 1366x768, 1360x768, and 1280x720, all to no avail.

    Is there any reason a reboot would be necessary between attempts? I can't think of one personally.

    I also don't recall seeing any of that in the manual it was provided with.
    Too bad nothing worked.

    I found a manual to download by clicking the "Support" tab on the page you linked to, and then clicking the "Manuals" tab on the support page. The instructions for re-naming the HDMI connection were on page 10, and the optimum resolution and refresh rate were on page 9.

    I forgot to mention that the HDMI connection labeled "HDMI/DVI In" should be used for connecting a PC, but you probably already know that.
  21. Yeah, but thanks.

    In all honesty I'm just thinking of scrapping this altogether and using my laptop with the TV, as it has HDMI out. The laptop has a quad-core i7, more RAM, and a widescreen of its own, but the desktop has a much, much better video card and a sound card that sounds so nice through my headphones, and it seems better to have it do something rather than just sit dormant in my house. Still gonna look around for a couple more days though...

    To add more: has anyone used Mouse Without Borders? I considered that as a way to add monitor space and split the workload, but I don't know how it works for real-time applications beyond file sharing. I don't need much more HDD space, with 2.5TB in my room...
  22. Sorry, but I'm out of ideas at the moment. Windows will happily switch the TV to whichever resolution I've selected after it's loaded. Why that's not working for you and the TV is switching itself off, I've no idea.

    Have you tried a different HDMI input on the TV? I know the manual is all specific about having to use the dedicated PC/HDMI input, but I'm not sure why. The others work. Maybe they don't support all the non-TV, PC resolutions. I can't remember, but it might be worth a try.

    I wonder if it could be a cable issue? HDMI cables tend to either work or not, and yours is obviously working when the PC initially boots, but there's apparently some sort of communication breakdown, given Windows thinks the TV is VGA.
  23. Well, if I select the "HDTV" option in NCP instead of "Samsung", it lists it as a component connection, even though the computer only has DVI out and I don't have component cables. It won't let me select the type, although I've seen many screenshots in FAQs that show a dropdown to select the type of connector.

    I believe the first time I tried, I had it plugged into HDMI 2 before realizing HDMI 1 was the one named HDMI/DVI.

    I kinda wondered if it was the cable; it was 78 cents on Amazon for a 6ft cable, which is absurdly cheap, but I thought if it didn't work it wouldn't work at all, not be sort of selective like this.
  24. For the record, I just checked the manual for my Plasma (PS51E550D) and 1366x768 is listed as a supported resolution, not 1360x768. Go figure...
    I'm sure Windows lists it as 1360x768 (I can't check as I'm using a different PC at the moment), but whether it connects at 1366x768 or 1360x768, it appears you're probably right... "1366x768" isn't the problem.
  25. Member DB83
    Well, having exhausted most of the possibilities, I would now try one of the more obvious ones.

    There is a note on page 8 of the manual regarding the type of HDMI connection and the possibility of a lack of picture.

    By your own admission, you are using a very cheap cable. Is this the correct HDMI type? I would have thought that you would need HDMI-2 at the least.

    Samsung show accessory cables (not supplied), but can they be obtained? At the very least I would source a replacement cable.
  26. I'm assuming you mean HDMI 1.2 and not 2.0.

    I have an extra HDMI cable that is maybe 2 years old, but was pretty good at the time. It works fine at 1080p from my laptop, so I would assume it would be fine.

    Would you assume this would not cause a problem:
    http://www.amazon.com/Eforcity-HDMI-F-DVI-M-Adaptor-Contacts/dp/B000E8X5Z0/ref=sr_1_7?...ds=dvi+to+hdmi
  27. Member DB83
    Well, that is the type of adapter that came in the box with my ATI Radeon card. I now use it to connect the card to my fairly new (1 month old) HD monitor.

    I typed the comment about the HDMI standard quite quickly, so you are probably right there.
  28. Yeah, I had the DVI-VGA ones for all prior monitors; I just wasn't sure if one of the many DVI types was better than the rest or if this one was all-encompassing, etc. Think I'll try this one then. Never knew how necessary widescreen was till I went back to not having it, lol...