VideoHelp Forum
  1. Hello all, I have a stupid question. I have a 4K TV connected to my PC through HDMI 2.0, but the resolution on my PC is set to 1080p instead of 4K (I was advised by AMD support that I should set my PC screen and TV to the same resolution with the same refresh rate).
    Now the question is: if I play a 4K movie through a player (VLC) on my TV, what resolution will I see, 1080p or 4K?
  2. Member usually_quiet (Joined Aug 2006, United States)
    If the TV automatically upscales the picture, you will see the 1080p picture output by your computer upscaled to 2160p by the TV. If the TV does not automatically upscale the picture, you will see a 1080p rectangle in the center of the screen surrounded by black on all sides. In that case you would use the picture/picture size button on the remote to zoom the picture to fill the screen.
    Ignore list: hello_hello, tried, TechLord, Snoopy329
  3. Either I didn't get your response or I didn't ask the correct question, sorry.

    I will try to reduce this to dots. For example, my TV screen can show 8,294,400 dots (3840 × 2160). From Windows I reduce the TV resolution to 1920 x 1080,
    but I play a video that has 8,294,400 dots. Am I going to see the video at 3840 × 2160 (all 8,294,400 dots), or am I going to see it as 2,073,600 dots (just shown full screen)?
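    For reference, the raw dot counts for the two resolutions being discussed work out like this (a quick sketch in Python; the numbers are just the resolutions mentioned above):

    Code:
        # Dot (pixel) counts for the two resolutions in question.
        uhd = 3840 * 2160   # 8,294,400 dots ("4K" UHD)
        fhd = 1920 * 1080   # 2,073,600 dots (1080p)
        print(uhd, fhd, fhd / uhd)   # 8294400 2073600 0.25

    So a 1080p signal carries exactly one quarter of the dots a 2160p signal does.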
  4. Member DB83 (Joined Jul 2007, United Kingdom)
    The answer that usually_quiet gave was quite clear, but I'll try to put it a different way.


    If your computer's graphics card has a maximum resolution of 1920*1080 DOTS, there is no way, without the TV's own internal scaling, to see your 4K (3840*2160 DOTS) video as it is intended.


    If the graphics card supports 3840*2160, configure it for that.
  5. Member usually_quiet (Joined Aug 2006, United States)
    Originally Posted by Pav4o
    Either I didn't get your response or I didn't ask the correct question, sorry.

    I will try to reduce this to dots. For example, my TV screen can show 8,294,400 dots (3840 × 2160). From Windows I reduce the TV resolution to 1920 x 1080,
    but I play a video that has 8,294,400 dots. Am I going to see the video at 3840 × 2160 (all 8,294,400 dots), or am I going to see it as 2,073,600 dots (just shown full screen)?
    Your thinking about this is wrong. The TV's resolution will still be 3840 × 2160 even though you selected 1920 x 1080 as the screen resolution in Windows. Windows settings don't change the TV's resolution. Windows settings change the output resolution from the computer's graphics card.

    The TV itself will either upscale the 1920 x 1080 picture information from PC graphics to 3840 x 2160 to fill the screen or it will display a 1920 x 1080 picture surrounded by black borders. If the TV does not upscale the picture to fill the screen then you must use a button on the TV's remote control to enlarge the picture to fill the screen.
    Ignore list: hello_hello, tried, TechLord, Snoopy329
  6. Member Cornucopia (Joined Oct 2001, Deep in the Heart of Texas)
    Maybe this will make it clearer:

    If you have a 3840x2160 TV, it will show that many dots.
    However, depending on what you supply it with, they may not be UNIQUE dots.
    To get 3840x2160 unique dots of detail, you will need to supply it with a 3840x2160 image.
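    A toy way to picture "not unique dots" (a minimal NumPy sketch; plain pixel-doubling is assumed here just for illustration, real TVs interpolate more smoothly):

    Code:
        import numpy as np

        # Hypothetical 2x2 patch of a 1080p picture. Doubling it to fill a
        # 2160p panel lights 16 dots, but only 4 of them carry unique detail.
        patch = np.array([[1, 2],
                          [3, 4]])
        upscaled = patch.repeat(2, axis=0).repeat(2, axis=1)
        print(upscaled)
        # [[1 1 2 2]
        #  [1 1 2 2]
        #  [3 3 4 4]
        #  [3 3 4 4]]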

    BTW, if your graphics card truly supports HDMI 2.0, it should also support 3840x2160.

    I'm betting you need to update your drivers. Win10 1903 & 1909 have much better 3840x2160 driver support than earlier versions, and many displays that couldn't get the full res before can now do so after an upgrade.

    Scott
  7. OK, this is getting clearer. My PC and TV both support 3840x2160 (AMD RX 580). The reason I want to set 1920x1080 is that AMD support advised me that my TV and monitor should be set to the same resolution, because when I set the TV to 3840x2160 and play something full screen, the TV's refresh rate drops by itself to 30Hz after I quit, instead of staying at the 60Hz I had set. Hence the questions I am asking: when I watch a movie that has a higher resolution, am I actually experiencing it, or am I seeing it at 1080p?
    Or, if we can fix the automatic refresh rate change, that would be even better.

    However, depending on what you supply it with, they may not be UNIQUE dots.
    To get 3840x2160 unique dots of detail, you will need to supply it with a 3840x2160 image.
    Now I guess the question is more software-related: am I supplying 3840x2160 from the video file, or am I supplying 1080p because of the resolution set in Windows?
  8. Member usually_quiet (Joined Aug 2006, United States)
    Originally Posted by Pav4o
    OK, this is getting clearer. My PC and TV both support 3840x2160 (AMD RX 580). The reason I want to set 1920x1080 is that AMD support advised me that my TV and monitor should be set to the same resolution, because when I set the TV to 3840x2160 and play something full screen, the TV's refresh rate drops by itself to 30Hz after I quit, instead of staying at the 60Hz I had set. Hence the questions I am asking: when I watch a movie that has a higher resolution, am I actually experiencing it, or am I seeing it at 1080p?
    Or, if we can fix the automatic refresh rate change, that would be even better.
    Yes, if you are using both the TV and monitor at the same time in an extended desktop configuration, then you should set your graphics card to output 1920x1080. If you are using only the TV, then 3840x2160 would be preferred.

    Originally Posted by Pav4o
    However, depending on what you supply it with, they may not be UNIQUE dots.
    To get 3840x2160 unique dots of detail, you will need to supply it with a 3840x2160 image.
    Now I guess the question is more software-related: am I supplying 3840x2160 from the video file, or am I supplying 1080p because of the resolution set in Windows?
    You are supplying 1080p from the resolution set from Windows. The graphics card is down-scaling the 3840x2160 image to a 1920x1080 image, which causes a loss of 3/4 of the unique dots. The TV creates new dots to replace the ones that were lost when it enlarges the 1920x1080 image to 3840x2160 for display but the TV's up-scaling algorithm can only guess at the original picture detail that was lost.
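    To make the "loss of 3/4 of the unique dots" concrete, here is a minimal round-trip sketch (Python/NumPy; nearest-neighbour scaling is assumed purely to keep the example short, real scalers use smarter filters, but the counting argument is the same):

    Code:
        import numpy as np

        # Hypothetical stand-in for one 2160p frame of a 4K movie.
        rng = np.random.default_rng(0)
        frame_4k = rng.integers(0, 256, size=(2160, 3840), dtype=np.uint8)

        # Graphics card outputs a 1080p desktop: keep every 2nd pixel (nearest neighbour).
        frame_1080 = frame_4k[::2, ::2]

        # TV upscales 1080p back to 2160p: each pixel is repeated into a 2x2 block.
        frame_tv = frame_1080.repeat(2, axis=0).repeat(2, axis=1)

        survived = (frame_tv == frame_4k).mean()
        print(f"Dots identical to the original after the round trip: {survived:.1%}")
        # ~25% - only 1 in 4 dots is guaranteed to survive; the rest are the TV's guesses.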
    Last edited by usually_quiet; 18th Mar 2020 at 09:19.
    Ignore list: hello_hello, tried, TechLord, Snoopy329
  9. This makes me sad, but it answers my questions. Thank you.
  10. Originally Posted by Pav4o
    This makes me sad, but it answers my questions. Thank you.
    On the other hand, the fact that you came here to ask implies you didn't notice a significant difference. So why does it matter?
  11. Well, it is like having a car with 6 gears and only being able to use 5th. I know the difference is not visible to the naked eye, but when you know it is there ...
  12. You can always use dual displays without mirroring. Then each display can have its own resolution. Any particular program will run on one or the other, not both. You can drag programs to the one you want.
  13. Hm, I never said they are mirrored, I am extending them... but in this mode the refresh rate downgrades by itself to 30Hz, and this is what usually_quiet said:
    Yes, if you are using both the TV and monitor at the same time in an extended desktop configuration, then you should set your graphics card to output 1920x1080. If you are using only the TV, then 3840x2160 would be preferred.
  14. Member usually_quiet (Joined Aug 2006, United States)
    Originally Posted by Pav4o
    Hm, I never said they are mirrored, I am extending them... but in this mode the refresh rate downgrades by itself to 30Hz, and this is what usually_quiet said:
    Yes, if you are using both the TV and monitor at the same time in an extended desktop configuration, then you should set your graphics card to output 1920x1080. If you are using only the TV, then 3840x2160 would be preferred.
    jagabo is probably correct and I probably misremembered which multi-monitor setting requires the same resolution for both monitors. Unfortunately I don't have a convenient way to test multi-monitor settings right now.
    Ignore list: hello_hello, tried, TechLord, Snoopy329
  15. So, in general, from my knowledge/experience I should be able to extend the displays and set each to a different resolution with a different refresh rate, and the fact that AMD support is advising that I can't is, to me, crap. And I am still in communication with them.
  16. Member Cornucopia (Joined Oct 2001, Deep in the Heart of Texas)
    Is your refresh rate throttling down to 30Hz if you ONLY use the TV (with the desktop screen disabled)?

    Scott
  17. When I unplug the monitor, the TV keeps 60Hz (I played a game and watched a movie); the moment I plug the monitor back in, the TV drops to 30Hz.