I have been Googling but I still find myself confused on the matter. What is true 4K resolution, and is it different from Ultra HD?
Why do I care?
-First off, I need to know which resolution to set on the PC so as to get 1:1 output on my 4K TV.
-Secondly, I got an official Samsung USB-C to HDMI adapter for use with my phone, yet my phone outputs 4K at 25Hz on the TV, and I wonder whether this could be due to the adapter supporting 60Hz only up to 3840; since 4096 is forced, it drops to 25...
My TV is the Sony 55XE7096
https://www.sony.ee/electronics/support/televisions-projectors-lcd-tvs/kd-55xe7096/specifications
Display resolution (H x V, pixels)
3840x2160
Video signal support
4096x2160p (24, 50, 60Hz), 3840x2160p (24, 25, 30, 50, 60Hz), 1080p (30, 50, 60Hz), 1080/24p, 1080i (50, 60Hz), 720p (30, 50, 60Hz), 720/24p, 576p, 576i, 480p, 480i
Picture modes
Vivid, Standard, Custom, Cinema, Sports, Photo-Vivid, Photo-Standard, Photo-Custom, Game, Graphics, HDR Vivid, HDR Video
HEVC Support (Broadcast)
Yes (Up to 3840x2160/60p 10bit)
-
PC output is normally set to the TV's native resolution at 59Hz, 60Hz, or 50Hz, depending on the preferred refresh rate of the TV when it is used as a PC monitor. 4K/UHD TVs, including yours, have a native resolution of 3840 × 2160. Your TV's preferred refresh rate is apparently 60Hz.
3840 × 2160 (UHD 4K) is also the 4K standard for consumer video and broadcast TV. The 4K standard for digital cinema is 4096 × 2160 (DCI 4K).
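For reference, the two "4K" standards share a height of 2160 and differ only in width; a quick Python sketch of the pixel counts and aspect ratios:

```python
# The two "4K" standards: same height, different width.
uhd = (3840, 2160)   # UHD 4K: consumer TV / broadcast
dci = (4096, 2160)   # DCI 4K: digital cinema

for name, (w, h) in (("UHD", uhd), ("DCI", dci)):
    print(f"{name}: {w}x{h} = {w * h:,} pixels, aspect ratio {w / h:.3f}:1")
# UHD: 3840x2160 = 8,294,400 pixels, aspect ratio 1.778:1
# DCI: 4096x2160 = 8,847,360 pixels, aspect ratio 1.896:1
```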
The phone outputs 4096 × 2160 @ 25Hz because the TV, adapter, and phone all support 25Hz at that resolution, although it isn't one of the TV's preferred refresh rates for that resolution. 25Hz may be a limitation imposed by the phone or the adapter. You haven't provided any information about the phone.
Ignore list: hello_hello, tried, TechLord, Snoopy329 -
It's a Galaxy Note 8, and the official Samsung adapter, which advertises 4K@60. The weird thing is that the TV shows 4096 when connecting the phone, and I also get 4096 on the PC as an available selection.
I get that the TV can accept a 4096 signal and convert it to 3840, but if 3840 is the native resolution, why does it display 4096 as the current input? -
ALL consumer "4k" TVs are 3840x2160, regardless of what notifications you may be getting from it or any attached devices. Those notifications were probably meant to cater to the masses who don't know the difference. Aka, they fudge the truth. 4k is borrowed from the DCI usage. If it's on a "TV", it's UHD.
Now, computer monitors can be capable of that pixel count. And if they ever mass distribute 8k TVs, those could be capable of showing full 4k material, but you won't be getting that from commercial sources as they don't want you to have their original quality. It would come from cameras.
As I think about it, it could be that those notifications are referring to the agreed-upon handshake resolution of the hdmi transfer, as that might be the closest common denominator of both devices.
Scott -
How do you mean? If you look at the top left, you'll see what I mean.
[Attachment 47647] -
Because that's what it is receiving. What does it say when you set the computer to some other resolution, like 1280x720? Would you expect it to say 4096x2160 or 3840x2160? Of course not.
Make yourself some test videos to verify when you are getting pixel-for-pixel display. -
-
Stop using the term "broadcast" willy-nilly and universally. Your device doesn't broadcast anything, neither OTA tv spectrum nor Wifi multicast, etc.
Your device transmits one-to-one with a corresponding receiver in coordinated handshake/negotiation, using the accepted protocol standards that they both mutually support (not counting intermediate active adapters).
So it depends on the device and the protocol and the standard.
Not counting standard commercial OTA/Cable broadcasting, the TV basically only supports HDMI wired reception or it uses an internal general media player to receive network streaming or play local usb content.
It's possible that the latter 2 are in play, and it's a wild west as far as capability & compatibility there, though you may have hit upon a good combination.
But much more likely, this is simply HDMI negotiation where it goes something like this:
Tv: "I use 3840x2160 natively, but support scaling of 4096, 1080,..."
Phone: "I can't give you 3840, but I can give you 4096 and you scale down/crop."
Tv: "I need 24, 50, or 60Hz fps, but can interpolate."
Phone: "ok, I can give you 25fps, and you frame double it to 50."
Tv: "agreed?"
Phone: "agreed!"
Your phone doesn't output copy protected material that way, but if what you are showing is homemade footage, it doesn't matter.
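The exchange above is a caricature of the real EDID/HDMI mode negotiation, but the selection logic can be sketched in Python. The mode list is taken from the TV's spec sheet; the source's "pixel budget" and the preference order (resolution first, then refresh rate) are illustrative assumptions, not real EDID behavior:

```python
# Modes the sink (TV) advertises: resolution -> supported refresh rates,
# per the TV's published spec sheet.
tv_modes = {
    (3840, 2160): [24, 25, 30, 50, 60],
    (4096, 2160): [24, 50, 60],
    (1920, 1080): [24, 30, 50, 60],
}

def negotiate(sink_modes, max_pixels_per_sec):
    """Return ((w, h), hz), preferring resolution first, then refresh rate,
    subject to the source's throughput budget in pixels per second."""
    best = None
    for (w, h), rates in sink_modes.items():
        for hz in rates:
            if w * h * hz <= max_pixels_per_sec:
                cand = (w * h, hz, (w, h))   # tuples compare lexicographically
                if best is None or cand > best:
                    best = cand
    return (best[2], best[1]) if best else None

# A source that can only sustain about 4096x2160x30 pixels/s ends up
# choosing DCI 4K at a reduced refresh rate rather than UHD at a higher one:
print(negotiate(tv_modes, 4096 * 2160 * 30))  # ((4096, 2160), 24)
```

This is the kind of trade-off that could explain a 4096-wide signal arriving at an oddly low frame rate.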
Scott -
Well, I wonder whether the phone can give 3840 but just gives 4096 because it's the max the TV can receive. So I would imagine something like this:
-Phone: what's the max res you can receive?
-TV: it's 4096x2160, although I will convert it to 3840x2160.
-Phone: okay, I will give you the 4096 and run out of bandwidth, so you'll get a choppy 25fps. Agreed?
-Agreed -
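The bandwidth guess in that imagined handshake can be sanity-checked with simple arithmetic. A rough Python sketch, assuming 8-bit RGB (24 bits per pixel) and ignoring blanking intervals, so the real link requirement is somewhat higher (the ~8.16 and ~14.4 Gbit/s effective video rates are for HDMI 1.4 and HDMI 2.0 respectively):

```python
# Uncompressed video data rate for a given mode, in Gbit/s.
def gbps(w, h, hz, bits_per_pixel=24):
    return w * h * hz * bits_per_pixel / 1e9

# HDMI 1.4 carries roughly 8.16 Gbit/s of video data; HDMI 2.0 about 14.4.
print(f"3840x2160 @ 60Hz: {gbps(3840, 2160, 60):.1f} Gbit/s")  # ~11.9
print(f"4096x2160 @ 60Hz: {gbps(4096, 2160, 60):.1f} Gbit/s")  # ~12.7
print(f"4096x2160 @ 25Hz: {gbps(4096, 2160, 25):.1f} Gbit/s")  # ~5.3
```

So either 4K flavor at 60Hz needs an HDMI-2.0-class link end to end; a 1.4-class bottleneck anywhere in the chain (adapter, cable, or port) would force a much lower refresh rate, consistent with the 25Hz you're seeing.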
Thanks for that.
The Galaxy Note 8's specs say "With cable: supports DisplayPort over USB type-C. Supports video out when connecting via DisplayPort cable (DisplayPort 4K 60 fps)". (I think the cable Samsung is discussing is a USB-C DisplayPort Alternate Mode to DisplayPort cable, like this https://www.amazon.com/Plugable-USB-DisplayPort-Adapter-Resolutions/dp/B01EXKDRAC, and not an active adapter like yours.)
I looked at it to be certain, and sure enough, Samsung's specs for its USB-C DisplayPort Alternate Mode to HDMI active adapter say it "supports signals up to 4K at a 60Hz refresh rate".
Unfortunately, Samsung never defines precisely what is meant by 4K for your phone's USB-C DisplayPort out or your adapter's HDMI out, so it remains unclear whether they mean 3840x2160 UHD @ 60Hz or 4096x2160 DCI 4K @ 60Hz. I'm guessing that Samsung means UHD, and that the phone and the adapter are trying to deliver 4096x2160, the maximum resolution the TV supports, but are unable to supply that at a refresh rate greater than 25Hz. -
If only I had access to a monitor that supports up to 3840, so that this is forced as the maximum rather than 4096, so as to test this theory and put everything to rest.
-
There's nothing to put to rest. The source and the sink negotiate a resolution that's acceptable to each. Or some sources can be configured to output a specific format. The TV's display indicates what was negotiated or the fixed format the source put out (if it supported that format). End of story.
Of course, just because the TV's native resolution is 3840x2160 and the source is putting out 3840x2160 doesn't mean you're getting pixel-for-pixel mapping. Your TV may be set to simulate overscan or otherwise scale (zoom in, zoom out, etc.) the incoming video. Even sources can be set to scale the video before sending it to the sink. That's why you need to use test videos to verify pixel-for-pixel display. -
-
You just may have bought yourself a TV that might not provide 1:1. Most displays have something that hints at one of its screen modes being 1:1 ("JustScan", "Dot-by-Dot", "1:1", "Exact Scan",...). Don't see any listing like that for yours, so just sayin'.
Just depends - you probably need to hook up a PC with a 4K-capable video card & HDMI 2.0, and see if you can force the rez to true DCI 4K (4096) at whatever fps. Then make sure you're hooked up via the "HDMI (PC)" port on the TV and cycle between "Normal", "Full 1" and "Full 2", as per pg 15 of your manual. See if text (from a Word doc, PDF, HTML, desktop menus, etc) in one of those modes at 4K is razor sharp (or get a hi-rez PNG reference test chart with 1-pixel lines, show it fullscreen and test). If so, that is likely the 1:1 mode. If there's always some blurriness or moire, your TV likely cannot do 1:1.
You also likely may not be able to do 1:1 on the OTHER HDMI ports, by design.
Yes, you will need a verified 4K-output-capable machine to test it properly. If you cannot get hold of one, you'll always just be guessing.
Scott -
Use test patterns where anything other than pixel-for-pixel display will be obvious. For example, alternating single pixel thick vertical black and white lines across the entire frame. If there's any kind of scaling there will be distortions. With pixel-for-pixel display you should be able to see each and every line when you get up close.
Attached is a 3840x2160 mp4 file I just made, with horizontal and vertical lines, and reticles at the edges to detect cropping. -
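If you want to generate a pattern like this yourself, here's a minimal stdlib-only Python sketch that writes alternating single-pixel black/white vertical lines at the TV's native resolution as a binary PPM file (the filename is arbitrary; convert it to PNG or a short video with ffmpeg or any image tool before playing it on the TV):

```python
# Writes alternating 1-pixel black/white vertical lines at 3840x2160 as a
# binary PPM. Any scaling in the chain smears the lines into gray mush or
# moire; pixel-for-pixel display keeps every line distinct up close.
W, H = 3840, 2160
row = bytearray()
for x in range(W):
    v = 255 if x % 2 else 0     # even columns black, odd columns white
    row += bytes((v, v, v))     # R, G, B
with open("lines_3840x2160.ppm", "wb") as f:
    f.write(b"P6\n%d %d\n255\n" % (W, H))   # PPM header
    f.write(bytes(row) * H)                 # same row repeated for each line
```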
The TV at least does support all of those advertised resolutions. As you can see, I made sure with a high-end graphics card.
[Attachment 47731] -
"The TV at least does support all of those advertised resolutions. As you can see, I made sure with a high-end graphics card."
I will check the test pattern when I am back home from holidays.
-
Very likely they also do (in at least these few instances, but possibly more) DOWNSCALING as well. Since downscaling is known to retain perceivable sharpness, it isn't surprising that it would still look good.
Scott -
"Very likely they also do (in at least these few instances, but possibly more) DOWNSCALING as well. Since downscaling is known to retain perceivable sharpness, it isn't surprising that it would still look good."
I hope TVs will not follow the road of point-and-shoot cameras, with their interpolated vs. real resolution thing.
-