Good news for the early adopters I guess, but in my opinion it would be better if there was actual content for the monitors. I predict that this will be an epic fail - there is no demand at all for this right now, and consumers, some of whom here have mentioned that they still don't have Blu-ray players, are not interested in yet another video format to pay extra money for, whether for TV content or discs.
Since this is a monitor and not a TV, I don't think the lack of TV content will slow down adoption. This will appeal to gamers and those that stream content to their PCs; Netflix is already experimenting with 4K content:
And I know of one Chinese broadcast company that has already started streaming all their content in 4K+HEVC.
Plus there's this reality: there needs to be a "killer app" introduced that reignites demand in high-end hardware and thus stimulates growth in the tech sectors.
4K+HEVC is that killer app combo that could reignite a new "arms race" and get the public spending again, like 1080p+AVC did years ago.
You can check out some 4K content here:
Even downscaled from 4K to 1080p by your player, you will see how much better it looks than a straight 1080p version.
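There's a reason downscaled 4K can beat a native 1080p version: shrinking the frame averages roughly four source pixels into each output pixel, which cancels random grain and compression noise while real detail survives. Here's a rough Python/Pillow sketch of that effect (my own illustration, not anything from the thread; it uses a synthetic noise field as a stand-in for film grain):

```python
from PIL import Image, ImageStat

# Simulate a grainy "4K" frame: mid-grey plus gaussian noise (sigma=25).
noisy_4k = Image.effect_noise((3840, 2160), 25)

# Downscale to 1080p with Lanczos: each output pixel averages several
# source pixels, so uncorrelated grain partially cancels out.
downscaled = noisy_4k.resize((1920, 1080), Image.LANCZOS)

# Measure the remaining noise as per-pixel standard deviation.
noise_before = ImageStat.Stat(noisy_4k).stddev[0]
noise_after = ImageStat.Stat(downscaled).stddev[0]
print(round(noise_before, 1), round(noise_after, 1))
```

On a flat noise field like this the measured grain roughly halves after a 2x downscale; on real footage the edges and texture you actually want are mostly preserved, which is why the result looks cleaner than a straight 1080p encode.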
Last edited by deadrats; 4th Dec 2013 at 09:28.
How will the average movie release look in 4K? Given that "4K mastering" doesn't do much for the quality that I can see, I'm somewhat sceptical, especially as most Blu-ray video doesn't have 1080p worth of picture detail as it is. 1080p worth of noise maybe.....
So what will 4K give us? Movies containing 4K worth of noise but still only around 720p of picture detail?
But on my setup? There is a definite benefit to 1080p over 720p, and the 4K content I have seen, even downscaled, kicks ass.
I open the scripts in MPC-HC and run the players maximised on the TV in order to compare them. Well, often I compare more than two scripts if I'm considering using noise filtering etc.
99% of the time, in order to be sure which script is 1080p and which is 720p when they're both running full screen on the TV, I need to stop them on identical frames (and I'm viewing the video a couple of feet from a 51" screen). When I do, if I can see what appears to be a loss of fine detail, much of the time it's because I use a soft resizer with MPC-HC. If I change to a bicubic resizer, quite often that fixes the apparent loss of fine detail, but mostly it's nothing I'd see while the video is actually running, and back at "normal" viewing distance.....
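The soft-versus-bicubic difference described above can be shown numerically. This is a hypothetical Python/Pillow sketch of my own (the thread itself is about MPC-HC's internal resizers): a softer kernel like bilinear averages neighbouring pixels more heavily when upscaling, which pulls contrast out of fine detail, while bicubic keeps transitions steeper.

```python
from PIL import Image, ImageStat

# Build a tiny high-contrast test pattern (8x8 checkerboard), a crude
# stand-in for fine picture detail.
src = Image.new("L", (8, 8))
src.putdata([255 if (x + y) % 2 else 0 for y in range(8) for x in range(8)])

# Upscale 32x with a soft (bilinear) and a sharper (bicubic) resizer,
# roughly mimicking swapping resizers in the player.
soft = src.resize((256, 256), Image.BILINEAR)
sharp = src.resize((256, 256), Image.BICUBIC)

# Higher standard deviation here means more retained edge contrast;
# the soft resizer smears the checkerboard toward mid-grey.
soft_std = ImageStat.Stat(soft).stddev[0]
sharp_std = ImageStat.Stat(sharp).stddev[0]
print(round(soft_std, 1), round(sharp_std, 1))
```

This matches the viewing experience: the "loss of fine detail" can come from the playback resizer rather than the encode, which is why switching MPC-HC to a bicubic resizer often fixes it.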
If I had to take a guess, I'd guess I've gone with 720p around 80% of the time. Most of the rest would be 1080p, but I've compromised and gone with 900p a couple of times, and at least once I went all the way down to 540p (old black and white movie).
What player do you use for watching encodes? I've sometimes wondered what sort of resizing a player/TV would use for upscaling and how sharp it might be, but as I use my PC for playing video I've never gotten around to comparing them (although I'm certain the Sony Blu-ray player in this house has a more natural-looking picture than the Samsung player). It'd be hard to do reliably anyway as the TV takes too long to switch between HDMI inputs.
Honestly.... if most Blu-ray video has 1080p worth of actual picture detail rather than 1080p worth of noise, I've had a really bad run.
Most of you have skipped over the fact that Apple and Monoprice have been selling monitors that use LCD panels with the same or similar resolution, though those are larger than the one monitor I have seen. In my experience, Dell is only good with computers and the associated components that they don't do significant engineering on.