VideoHelp Forum
  1. Member brassplyer's Avatar
    Join Date
    Apr 2008
    Location
    United States
    Even 4K is nowhere near as clear as looking around in the real world. Pretending cost is no object, do you think it would be possible to create video that's indistinguishable from looking out a window?
  2. Any resolution is essentially "reality-level" if you are far enough from the display. So the question doesn't make sense without indicating the distance from which you'll be watching.

    The other issue is that the contents of the display don't change as you move around the room. So it's obviously not the same as looking out a window.
  3. Member brassplyer's Avatar
    Join Date
    Apr 2008
    Location
    United States
    Originally Posted by jagabo View Post
    Any resolution is essentially "reality-level" if you are far enough from the display. So the question doesn't make sense without indicating the distance from which you'll be watching.

    The other issue is that the contents of the display don't change as you move around the room. So it's obviously not the same as looking out a window.
    Assume everything scaled identically, viewing from one vantage point.
  4. Member
    Join Date
    Jul 2007
    Location
    United States
    Depth of field and color are additional factors in "reality level" video.

    The moment you shift your focus away from the foreground, it becomes apparent that you can't properly focus on distant objects, and even Rec. 2020 doesn't cover the full range of visible colors.
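    To put a rough number on the gamut point, the triangles spanned by the Rec. 709 and Rec. 2020 primaries can be compared on the CIE 1931 xy diagram. A minimal sketch in Python (the primary coordinates are the published ones; comparing triangle areas in xy is only a crude illustration, not a perceptual coverage figure):

    Code:
    # Compare the chromaticity triangles of Rec. 709 and Rec. 2020 (CIE 1931 xy).
    # Either triangle is still just a triangle inside the horseshoe-shaped
    # spectral locus, so neither covers everything the eye can see.
    def triangle_area(points):
        (x1, y1), (x2, y2), (x3, y3) = points
        return abs(x1*(y2 - y3) + x2*(y3 - y1) + x3*(y1 - y2)) / 2

    rec709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
    rec2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

    a709, a2020 = triangle_area(rec709), triangle_area(rec2020)
    print(f"Rec. 709 triangle area:  {a709:.3f}")
    print(f"Rec. 2020 triangle area: {a2020:.3f} (~{a2020 / a709:.1f}x larger)")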
  5. Originally Posted by brassplyer View Post
    Originally Posted by jagabo View Post
    Any resolution is essentially "reality-level" if you are far enough from the display. So the question doesn't make sense without indicating the distance from which you'll be watching.

    The other issue is that the contents of the display don't change as you move around the room. So it's obviously not the same as looking out a window.
    Assume everything scaled identically, viewing from one vantage point.
    Again, it's a function of the size of the screen and the viewing distance.
  6. Member
    Join Date
    Aug 2010
    Location
    San Francisco, California
    You would need a Giga-HD volumetric display.
  7. Member Cornucopia's Avatar
    Join Date
    Oct 2001
    Location
    Deep in the Heart of Texas
    You haven't even taken into consideration dynamic range (HDR and beyond), frame rate (80+ fps), and stereoscopic disparities, in addition to the things already mentioned.

    Scott
  8. Renegade gll99's Avatar
    Join Date
    May 2002
    Location
    Canadian Tundra
    The frequent descriptions of windows or ship-style viewports on UFOs have made me question the validity of such sightings. Wouldn't an advanced species capable of interstellar flight have removed this need by now and replaced it with real-world internal projections of the outside world?

    It seems that advanced flight simulators are getting close to what you are asking, but the viewer is in a fixed position and the scenes are generally fairly distant. Similarly, home-use VR is another area that is bound to expand, grow and improve to the point of creating a fully true-to-life reality, but right now VR at least keeps your eyes in a fixed position relative to the view screen through the use of goggles. These may not be quite real life yet, but it's just a matter of time.

    Watching a fixed-location, limited-perspective scene in 4K or 8K 3D (no glasses), such as a boxed-in home aquarium, might look real enough to someone if properly staged. It's believable that you could fool someone now with a fake window on a wall looking out at a distant field, lake and mountain scene, using a slightly concave (shield-shaped) screen mounted perhaps 2-3 feet deep so a person couldn't get close enough to change their viewing position too drastically. It would certainly depend on your requirements, and since real 3D perspective is missing in all directions, a smaller screen that limits the viewer's left-right-up-down movement would probably work better than a larger one.

    As far as general home use goes, it will probably require holographic or some other shadowbox-style layered display technology to fully eliminate depth and close-up perspective issues.
    There's not much to do but then I can't do much anyway.
  9. Member
    Join Date
    Jul 2007
    Location
    United States
    This leads to an interesting experiment that has been conducted to some extent in the field, with people being exposed to photos and videos for the first time. Would someone not exposed to existing UHD technology be able to distinguish a displayed image from an actual window scene?

    As we experience higher-quality (i.e. higher-resolution, more color-accurate, artifact-free) video, our eyes become more attuned to details that were previously overlooked. I remember my first experience with an ED-Beta demo tape on my XBR-Pro and being shocked at a "realism" that pales compared to what we have today. The shock and wonder lessened each time as I moved on to DVD and my first HDTV, then Blu-ray, and someday UHD and beyond.

    It's said that 8K or 16K is the limit of the human eye's ability to distinguish resolution, but I'm sure a properly trained eye and mind can always distinguish an artificial display from reality.
  10. Member Bernix's Avatar
    Join Date
    Apr 2016
    Location
    Europe
    It depends on lots of factors, most of which were mentioned earlier. Google turns up this, where the calculation assumes a 16:9 field of view (576/324 = 1.777...):
    https://www.infocuriosity.com/what-is-the-resolution-of-human-eye-how-many-megapixels/
    But I think they made a terrible error in calculating the megapixels. It doesn't make much sense, at least to me...
    It also mentions that 4K is good enough for the majority.
    So it is some sort of advantage not to have perfect eyes and perfect ears when it comes to digital audio/video.
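    For what it's worth, the often-quoted "576 megapixel" figure for the eye comes from a back-of-the-envelope calculation along these lines (the 120 degree field and 0.3 arc-minute detail are the usual assumptions behind that figure, not necessarily what the linked article uses):

    Code:
    # Common back-of-the-envelope estimate of the eye's "megapixel" count.
    field_deg = 120          # assumed 120 x 120 degree field of view
    detail_arcmin = 0.3      # assumed finest resolvable detail

    pixels_per_side = field_deg * 60 / detail_arcmin      # 24,000
    megapixels = pixels_per_side ** 2 / 1e6
    print(f"{pixels_per_side:.0f} x {pixels_per_side:.0f} -> ~{megapixels:.0f} megapixels")  # ~576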

    Bernix
    Last edited by Bernix; 12th Jan 2018 at 04:16. Reason: But I think ...
  11. Even today, VR is capable of generating a fully immersive visual environment - I would say that nowadays resolution is the least challenging problem on the way to "reality"...
  12. Member
    Join Date
    Jul 2007
    Location
    United States
    More thoughts about the issue of depth of field.

    What is the highest quality visual medium available today that doesn't require the limited field of view of VR? Probably IMAX / Omnimax? I'm sure the technology has improved over the years, but 25 years ago I used to love going to the Omnimax in Las Vegas, where they would show virtual roller coaster, race car and flyover footage. Very realistic (with the addition of motion-timed chair movement), but if you shifted focus away from the foreground image, the realism faded.
  13. Member
    Join Date
    Aug 2010
    Location
    San Francisco, California
    The way to really do this is to skip the "display" and pipe sensory input directly to the visual cortex of the brain.
  14. You can pretty easily determine for yourself how much resolution is needed to exceed your visual acuity. View an image with single-pixel-thick vertical and horizontal black and white lines pixel-for-pixel on your display. Move back until you can no longer tell there are individual black and white lines -- just grey. Here's a 1920x1080 sample image you can use.

    [Attached image: acuitiy.png - 1920x1080 acuity test pattern]

    The top left block is alternating vertical lines. At the bottom right are alternating horizontal lines. In the center is a checkerboard pattern. At the top right and bottom left are 50 percent greyscale patches for comparison. Make sure your monitor/TV is displaying the image pixel-for-pixel -- get very close and verify you can see the individual lines and pixels with no distortions.
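    If anyone would rather regenerate a similar pattern than download the attachment, here's a minimal sketch (assuming NumPy and Pillow are installed; the block positions and sizes are my own approximation of the attachment, not an exact copy):

    Code:
    # Build a 1920x1080 acuity test pattern: single-pixel vertical lines top-left,
    # single-pixel horizontal lines bottom-right, a checkerboard in the centre,
    # and 50% grey everywhere else (including top-right and bottom-left).
    import numpy as np
    from PIL import Image

    W, H = 1920, 1080
    img = np.full((H, W), 128, dtype=np.uint8)       # 50% grey background

    x = np.arange(W)
    y = np.arange(H)

    # Top-left quadrant: alternating vertical black/white lines
    img[:H//2, :W//2] = np.where(x[:W//2] % 2 == 0, 0, 255)

    # Bottom-right quadrant: alternating horizontal black/white lines
    img[H//2:, W//2:] = np.where(y[H//2:] % 2 == 0, 0, 255)[:, None]

    # Central block: single-pixel checkerboard
    cy, cx, s = H//2, W//2, 270
    yy, xx = np.meshgrid(np.arange(cy-s, cy+s), np.arange(cx-s, cx+s), indexing="ij")
    img[cy-s:cy+s, cx-s:cx+s] = np.where((yy + xx) % 2 == 0, 0, 255)

    Image.fromarray(img, mode="L").save("acuity_test.png")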

    It's hard to say exactly when the lines disappear, but for me, on a 24 inch (diagonal) 1080p monitor I have to be about 6 feet away. With a 1080p 46 inch TV I have to be about 12 feet away. So for a 1080p display you need to be about 3x the diagonal measure of the display.

    But video is never as sharp as that image. If it was you would see severe moire artifacts and buzzing edges when things were in motion. To reduce those problems you need to reduce the sharpness of the content by about 30 percent (the Kell factor). So you would have to sit about 30 percent farther away from the display to get the same effect -- say, 4x the diagonal measure of the display.

    A UHD (3840x2160) display has twice as many pixels horizontally and vertically. So you can be half the distance from the display before starting to see individual pixels, i.e. 2x the diagonal measure after accounting for the Kell factor. So a 46 inch diagonal 16:9 UHD "window" would have to be viewed from about 8 feet away. If you want to view from 4 feet away you need twice as many pixels per axis -- a 7680x4320 display.

    But keep in mind, the alternating full black and full white lines are something of a worst case scenario. Human eyes have much lower resolution for color and for lower contrasts. So in a real world situation you don't need this much resolution.
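    A quick sketch of that arithmetic, assuming a 16:9 panel, roughly 1 arc-minute acuity (one black/white line pair per arc-minute, i.e. ~60 cycles per degree) and the ~30% Kell allowance described above:

    Code:
    # Estimate the distance beyond which individual pixels blend together.
    import math

    def min_viewing_distance_ft(diag_in, h_pixels, kell_margin=1.3):
        width_in = diag_in * 16 / math.hypot(16, 9)   # horizontal size of a 16:9 screen
        pixel_in = width_in / h_pixels                # size of one pixel
        one_arcmin = math.radians(1 / 60)             # assumed acuity: 1 arc-minute per line pair
        distance_in = (2 * pixel_in) / one_arcmin     # line pair just at the acuity limit
        return distance_in * kell_margin / 12         # ~30% farther for the Kell factor

    for diag, h in [(24, 1920), (46, 1920), (46, 3840), (46, 7680)]:
        print(f'{diag} inch, {h} pixels wide: ~{min_viewing_distance_ft(diag, h):.1f} ft')

    Run as-is, this lands close to the figures above: roughly 8 ft for a 24 inch 1080p monitor, ~15.5 ft for a 46 inch 1080p set, ~8 ft for a 46 inch UHD set, and ~4 ft for a 46 inch 7680x4320 display.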
    Last edited by jagabo; 12th Jan 2018 at 12:15.
  15. Member brassplyer's Avatar
    Join Date
    Apr 2008
    Location
    United States
    Originally Posted by JVRaines View Post
    The way to really do this is to skip the "display" and pipe sensory input directly to the visual cortex of the brain.
    I.e. an eyeball.
  16. jagabo
    It even matters how those patterns are organized. Even though there is the same number of black and white pixels, from a distance I see the block with horizontal lines as more whitish, whereas the checkerboard and the block with vertical lines both look more greyish.


    Anyway, I recently checked out the latest computer games and I was stunned. Games work with polygons - they simplify surfaces into many triangles, and it depends only on computing power how small those triangles can be, possibly all the way down to the screen resolution, where a triangle becomes a single point.
    I saw Horizon Zero Dawn and was simply floored by what I could see (real-time rendering on a reasonable processor, in that case a PS4): plants, mountains, deserts, sunsets and sunrises with real shadows. The player can move anywhere (within the zone) and the camera can be moved anywhere as well. And we have only been in the computing era for what, 30 years? Cannot imagine where it is heading.
  17. Originally Posted by _Al_ View Post
    Cannot imagine where it is heading.
    I hope a "Holodeck" , Star Trek style
  18. Member Bernix's Avatar
    Join Date
    Apr 2016
    Location
    Europe
    Hi _Al_,
    30 years seems to me more accurate for the computer gaming era. Space Invaders should be about that old, I presume.

    Bernix
  19. Member
    Join Date
    Jul 2007
    Location
    United States
    Originally Posted by Bernix View Post
    Hi _Al_,
    30 years seems to me more accurate for the computer gaming era. Space Invaders should be about that old, I presume.

    Bernix
    We're getting off-topic, but Space Invaders is 40 years old, introduced in 1978. Home computer gaming with graphics started around that time, so ~40 years. Mainframe computer gaming goes back to the early 1950s.

    1972's Pong is generally accepted as the beginning of the arcade / home gaming era, though Computer Space was released commercially in 1971.
    Last edited by lingyi; 12th Jan 2018 at 16:41. Reason: Additional info. pandy posted about early computer gaming while I was editing.
  20. The distribution of "pixels" in the human fovea is not a regular "sampling" grid (like the grids commonly seen on computer graphics devices) but effectively random, and the distribution is non-linear (the density of "pixels" is highest toward the foveal center). On top of this, the human vision system combines two main types of vision - one sharp and detail-oriented but with a relatively low "refresh" rate, and a second, peripheral vision, with much lower resolution but a higher "refresh" rate. Only by addressing both types of vision is reality level reachable. This leads to (a rough sketch of the numbers follows this list):
    • significantly higher resolution (at least twice as high - the Nyquist criterion) than the maximum "pixel resolution" of the fovea,
    • a very high refresh rate - which is why ultra-high-frame-rate video (300-600 fps, with some studies mentioning even over 1000 fps) is the next step in future video development.
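    A rough sketch of the resolution half of that argument (the ~60 cycles per degree foveal limit is a commonly quoted figure, and the window sizes are arbitrary examples):

    Code:
    # Pixels-per-degree needed if the display has to out-resolve the fovea
    # with at least 2 samples per cycle (Nyquist).
    FOVEAL_ACUITY_CPD = 60                 # assumed finest resolvable detail, cycles/degree
    NYQUIST_FACTOR = 2                     # minimum samples per cycle

    ppd = FOVEAL_ACUITY_CPD * NYQUIST_FACTOR   # ~120 pixels per degree

    for h_fov, v_fov in [(40, 22.5), (90, 50.6)]:   # hypothetical "window" sizes in degrees
        w, h = round(h_fov * ppd), round(v_fov * ppd)
        print(f"{h_fov} x {v_fov} degrees -> ~{w} x {h} pixels ({w * h / 1e6:.0f} MP)")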

    Btw, computer games can be dated back to the beginning of the '50s...
  21. Member
    Join Date
    Jul 2007
    Location
    United States
    Originally Posted by brassplyer View Post
    Even 4K is nowhere near as clear as looking around in the real world. Pretending cost no object, do you think it would be possible to create video that's indistinguishable from looking out a window?
    To get back to this original question, based on the answers given, the bottom line is it depends on the viewer.

    For the average viewer, a current UHD HDR display would probably be deemed "looking like the real world".

    For many members here with their knowledge and trained eyes, no display technology would be able to match "real world" imagery.
  22. Originally Posted by _Al_ View Post
    jagabo
    It even matters how those patterns are organized. Even though there is the same number of black and white pixels, from a distance I see the block with horizontal lines as more whitish, whereas the checkerboard and the block with vertical lines both look more greyish.
    Some displays have problems like that. It depends on how the RGB triplets are organized on the panel. Old CRT displays often couldn't switch the electron beam on/off fast enough to support the highest resolutions and refresh rates -- so the vertical lines wouldn't show up well. Gamma issues will also cause the patches to have different levels of brightness.

    But for your ability to resolve detail, the different gamma doesn't matter; what matters is when you can no longer distinguish the lines/pixels. That's why the pattern has both vertical and horizontal stripes, and a checkerboard.
    Last edited by jagabo; 12th Jan 2018 at 21:57.
  23. Member
    Join Date
    Aug 2010
    Location
    San Francisco, California
    Originally Posted by brassplyer View Post
    Originally Posted by JVRaines View Post
    The way to really do this is to skip the "display" and pipe sensory input directly to the visual cortex of the brain.
    I.e. an eyeball.
    Except an eyeball can only lens the light that reaches it. We could generate nerve signals synthetically and send them to the brain without the light ever actually existing.
  24. Originally Posted by jagabo View Post
    Old CRT displays often couldn't switch the electron beam on/off fast enough to support the highest resolutions and refresh rates -- so the vertical lines wouldn't show up well.
    Trust me, the CRT's main problem was not beam-switching frequency - resolution in color displays was mostly limited by the phosphor and the mask, which is a mechanical rather than an electronic limitation. In fact, CRTs are still used today in human vision research because they have fewer of the limitations inherent in digital display devices.
  25. Member brassplyer's Avatar
    Join Date
    Apr 2008
    Location
    United States
    Originally Posted by JVRaines View Post
    Originally Posted by brassplyer View Post
    Originally Posted by JVRaines View Post
    The way to really do this is to skip the "display" and pipe sensory input directly to the visual cortex of the brain.
    I.e. an eyeball.
    Except an eyeball can only lens the light that reaches it. We could generate nerve signals synthetically and send them to the brain without the light ever actually existing.
    I've heard of very crude images being transmitted, but I'll bet what you're describing won't be possible within the lifetime of your great-great-grandchildren. For now, a human eyeball is the only way.
  26. Member darkknight145's Avatar
    Join Date
    Feb 2007
    Location
    Australia
    No it's not possible!
    Monitors and such have a defined resolution, as the image is made up of dots/pixels. Real vision is not made of dots/pixels, and its resolution is basically infinite. Also bear in mind that your real vision is only clear exactly where you are looking; all the peripheral scenery is actually blurry, with the brain making you think it is clear by filling in the gaps.
  27. Originally Posted by darkknight145 View Post
    No it's not possible!
    Monitors and such have a defined resolution, as the image is made up of dots/pixels. Real vision is not made of dots/pixels, and its resolution is basically infinite. Also bear in mind that your real vision is only clear exactly where you are looking; all the peripheral scenery is actually blurry, with the brain making you think it is clear by filling in the gaps.
    Are you sure? Your claims are not in line with scientific knowledge: https://en.wikipedia.org/wiki/Fovea_centralis#Angular_size_of_foveal_cones . And the brain is a completely different story - the brain can even create non-existent images (dreams); in the extreme case you can be unable to distinguish your imagination from reality (insanity).
  28. Member Bernix's Avatar
    Join Date
    Apr 2016
    Location
    Europe
    I think you blink about once every 3-6 seconds. Imagine if the eye -> brain path let you know about every blink you made. I think it is further proof that the eye -> brain system isn't perfect, and in this particular case that is a good thing. If you concentrate you will notice it, but better not to. The duration of a blink is about 100-150 milliseconds (another source says 100-400 milliseconds), so on the order of 1/10th of a second. So if you are watching a movie at 25 fps, you miss about 2.5 frames every 3-6 seconds because of blinking - and in the 400 ms case, 10 frames... But a 400 ms blink seems to me like something you would have to notice. (A quick check of that arithmetic is sketched below.)
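    A quick check of that arithmetic (figures taken straight from the post):

    Code:
    # Frames hidden by a single blink at 25 fps.
    fps = 25
    for blink_ms in (100, 150, 400):
        missed = fps * blink_ms / 1000
        print(f"{blink_ms} ms blink at {fps} fps -> ~{missed:g} frames missed")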

    Bernix
  29. Originally Posted by darkknight145 View Post
    Real vision is not made of dots/pixels and resolution is basically infinite.
    Yes, this is why you can read a book from a mile away.
  30. Member Bernix's Avatar
    Join Date
    Apr 2016
    Location
    Europe
    Here is some interesting math about seeing over long distances.
    But it is well known.
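    For instance, putting numbers on jagabo's book-from-a-mile quip (the ~1 arc-minute acuity figure is the standard one; the 5 mm letter height is my own guess for typical print):

    Code:
    # Does book text at one mile come anywhere near the ~1 arc-minute acuity limit?
    import math

    letter_m = 0.005                       # assumed letter height
    distance_m = 1609                      # one mile

    subtended_arcmin = math.degrees(math.atan2(letter_m, distance_m)) * 60
    print(f"A 5 mm letter at one mile subtends ~{subtended_arcmin:.3f} arc-minutes")
    # ~0.011 arc-minutes, roughly 100x too small to resolve -> eye resolution is
    # very good, but certainly not infinite.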

    Bernix