What sort of hardware do we use to make this possible?
amilich
Seems like they are "microlenses" covering OLED displays. I'd still love to understand how the array of miniature lenses works (presumably each one focuses a small section of the OLED display) and how that provides a more realistic viewing experience.
http://lightfield-forum.com/light-field-camera-prototypes/nvidia-near-eye-light-field-display/
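To make the geometry a bit more concrete, here is a rough sketch (not the actual prototype design, just a simple pinhole-style lenslet model with made-up parameters) of how a microlens array over an OLED could turn each patch of pixels into a fan of ray directions, which is roughly what lets such a display reproduce focus cues:

```python
import numpy as np

# Hypothetical parameters for illustration only (not the NVIDIA prototype's specs):
pixel_pitch = 0.015      # mm, OLED pixel spacing
lenslet_pitch = 1.0      # mm, spacing between microlenses
lenslet_focal = 3.3      # mm, microlens focal length; OLED sits ~1 focal length behind
pixels_per_lenslet = int(round(lenslet_pitch / pixel_pitch))

def ray_for_pixel(lenslet_ix, lenslet_iy, px, py):
    """Return (origin, direction) of the ray emitted toward the eye by OLED
    pixel (px, py) sitting under microlens (lenslet_ix, lenslet_iy), in mm.

    Because the OLED is roughly one focal length behind the lens array, each
    lenslet collimates the light from its patch of pixels: a pixel's offset
    from the lenslet center selects the ray's direction, while the lenslet
    center sets the ray's origin. Different pixels under the same lenslet
    therefore send light from the same spot but at different angles.
    """
    # Lenslet center on the lens-array plane (z = 0)
    cx = lenslet_ix * lenslet_pitch
    cy = lenslet_iy * lenslet_pitch
    # Pixel position on the OLED, relative to the lenslet center
    dx = (px - (pixels_per_lenslet - 1) / 2) * pixel_pitch
    dy = (py - (pixels_per_lenslet - 1) / 2) * pixel_pitch
    # Ray leaves the lenslet center, tilted opposite to the pixel offset
    direction = np.array([-dx, -dy, lenslet_focal])
    direction /= np.linalg.norm(direction)
    origin = np.array([cx, cy, 0.0])
    return origin, direction

# Rendering the light field then amounts to tracing one such ray into the
# virtual scene for every (lenslet, pixel) pair and writing the sampled color
# back to that OLED pixel.
```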
What are the barriers to this kind of technology being used today?
This is so cool!
This may be irrelevant, but does being nearsighted/farsighted impact the way people see in virtual reality?