Imagining Possibilities With Apple Vision Pro

Apple Vision Pro headset in profile view, battery attached.

This week, Apple announced its new “spatial computing” device, Vision Pro. It’s a mixed reality headset that seamlessly blends elements of augmented reality (AR) and virtual reality (VR), allowing users to superimpose digital elements over the real world or fully immerse themselves in a virtual space. As Tim Cook stated during the WWDC keynote announcement, “It’s the first Apple product you look through, and not at.”

Unlike other headsets that focus more on gaming (and the metaverse), Apple’s Vision Pro feels more like a potential replacement for your computer, tablet, and television, thus earning the “spatial computing” label.

While some questions remain about the Vision Pro’s functionality and specifications, after watching the keynote and reading coverage over the past few days, I’ve come up with some initial ideas for how Apple could improve this first-generation device.

Note: It’s possible that these ideas are features that have yet to be announced or marketed.

EyeSight

One of the most distinctive features of the Vision Pro is called EyeSight. When someone is nearby, a display on the outside of the device projects a real-time capture of your eyes, creating the illusion that people looking at you can see through the headset to your face.

During app usage, such as reading a text message, the display of your eyes is overlaid with transparent colors to indicate your focus is partially elsewhere. When fully immersed in an experience, where the outside world is not visible, the colors completely cover your eyes.

When I saw this, I immediately thought it would be worth exploring additional ways to utilize this external display.

Personalization

Instead of displaying realistic eyes, what if you could personalize it with fun animations? Imagine LED-style hearts, words or phrases, or emojis.

Facial Expressions

Taking it a step further, what if the display changes based on your facial expressions? For instance, if you look sleepy, it could show Z’s. If shocked by a photo sent to you, it could display X’s.

App-Based

Currently, the display changes depending on whether or not you are fully immersed. But what if app developers could customize what is shown based on the experience itself? While you’re watching The Mandalorian, it could display images of stars and galaxies. If you’re enjoying the movie Dune, it could show a desert environment. And if you’re playing a Batman video game, it could reveal the iconic “Bat Signal.”

The catch, though, is that you might not want people to know what you’re doing on your headset. If this were actually implemented, there would need to be a setting to disable developer-controlled animations.


Continue reading “Imagining Possibilities With Apple Vision Pro” on Medium.