It’s rare to find a new technology that feels groundbreaking. But last night, while sitting on a couch in a private demo room at Apple’s campus wearing its newly announced Vision Pro mixed reality headset, it felt like I’d seen the future — or at least an early and very pricey prototype of it.
In the demo, which lasted 30 minutes, a virtual butterfly landed on my finger; a dinosaur with detailed scales tried to bite me; and I stood inches away from Alicia Keys’ piano as she serenaded me in a recording studio. When a small bear cub swam by me on a quiet lake during another immersive video, it felt so real that it reminded me of an experience with a loved one who recently passed away. I couldn’t wipe the tears inside my headset.
Apple unveiled the device, its most ambitious and riskiest new hardware offering in years, at a developer event earlier in the day. The headset blends both virtual reality and augmented reality, a technology that overlays virtual images on live video of the real world. At the event, Apple CEO Tim Cook touted the Vision Pro as a “revolutionary product,” with the potential to change how users interact with technology, each other and the world around them. He called it “the first product you look through, not at.”
But it’s clearly a work in progress. The apps and experiences remain limited; users must stay tethered to a battery pack the size of an iPhone with just two hours of battery life; and the first minutes using the device can be off-putting. Apple also plans to charge $3,499 for the device when it goes on sale early next year – more than had been rumored and far more than other headsets on the market that have previously struggled to gain wide adoption.
With its loyal following and impressive track record on hardware, Apple may be able to convince developers, early adopters and some enterprise customers to pay up for the device. But if it wants to attract a more mainstream audience, it will need a "killer app," as the industry calls it, or several.
Based on my demo, Apple still has a long way to go, but it’s off to a compelling start.
A dedicated building and an optometrist
Hours after the keynote event, I arrived at a building on Apple’s sprawling Cupertino, California, campus specifically constructed to stage demos and briefings for the new headset.
I was met by an Apple employee who scanned my face to help customize the fit of the headset. Then I entered a small room where an optometrist asked if I wore glasses or corrective lenses. I had gotten Lasik surgery years ago, but others around me had their glasses scanned so the headset could present their specific prescription. It’s an incredible feat that differentiates Apple from competitors and ensures no frames need to be squeezed into the headset. But it’s unclear how the company plans to handle this process at scale if millions buy the device.
The initial setup process was somewhat unpleasant: I felt a little nauseous and claustrophobic as I adjusted to the device. It tracked my eyes, scanned my hands and mapped the room to better tailor the augmented reality experience.
But Apple has also taken steps to reduce the motion sickness problem that has plagued other headsets. The headset uses an R1 processor, a custom chip that cuts down on the latency that, in similar products, can result in nausea.
As many viewers were quick to point out on Monday, the headset itself looks like a pair of designer ski goggles. It features a soft adjustable strap on the top, a "digital crown" on the side – a bigger version of what you'd find on an Apple Watch – and a separate button on the top. There's also a wire connecting to an external battery pack.
The headset itself felt light enough in the beginning, but even with Apple's considerable design chops, I never shook the idea that there was a computer on my face. Fortunately, unlike other computing products, the headset did remain cool on my face throughout the experience, thanks largely to a quiet fan and airflow running through the system.
Unlike other headsets, the new mixed reality headset also displays the eyes of its users on the outside, so “you’re never isolated from the people around you, you can see them and they can see you,” Alan Dye, vice president of human interface, said during the keynote.
Sadly, I never got to see how my own eyes or anyone else’s looked through the headset during the demo.
A mixed experience
After putting on the device, I saw an iOS-like interface. I could easily hop in and out of apps, such as Messages, FaceTime, Safari and Photos, using just my eye movements and touching my thumb and pointer finger together to act as the “select” button. This was more intuitive than expected and worked even when my hands rested on my lap.
Some app experiences were better than others, however. It was beautiful to see images in the Photos app presented before me in a larger-than-life manner, but it's hard to imagine feeling the need to do this often on a couch back home. Vision Pro also offers a spatial photo option, which lets users view images and videos in 3D so you feel like you're directly in the scene. Again, cool but unnecessary.
During another demo, an Apple employee wearing a Vision Pro headset FaceTimed me from the other side of campus. Her "persona" – a digital representation that did not show her wearing the Vision Pro – appeared in front of me as we chatted about the event earlier in the day. She seemed real but it was clear she was not; she was a sort of pseudo-human. (Apple did not scan my face to create my own persona, which would otherwise be done during the setup phase.)
The Apple employee then shared a virtual whiteboard – dragging, dropping and highlighting interior design images. Cook has focused on AR’s potential to foster collaboration, and it’s clear how this tool could be used in meetings to fulfill that promise. What’s less clear is why most employers would spend $3,499 per device per employee to make this happen rather than simply use Zoom.
Like so much else about the product unveiling, this pitch felt mistimed. Earlier in the pandemic, more people might have jumped at the chance to create these virtual experiences while we worked and socialized almost entirely from home. Now, with more employees back in the office and companies looking to cut costs amid broader economic uncertainty, the justification for this pricey device seemed less clear.
The real magic of the Vision Pro, however, is in the immersive videos. Watching an underwater scene from Avatar 2 in 3D, for example, was surreal, seemingly placing me right in the ocean with these fictional creatures. It’s easy to imagine buy-in from Hollywood filmmakers to create experiences just for the headset.
Apple is also uniquely positioned here to supercharge the device with these experiences. It has close relationships in the entertainment industry, including with former Apple board member and Disney CEO Bob Iger, who announced in a pre-recorded video during the event that Disney+ will be available on the headset at launch. Apple teased new National Geographic, Marvel and ESPN experiences for the headset, too.
Almost every new Apple product, from the iPhone to the Apple Watch, promises to use screens of varying sizes to change how we live, work and interact with the world. The Vision Pro has the potential to do all of that in an even more striking way. But unlike the first time I picked up an iPhone or a smartwatch, after 30 minutes of using Vision Pro, I was very content to put it down and return to the real world.