I just came out of a long demo session with Apple’s new $3,499 Vision Pro headset, which the company announced at WWDC 2023 as “the world’s most advanced consumer electronics device.” It’s…a really, really cool VR headset with awesome screens and video passthrough. And I mean incredibly impressive displays and video passthrough: I happily used my phone to take notes while wearing the Vision Pro, which no other headset can reasonably allow.
That said, while Apple would obviously prefer people to think of the Vision Pro as a “powerful space computer” or augmented reality device, there’s really no getting around the essential VR headset nature of the thing, down to the adjustable headbands which definitely ruined my hair. It looks, feels and behaves like a VR headset. If you’ve used a Meta Quest, just imagine the best possible Meta Quest running something a lot like iPadOS, and you’ll get it.
Apple held Vision Pro demos in a large white cube-shaped building it built for WWDC called the Fieldhouse. Upon entry, I was handed an iPhone for a quick setup process: a rotational scan of my face (much like the Face ID setup) that determined which size light seal to use, then a scan of the sides of my head and ears to calibrate spatial audio. After that, Apple took me to a “vision specialist” who asked if I wore glasses – I was wearing contacts, but glasses wearers got a quick prescription check so Apple could equip their Vision Pros with the appropriate lenses. (The lenses are made by Zeiss; Apple needed a partner who could legally sell prescription lenses. They click into place magnetically and will be sold separately at launch.)
The headset itself weighs just under a pound – it’s connected by a braided white power cable to a silver battery that provides around two hours of use. The cable detaches from the headset with a mechanical latch, but is permanently connected to the battery. If you want to plug into the wall, plug a USB-C adapter into the battery.
The design language is all brushed aluminum, shiny glass and soft fabrics; the vibe is closer to the iPhone 6 than the iPhone 14. The glass on the front is an obviously complex piece of optical engineering: it’s perfectly curved but still serves as a proper lens for the cameras and the OLED screen that shows your eyes when you’re looking at people. (This feature is called EyeSight; I didn’t get to try it out in any way.)
Surrounding the headset itself are 12 cameras, a LIDAR sensor and a TrueDepth camera, plus IR illuminators to ensure the cameras can track your hands even in dark environments. It all runs on a combination of Apple’s M2 processor and a new R1 processor, which unsurprisingly generate a good amount of heat. The Vision Pro sheds this heat by drawing air in through the bottom of the device and exhausting it out the top.
The top of the Vision Pro has a button on the left that acts as a shutter button for taking 3D videos and photos, which I couldn’t try. The digital crown is on the right; clicking it brings up the home screen of app icons, while turning it changes the level of VR immersion in certain modes. I asked why anyone would want to set the immersion level anywhere other than fully on or off, and it seems Apple envisions the intermediate immersion setting as a sort of adjustable desktop workspace for apps that leaves the sides open so you can talk to your colleagues.
When you put the headset on, there’s a quick automatic eye adjustment that’s much faster and more seamless than on something like the Quest Pro – there are no manual dials or sliders for eye adjustments. Apple wouldn’t say anything specific about field of view this far ahead of launch, but I definitely saw black in my peripheral vision. The Vision Pro isn’t as fully immersive as the marketing videos would have you believe.
The screen itself is absolutely insane: a 4K panel for each eye, with pixels just 23 microns in size. In the short time I tried it, it was totally usable for reading text in Safari (I loaded The Verge, of course), looking at photos and watching movies. It’s easily the highest-resolution VR screen I’ve ever seen. There was some green and purple fringing around the edges of the lenses, but I can’t say for sure whether that was due to the quick fitting process, the early demo nature of the device or something else. We’ll have to see when it actually ships.
The video passthrough was equally impressive. It appeared lag-free and was crisp, clean and clear. I happily talked to others, walked around the room, and even took notes on my phone while wearing the headset – something I could never do with something like the Meta Quest Pro. That said, it’s still video passthrough. I could sometimes see quite intense compression and loss of detail as people’s faces moved around in the shadows. I could see the infrared light on the front of my iPhone flashing as it tried, and failed, to unlock with Face ID. And the display was darker than the room itself, so when I took the headset off, my eyes had to adjust to the actual brightness of the room.
Similarly, Apple’s ability to do mixed reality is truly impressive. At one point during a full-VR Avatar demo, I raised my hands to gesture at something, and the headset automatically detected my hands and superimposed them on the screen, then noticed that I was talking to someone and made them appear as well. Reader, I gasped. Apple has also gone a lot further with eye tracking and gesture control: eye tracking was pretty solid, and those infrared illuminators and side cameras mean you can tap your thumb and index finger together to select things even while your hands are in your lap or at your side. You don’t need to point at anything. It’s really cool.
Apple has clearly solved a bunch of big hardware interaction problems with VR headsets, mostly by out-engineering and outspending everyone else who has tried. But that absolutely doesn’t answer the question of what these things are actually for. For now: the main interface is largely a grid of icons, and most of the demos were basically giant screen projections with very familiar apps on them. Safari. Photos. Movies. The Freeform collaboration app. FaceTime video calls. There was a demo with 3D dinosaurs where a butterfly landed on my outstretched hand, but that was about as much “augmented reality” as I actually experienced. (Yes, mapping the room and projecting screens into it is a very complex AR job, but there wasn’t even a measurement app after years of ARKit demos at WWDC. That was weird.)
I got to see a quick FaceTime call with someone else in a Vision Pro using an AI-generated 3D “persona” (Apple doesn’t like it when you call them “avatars”) that was both impressive and profoundly strange. It was immediately apparent that I was talking to a digital character, especially since most of the person’s face was frozen apart from their mouth and eyes. But even that became compelling after a while, and it was certainly much nicer than your average Zoom call. You set up a persona by holding the headset in front of you and letting it scan your face, but I wasn’t able to set one up myself and there’s clearly still a lot of refinement to come, so I’ll withhold judgment until later.
It was all basically a greatest-hits reel of VR demos, including a few old favorites: Apple showed 180-degree 3D videos with spatial audio in something called the Apple Immersive Video Format, which the company apparently filmed with proprietary cameras it may or may not release. (They looked like the 3D videos we’ve seen in VR demos forever.) I looked at a 3D photo of cute kids taken by the headset’s cameras and watched a 3D video of those kids blowing out birthday candles. (Same.) I did a minute-long mindfulness meditation in which a voice commanded me to be grateful as the room darkened and a sphere of colored triangles expanded all around me. (It looked awesome, but Supernatural exists, has millions of users on the Quest, and has been offering guided meditation since 2020.) And I watched Avatar in what looked like a movie theater, which, well, is one of the oldest VR demos there is.
Was it all done better on the vastly superior Vision Pro hardware? Without question. But was any of it more convincing? I don’t know, and I’m not sure I can tell after wearing the headset for such a short time. I do know that wearing this thing felt oddly lonely. How do I watch a movie with other people in a Vision Pro? What if you want to collaborate with people in the room with you and people on FaceTime at the same time? What does it mean that Apple wants you to wear a headset at your child’s birthday party? There are just more questions than answers here, and some of those questions touch on the very nature of what it means for our lives to be literally mediated by screens.
I also know that Apple still has a long list of things it wants to refine before the Vision Pro ships next year. That’s part of the reason it was announced at WWDC: to let developers react to it, figure out what kinds of apps they might build, and start building them. But it’s the same promise we’ve been hearing for years about VR headsets from Meta and others. Apple can clearly leapfrog everyone in the industry on hardware, especially when cost is apparently no object. But the most perfect headset demo reel ever is still just a demo reel – the question of whether Apple’s famed developer community can produce a killer app for the Vision Pro remains open.