Apple Vision Pro hands-on, again, for the first time

It’s a slushy, snowy day in New York City — the kind where I’d normally look out the window and declare a work-from-home day. But today, I hauled myself out into the wet cold because Apple had offered Nilay Patel and me demo time with the Apple Vision Pro.

Nilay and a few others have already spent time with the Vision Pro, but for everyone else, the $3,499 headset has been shrouded in mystery. My half-hour with it, though, revealed that it felt more familiar than I expected. The iPhone face scan used to select the correct light seal works a lot like setting up Face ID. Slipping it onto your head isn't that different from putting on any number of other VR headsets, like the Meta Quest line — the design and fabric headband are just more Apple-y. And like any other VR headset, you feel it sitting on your head and wrecking your hairdo. (If you've got long hair like me, you'll feel it bunch up in the back, too.)

You can see the shimmer on the OLED display that lets you know I’m not seeing you.
Photo: Apple

After the headset is situated, there's a brief setup for eye tracking — look at various dots and tap your fingers together — and then visionOS drops you into the app launcher, which looks a lot like Launchpad on a Mac. The only difference is that you can still see the room around you, if you so choose. On the upper right-hand side, there's a Digital Crown — something I'm well acquainted with as an Apple Watch user. You use it to recenter your home screen or immerse yourself in a virtual environment. On the opposite side is a button for taking spatial photos and videos; it, too, looks like the side button on an Apple Watch.

As in previous demos, eye tracking was fast and accurate. Looking at a menu item or button immediately highlighted it. Movie titles lit up in the Apple TV app as my eyes roved over them. Apple had us open the virtual keyboard in Safari to browse to a website, and it worked, albeit clunkily: you look at a letter and pinch your fingers to select it. You can only type as fast as your eyes can move and your fingers can pinch, which means it's much easier to just dictate to Siri.

I had a little trouble with the pinch and double-pinch gestures at first because I was apparently holding the pinches too long when trying to select. It wasn’t until I was told to lightly tap and let go — the same action as double tap on the Apple Watch — that it all started to click.

The demo we received was similar to what Nilay got to experience at WWDC, with a few additions here and there. Even so, reading about Nilay's experience a few months ago and then actually seeing it myself were two very different things. I've read how bonkers the displays are. But even knowing that, my eyes weren't really prepared for two 4K screens blasting 23 million pixels into my eyeballs. I had to remind myself to blink lest my eyes dry out.

The virtual world inside the Vision Pro feels like a higher-resolution version of what Meta is trying to accomplish with the Quest, but with a vastly more powerful M2-based computer inside. It's neat that I can throw an app over to my upper right so I can look up at the ceiling and view photos if I want. It's fun to rip the tires off an AR Alfa Romeo F1 car in JigSpace. There is a certain novelty to opening up the Disney Plus app to watch a Star Wars trailer in a virtual environment that looks like Tatooine. I did, in fact, flinch when a T. rex made eye contact with me. A virtual environment of the Haleakalā volcano surprised me because the texture of the rocks looked quite lifelike. This is all familiar stuff. It's just done well, and done with no lag whatsoever.

Nilay, contemplating the Vision Pro.
Photo: Apple

Apple had us bring some of our own spatial videos and panoramic photos to look at inside the Vision Pro, and the effect was convincing, although it works best when the camera is held still. Nilay had shot some spatial videos in which he'd intentionally moved the camera to follow his kid around the zoo, and watching them back brought on some familiar VR motion queasiness. Apple says it's doing everything it can to reduce that, but it's clear some shots will work better in spatial video than others — like any other camera system, really. We'll have to keep playing with this outside of Apple's carefully controlled environment to really figure out its limits.

Apple keeps emphasizing that the Vision Pro isn't meant to isolate you from the rest of the world, and the display on the front of the headset is designed to keep you connected to others. So we got to see a demo of EyeSight — what an onlooker sees on that front display when looking at someone wearing the Vision Pro. It's a bit goofy, but you can see the wearer's eyes, rendered from what Apple calls a "persona." (We were not able to set up our own personas, sadly.) When Apple's Vision Pro demo person blinked, we saw a virtual version of their eyes blink. When they were looking at an app, a bluish light appeared to indicate their attention was elsewhere. And when they went into a full virtual environment, the screen turned into an opaque shimmer. If you started talking to them while they were watching a movie, their virtual ghost eyes would appear before you. And when they took a spatial photo, you'd see the screen flash like a shutter.

This is all well and good, but it's strange to wear the headset and not actually know what's happening on that front display — to not really have a sense of your own appearance. And it's even stranger that looking at people in the real world can cause them to appear, apparition-like, in the virtual world. The social cues of this thing are going to take a long while to sort out. Admittedly, it was all a whirlwind. I spent a half-hour like a kid gawping at an alien planet — even though I'd never left the couch. But by the end of my demo, I started to feel the weight of the headset bring me back to the real world. I'd been furrowing my brow and concentrating so hard that I felt the beginnings of a mild headache. That tension dissipated as soon as I took the headset off, but walking back out into Manhattan, I kept replaying the demo in my head. I know what I just saw. I'm just still trying to see where it fits in the real world.
