What I learned from the Apple Store’s 30-minute Vision Pro demo

These mounted displays near the entrance let visitors touch, but not use, a Vision Pro. (Photo: Kyle Orland)

For decades now, potential Apple customers have been able to wander into any Apple Store and get some instant eyes-on and hands-on experience with most of the company’s products. The Apple Vision Pro is an exception to this simple process; the “mixed-reality curious” need to book ahead for a guided, half-hour Vision Pro experience led by an Apple Store employee.

As a long-time veteran of both trade show and retail virtual-reality demos, I was interested to see how Apple would sell the concept of “spatial computing” to members of the public, many of whom have minimal experience with existing VR systems. And as someone who’s been following news and hands-on reports of the Vision Pro’s unique features for months now, I was eager to get a brief glimpse into what all the fuss was about without plunking down at least $3,499 for a unit of my own.

After going through the guided Vision Pro demo at a nearby Apple Store this week, I came away with mixed feelings about how Apple is positioning its new computer interface to the public. While the short demo contained some definite “oh, wow” moments, it didn’t present a cohesive story pitching the device as Apple’s next big general-use computing platform.

Editor’s Note: This article kicks off a series of in-depth looks at the Apple Vision Pro. We started with the Apple Store experience because that’s where most people will encounter the device first. Over the coming days, Samuel Axon will cover long-term usage and experience with the device, as he has spent more than a week living and working in the Vision Pro. Stay tuned.

Setup snafus

After arriving a few minutes early for my morning appointment in a sparsely attended Apple Store, I was told to wait by a display of Vision Pro units set on a table near the front. These headsets were secured tightly to their stands, meaning I couldn’t try a unit on or even hold it in my hands while I waited. But I could fondle the Vision Pro’s various buttons and straps while getting a closer look at the hardware (and at a few promotional videos running on nearby iPads).

After a few minutes, an Apple Store employee, whom we’ll call Craig, walked over and said with genuine enthusiasm that he was “super excited” to show off the Vision Pro. He guided me to another table, where I sat in a low-backed swivel chair across from another customer who looked a little zoned out as he ran through his own Vision Pro demo.

Craig told me that the Vision Pro marked the first time Apple Store employees like him had gotten hands-on access to a new Apple device well before the public, in order to facilitate the training needed to guide these in-store demos. He said that interest had been steady for the first few days of demos and that, after some initial problems, the store now mostly managed to stay on schedule.

Unfortunately, some of those demo kinks were still present. First, Craig had trouble tracking down the dedicated iPhone used to scan my face and determine the precise Vision Pro light seal fit for my head. After consulting with a fellow employee, they decided to have me download the Apple Store app and use a QR code to reach the face-scanning tool on my own iPhone. (I was a bit surprised this fit scanning hadn’t been offered as part of the process when I signed up for my appointment days earlier.)

It took three full attempts, scanning my face from four angles, before the app managed to spit out the code that Craig needed to send my fit information to the back room. Craig told me that the store had 38 different light seals and 900 corrective lens options sitting back there, ready to be swapped in to ensure maximum comfort for each specific demo.

After a short wait, another employee brought my demo unit out on a round wooden platter that made me feel like I was at a Japanese restaurant. The platter was artistically arranged, from the Solo Knit Band and fuzzy front cover to the gently coiled cord leading to the battery pack sitting in the center. (I never even touched or really noticed the battery pack for the rest of the demo.)

At this point, Craig told me that he would be able to see everything I saw in the Vision Pro, which would stream directly to his iPad. Unfortunately, getting that wireless connection to work took a good five minutes of tapping and tinkering, including removing the Vision Pro’s external battery cord several times.

Once everything was set, Craig gave me a brief primer on the glances and thumb/forefinger taps I would use to select, move, and zoom in on things in the VisionOS interface. “You’re gonna pretend like you’re pulling on a piece of string and then releasing,” he said by way of analogy. “The faster you go, the faster it will scroll, so be mindful of that. Nice and gentle, nice and easy, and things will go smoothly for you.”

Fifteen minutes after my appointed start time, I was finally ready to don the Vision Pro.

A scripted experience

After putting the headset on, my first impression was how heavy and pinchy the Vision Pro was on the bridge of my nose. Thankfully, Craig quickly explained how to tighten the fit with a dial behind my right ear, which helped immediately and immensely. After that, it only took a minute or two to run through some quick calibration of the impressively snappy eye and hand tracking. (“Keep your head nice and still as you do this,” Craig warned me during the process.)

Imagine this but with an Apple Store in the background. (Photo: Kyle Orland)

As we dove into the demo proper, it quickly became clear that Craig was reading from a prepared script on his iPhone. This was a bit disappointing, as the genuine enthusiasm he had shown in our earlier, informal chat gave way to a dry monotone when delivering obvious marketing lines. “With Apple Vision Pro, you can experience your entire photo library in a brand new way,” he droned. “Right here, we have some beautiful shots, right from iPhone.”

Craig soldiered through the script as I glanced at a few prepared photos and panoramas. “Here we have a beautiful panorama, but we’re going to experience it in a whole new way… as if you were in the exact spot in which it was taken,” Craig said. Then we switched to some spatial photos and videos of a happy family celebrating a birthday and blowing bubbles in the backyard. The actors in the video felt a little stilted, but the sense of three-dimensional “presence” in the high-fidelity video was impressive.

After that, Craig informed me that “with spatial computing, your apps can exist anywhere in your space.” He asked me to turn the digital crown to replace my view of the store around me with a virtual environment of mountains bathed in cool blue twilight. Craig’s script seemed tuned for newcomers who might be freaked out by not seeing the “real world” anymore. “Remember, you’re always in control,” Craig assured me. “You can change it at any time.”

From inside the environment, Craig’s disembodied voice guided me as I opened a few flat app windows, placing them around my space and resizing them as I liked. Rather than letting these sell themselves, though, Craig pointed out how webpages are “super beautiful [and] easy to navigate” on Vision Pro. “As you can also see… text is super sharp, super easy to read. The pictures on the website look stunning.” Craig also really wanted me to know that “over one million iPhone/iPad apps” will work like this on the Vision Pro on day one.
