Apple Vision Pro – with 1% vision

I got a chance to try out the Apple Vision Pro for an hour. Having a vision impairment myself, I obviously dove straight into the accessibility settings to find out if this new, hyped technology would work for me. Let’s go through how it went.

Daniel using Apple Vision Pro in his #A11Y hoodie.

I was worried

I usually trust that Apple devices will work great for me. For instance, when the Apple Watch first came out, I preordered it without hesitating at all. And sure enough, it came loaded with VoiceOver, Zoom and other accessibility features, so it worked great for me.

But with the Apple Vision Pro, I was worried – not that it wouldn’t include the accessibility features I’m used to, but because of the foundation of the technology: it tracks your eye movements and renders the image sharply where you’re looking.

For me, that could be a problem. I only have peripheral vision, so I don’t see at all in the middle of my field of vision. Here’s a simulation of that:

Blurry peripherals around a white cloud with red and blue dots scattered in it.

So I “aim my eyes” to the side of what I want to see, to see it as clearly as possible. If the Apple Vision Pro makes the peripheral stuff too fuzzy, then I’d probably not be able to use it efficiently at all.

Onboarding – great first impression

You can trust Apple to include accessibility right out of the box. No surprises there. On the first screen you get to, there’s a prompt to triple-click the Digital Crown to activate accessibility features. I immediately got a short tutorial on the main VoiceOver gestures (developers: see the sketch right after this list):

  • Pinch your left thumb and index finger together to click
  • Pinch your right thumb and index finger together to step forward
  • Pinch your right thumb and middle finger together to move back
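
Side note for developers: those pinches simply step between and activate whatever elements an app exposes to VoiceOver. Here’s a minimal, hypothetical SwiftUI sketch of my own (not Apple’s onboarding code) of what that looks like from the app side:

```swift
import SwiftUI

// Hypothetical onboarding button – the view name and strings are mine.
// VoiceOver's forward/back pinches step between elements like this one,
// and the left-hand pinch triggers its action.
struct ContinueButton: View {
    var body: some View {
        Button("Continue") {
            // Advance to the next onboarding screen (app-specific).
        }
        .accessibilityLabel("Continue to the next instruction")
        .accessibilityHint("Pinch your left thumb and index finger to activate")
    }
}
```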

With this, I could independently read the first few screens with instructions. I was most interested in activating zoom, but figured that I’d do that once I got into the main interface.

The eye setup – unsuccessful but not a blocker

The main step of the setup process consisted of having me look at a dot moving clockwise around my field of view. Since I had VoiceOver turned on, the experience was a bit different from the “regular” one.

In the regular onboarding, it shows six dots at once and asks the user to look at and tap each dot.

Six dots around the message "Select more dots to refine your eye setup".

It does this in three rounds, with brighter and brighter lighting conditions.

For me, with VoiceOver turned on, it only showed one dot at a time and asked me to just look at it, not tap it. After a second or so it moved to the next position.

Here’s a low-quality video of me doing the eye setup, with my colleague holding up an iPhone so you might be able to see what’s going on. There are also kids running around in the background – a great illustration of situational cognitive load!

Round 1 was on a dark background, and then the background got brighter and brighter. It felt like I did the first round quite well, but the third, brightest round was almost impossible for me. I also didn’t get any audio feedback when the dot moved, which would have helped. Something like: “Dot at the 12 o’clock position. Dot moved to the 2 o’clock position.”
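
Just to illustrate the kind of feedback I mean, here’s a sketch of my own – the helper function and the clock-position mapping are invented, but the UIAccessibility announcement API is real and available wherever UIKit runs, which includes visionOS:

```swift
import UIKit

// Hypothetical helper for the eye-setup screen: announce where the
// calibration dot moved, as a clock position. Only the UIAccessibility
// announcement call below is a real API; the rest is my sketch.
func announceDotPosition(angleDegrees: Double) {
    // Map an angle (degrees clockwise from the top) to the nearest
    // hour on a clock face, with 12 at the top.
    let hour = Int((angleDegrees / 30).rounded()) % 12
    let clockHour = hour == 0 ? 12 : hour
    // Ask VoiceOver to speak the message without moving focus.
    UIAccessibility.post(
        notification: .announcement,
        argument: "Dot moved to the \(clockHour) o'clock position."
    )
}
```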

And sure enough, I got the error message “Eye setup unsuccessful”. I tried again, this time with my colleague helping by telling me when and where the dots moved, but got the same result.

Phone screen saying "Eye setup unsuccessful" with two buttons "Skip eye setup" and "Retry eye setup".

Luckily, there was a “Skip eye setup” button, so I pressed that – and found that the device worked fairly well for me even without the calibration.

It left me a bit disappointed though, and I felt unsure how much it would affect me moving forward.

Accessibility settings to the rescue

Getting into the main screen felt nice. I had gotten the hang of the basic VoiceOver gestures and could quite easily navigate to Settings and then Accessibility. The experience was very consistent with iOS and macOS.

I found the settings for Zoom and activated full-screen zooming instead of window zooming, which is how I have it set up on my other devices.

To activate and adjust zooming, I pressed the physical button on the top left of the device and rotated the Digital Crown.

I found that the default maximum zoom level didn’t let me zoom in far enough. So I needed to use VoiceOver to increase the maximum zoom level, which was a slider – and I didn’t instinctively understand how to adjust that slider. I had to go online on a smartphone and read up on it before I finally figured out a way to do it. It wasn’t intuitive to me.
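
As an aside for developers: VoiceOver adjusts sliders through an “adjustable” element interface – standard sliders get this for free, while custom controls have to opt in. Here’s a minimal SwiftUI sketch of my own (the zoom control and its values are invented, not Apple’s Settings code):

```swift
import SwiftUI

// Hypothetical zoom-level control exposed to VoiceOver as an
// adjustable element: the user can swipe up/down (or the visionOS
// equivalent) to increment and decrement the value.
struct ZoomLevelControl: View {
    @State private var maxZoom: Double = 5

    var body: some View {
        Text("Maximum zoom: \(Int(maxZoom))x")
            .accessibilityElement()
            .accessibilityLabel("Maximum zoom level")
            .accessibilityValue("\(Int(maxZoom)) times")
            .accessibilityAdjustableAction { direction in
                switch direction {
                case .increment:
                    maxZoom = min(maxZoom + 1, 15) // illustrative upper bound
                case .decrement:
                    maxZoom = max(maxZoom - 1, 1)
                default:
                    break
                }
            }
    }
}
```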

With the maximum zoom set to 10x instead of 5x, I could use the interface much more easily.

But it was still difficult to read a lot of the text on the Settings screen, so I increased the text size (Dynamic Type). Then I found the Increase Contrast feature, which was a game changer for me. Making things more “dark mode” helped a lot.
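
For developers wondering how apps pick these settings up: text set in the built-in styles scales with Dynamic Type automatically, and the contrast setting can be read from the SwiftUI environment. A small sketch of my own (the view and styling choices are illustrative):

```swift
import SwiftUI

// Illustrative view of my own: the .body text style scales with the
// user's Dynamic Type setting, and colorSchemeContrast reflects the
// Increase Contrast toggle that helped me so much.
struct SettingsRow: View {
    @Environment(\.colorSchemeContrast) private var contrast

    var body: some View {
        Text("Accessibility")
            .font(.body) // scales automatically with Dynamic Type
            // Use a stronger foreground when Increase Contrast is on.
            .foregroundStyle(contrast == .increased ? .primary : .secondary)
    }
}
```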

With these settings in place, I could start the dinosaur experience that you’ve probably heard of, and use Zoom in it – to great effect!

Overall

My feelings after having used the Apple Vision Pro for about an hour were optimistic.

Even though the eye setup failed, it didn’t seem to affect my experience much.

I think that after getting used to the gestures and interface, it would become quite a nice tool for me. Especially once I get the hang of scaling the “screens” in the interface larger or smaller to immerse myself in them – that could work really well for me.

Like for anyone, it’s going to require getting used to the new gestures and functionality. Doing this with accessibility features turned on adds an extra layer to this learning process.

It all reminded me of when I first got my iPhone 3GS, the first iPhone with VoiceOver on it. There were a lot of gestures and fundamentals to learn, and it took some time. But we all know how that turned out: I use my phone every day with gestures that now feel like second nature. I’m optimistic that one day I’ll feel the same about the Apple Vision Pro!
