Update 6/5: Added that Vision Pro can be controlled with one’s eyes.
Follow Apple closely enough and you’ll see the company has a penchant for proclaiming every WWDC its “biggest ever”—until the next year.
As a grizzled veteran of eleven WWDCs (not to mention countless other events), I can say the vibe following yesterday’s keynote at Apple Park in Cupertino certainly lived up to the company’s marketing hype. After the customary lowdown on the new operating systems coming to devices later this year, the company ended the anticipatory lull by announcing an all-new product and corresponding platform. Such moments don’t come often, so to call WWDC 2023 “the biggest ever” seems not the least bit blustery or hyperbolic. The advent of the Apple Vision Pro and visionOS is indeed a very big deal.
The lead time for actually getting the headset in our hands (on our faces?) is as long as the $3,500 price tag is high. At first blush, however, the device sure looks like something a lot of nerdy early adopters will recklessly wreck the house looking for spare change to buy. Between now and early next year, there’s plenty of time to ponder the merits of the device. Most important to me, of course, is accessibility.
As wearables go, it’s one thing to put computers in your ears (AirPods) and on your wrist (Apple Watch). Putting one on your face (Vision Pro) is orders of magnitude different. It’s an entirely unique beast. From a disability perspective, that’s why ruminating on the Vision Pro is going to be so fascinating in the coming months. There are considerations here, both functionally and ergonomically, that will redefine what it means for so-called “spatial computing” to be accessible and inclusive.
Here are preliminary—yet educated—thoughts on accessibility and the Vision Pro.
The Existential Question
In the wake of its reveal barely 24 hours ago, the tweets and initial impressions videos I’ve seen from my comrades in the tech media have wondered the obvious: “The Vision Pro is oozing coolness, but why should people buy one?” I’d go further and put it this way: “What does the Vision Pro do that could be more accessible in mixed reality than on any other device?”
In a disability context, this is not a trivial matter. For such an expensive piece of kit—and make no mistake, disabled people are by and large not going to be able to afford the Vision Pro—the reality is there has to be a use case or two so compelling that someone with a disability would find a way to get one. Having not tried the product yet, I find it hard to say. Will the mindfulness feature be a boon for those in the neurodivergent community? Will movies and television shows be easier to see for those with low vision? Will FaceTime and other videoconferencing be better for the Deaf and hard-of-hearing communities? Apple must show something here, because using the Vision Pro has to be about more than mere immersion. There has to be a place where spatial computing makes certain tasks easier and more accessible than using an iPad or MacBook. This is table stakes for the disability community; the OS can be accessible (more on that below), but there’s more to it. Apple must show the headset is better (read: more accessible) than a phone or tablet or laptop, in the same way Steve Jobs pitched the original iPad as being much better at certain jobs than an iPhone or Mac.
Effective Ergonomics
Beyond the Vision Pro’s prime objective and its software, another aspect of the device’s usability lies in ergonomics. It’s not something to take lightly; all the cutting-edge technology in the world, 5,000 patents included, does you no good if you struggle to get it on and off. If the sentiment sounds familiar, it should. It’s the same question I had about the bands after Apple announced Apple Watch and watchOS nearly a decade ago. The conceit is the same now as it was then: all of Apple Watch’s benefits are negated if one can’t get it on their wrist. I wrote about this for MacStories shortly after the Watch started shipping in 2015. In the years since, I’ve found Apple’s don’t-call-it-Velcro-but-it’s-Velcro Sport Loops to be my favorite, as much for accessibility’s sake as fashion’s. As a person with limited fine-motor skills, I find them easy to get on and off and infinitely adjustable. That heightens the overall experience for me.
For Vision Pro, it’s worth thinking about how accessible it’ll be to get the thing on and off. The majority of my friends and peers in the media will take this step for granted, but it’s a crucial detail. The stretchiness of the headband, the prescription lens system, and other parts of the hardware all contribute to usability. To Apple’s credit, the zeal with which the company has used magnets throughout its industrial design is a stroke of genius, accessibility-wise. It’ll go a long way toward making the Vision Pro’s modularity, namely the light seal and prescription lenses, much more approachable for people with less-than-stellar motor abilities. Another thing to watch is how the headset feels while you’re wearing it. Is it heavy, like AirPods Max? What does it feel like inside the headset itself? Both will be key for those with sensitivity to certain sensory experiences.
Almost the Same Song and Dance
As you’d expect, the Vision Pro’s operating system, visionOS, is promised to be fully accessible. As you’d also expect, details are scant at this point. According to the State of the Union, visionOS will include the usual suspects in terms of accessibility features. Among them are VoiceOver, Voice Control, Dwell Control, Dynamic Type, and several more. Regarding interaction, Apple’s designed Vision Pro to work entirely by eye-tracking, voice, and gestures. Is there a feature similar to Siri Pause Time for stutterers? AssistiveTouch is there as a way to enable control for those who can’t perform gestures, but are there other affordances? Likewise, for cognition, how does Apple take care of people who may have trouble with the abstractness of computing in thin air, as it were? AirPods and Apple Watch are concrete in ways Vision Pro is not.
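For a sense of what adopting those features can look like in practice, here is a minimal SwiftUI sketch using the standard accessibility modifiers that apply to apps on Apple’s platforms, visionOS presumably included. The view, its labels, and the mindfulness framing are hypothetical illustrations of mine, not anything Apple has shown.

import SwiftUI

// A hypothetical mindfulness card, sketched to show the standard
// SwiftUI accessibility modifiers an app would rely on.
struct MindfulnessCard: View {
    var body: some View {
        VStack(spacing: 12) {
            Image(systemName: "leaf")
                .accessibilityHidden(true) // decorative, so VoiceOver skips it

            Text("Mindfulness")
                .font(.title) // scales with the user's Dynamic Type setting

            Button("Begin") {
                // Start the session (placeholder).
            }
            .accessibilityLabel("Begin mindfulness session")
            .accessibilityHint("Starts a short breathing exercise")
        }
        .padding()
    }
}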
Although visionOS shares a strong family resemblance in terms of accessibility, the implementation will be very different. To reiterate a point I made at the outset, it isn’t a matter of whether the Vision Pro is accessible. It’s a matter of the headset being an entirely new piece of technology for your face, territory altogether different from your ears or wrist. Take Optic ID, for instance. Will Apple’s technology be able to accommodate conditions like strabismus, which I have, where one or both eyes are not set straight? Given the wide variety of visual needs, it’s plausible the Vision Pro may be exclusionary, limited by what Apple’s tech can adeptly handle right now. It’s akin to Siri not being great at understanding atypical speech; the tech just isn’t there yet, but it’s being worked on. These points are nuanced, but ultimately it’ll be interesting to learn what is (and isn’t) within Apple’s technological ken when it comes to scanning a smorgasbord of eyeballs.
Cautious Optimism Abounds
It should be obvious by now that, unlike some of my friends, I didn’t get a carefully choreographed demo following the keynote the way Joanna Stern, iJustine, and MKBHD did. This isn’t a complaint; rather, it’s a statement of fact. It’s important context because, although I did see the Vision Pro up close in the hands-on area, for the time being I can only speculate on accessibility matters based on what I know about Apple in general and on my lifelong lived experience with multiple disabilities.
To reiterate what I wrote in the lede, the excitement of an entirely new device powered by an entirely new platform is tempered by the unknown. There are a lot of questions for anyone, but they’re especially pressing for members of the disability community. As I said, it’s one thing to have computers in your ears or on your wrist. It’s another thing entirely to have one on your face, even for short bursts.
After the keynote ended, the buzz amongst people I spoke with was palpable. The prevailing wisdom was trite but true: Apple’s headset is undoubtedly a 1.0. It’s young and will be underdeveloped in myriad ways, but the potential is there and the sky’s the limit in terms of future improvements. This obviously applies as much to accessibility as anything else. One thing should be made abundantly clear, however. Apple designed and developed Vision Pro to be as accessible as possible from the very beginning. The company deserves kudos for that fact alone in an industry where accessibility is more often than not “bolted on.”
One thing I can say with complete certainty: whatever happens with the Vision Pro over the next several months, the news is guaranteed not to be boring. We’re truly on the cusp of a new era.