Apple previews new iPhone and Mac accessibility features that could seriously change lives
Apple has revealed a clutch of fresh innovations for its hardware that push forward strongly on the accessibility front, capabilities that’ll arrive later in 2022 courtesy of software updates.
The new software features, which have been developed using machine learning, include Door Detection for blind or low vision users, as well as a system of Live Captions for the deaf or those with hearing loss, and Apple Watch Mirroring which gives those with physical and motor disabilities the ability to control the smartwatch via an iPhone.
Let’s take a more in-depth look at these capabilities, starting with Door Detection, which, as the name suggests, allows iPhone and iPad users to locate a door when arriving at a new place.
The feature uses LiDAR, so it requires a device equipped with the LiDAR Scanner (iPhone 12 and iPhone 13 handsets – both Pro and Pro Max – and various iPad Pro models), and it’s built into the Magnifier app. It can ascertain whether a door is open or closed and, if the latter, how it can be opened, as well as the user’s distance from the door; plus it can read any signs or characters on the door (like a number).
Magnifier will get a new Detection Mode which will play host to the Door Detection feature, and will also offer the likes of People Detection and image descriptions (for describing the surroundings of the user).
Those who are deaf or hard of hearing will get access to Live Captions on iPhone, iPad, and Mac computers, allowing for captions (with adjustable font size) to be generated on-device for everything from video chatting to watching streaming content. In FaceTime, the captions are automatically attributed to the relevant person speaking on the call, and on Mac, users have the option to type responses and have them spoken aloud in real-time.
As for device support, you’ll need a Mac with an Apple chip, an iPhone 11 or later, or, in the case of the iPad, a model with the A12 Bionic chip (or later). Initially, Live Captions will debut in beta form (so still in testing – Apple observes that the accuracy of the captions ‘may vary’) and only in English (US and Canada).
The final major accessibility revelation from Apple here is the introduction of Apple Watch Mirroring, which allows people to use their iPhone to control the watch. In other words, users can benefit from the smartphone’s assistive features, such as Voice Control and Switch Control, when interacting with their Apple Watch, opening up abilities like voice commands for the watch, head tracking, and so forth.
New Quick Actions with the Apple Watch also let users employ simple hand gestures for controls, such as answering (or ending) a phone call by using a double-pinch gesture.
Note that you’ll need an Apple Watch Series 6 or newer to benefit from the mirroring function.
Analysis: More to come including VoiceOver revamp and Buddy Controllers
There’s a lot of well-thought-out stuff here, and more besides is coming to push accessibility forward even further.
For example, Apple has also been busy adding support for a bunch of new languages (over 20 of them) to VoiceOver, its screen reader tool (with dozens of new voices being implemented, too).
There’s also an incoming Siri Pause Time feature, so those with speech disabilities can extend the length of time Siri waits before responding to a request, and Buddy Controllers, whereby a friend can be invited in to help the user play a game, basically letting both controllers work to direct the action in a single-player title.
As a reminder, all this stuff will be coming to Apple devices later this year via software updates. Furthermore, bear in mind that Apple advises that features such as Door Detection and Live Captions should not be used in ‘high-risk’ or emergency situations – and, in the case of the former, in circumstances where the user could be at risk of injury.