Apple has revealed a handful of new accessibility-focused capabilities for its hardware, all of which will arrive later in 2022 through software updates.
The new software features, developed using machine learning, include Door Detection for users who are blind or have low vision; Live Captions for those who are deaf or hard of hearing; and Apple Watch Mirroring, which lets people with physical and motor impairments control the smartwatch from an iPhone.
Let’s take a closer look at these features, starting with Door Detection, which, as the name suggests, helps iPhone and iPad users locate a door when they arrive at a new location.
The feature uses LiDAR, so it requires a device equipped with the LiDAR scanner (the iPhone 12 Pro and Pro Max, iPhone 13 Pro and Pro Max, and various iPad Pro models), and is built into the Magnifier app. It can determine whether a door is open or closed and, if the latter, how it can be opened; it also gauges the user’s distance from the door and can read any signs or characters on it (such as a room number).
Magnifier will gain a new detection mode that houses Door Detection alongside functions like People Detection and image descriptions (which describe the user’s surroundings).
People who are deaf or hard of hearing will get Live Captions on iPhone, iPad, and Mac, with captions (in adjustable font sizes) generated on-device for everything from video calls to streaming content. In FaceTime, captions are attributed to the relevant speaker in the call, and on the Mac, users can type responses and have them spoken aloud in real time.
As for device support, you’ll need a Mac with Apple silicon, an iPhone 11 or later, or an iPad with the A12 Bionic chip (or later). Initially, Live Captions will debut in beta form – so still in testing, and Apple notes that caption accuracy “may vary” – and only in English (US and Canada).
Apple’s other major accessibility revelation is Apple Watch Mirroring, which lets users control the watch from their iPhone. In other words, users can bring the iPhone’s assistive features, such as Voice Control and Switch Control, to the Apple Watch, unlocking capabilities like voice commands and head tracking on the watch.
New Quick Actions on Apple Watch also let users issue commands with simple hand gestures, such as answering (or ending) a phone call with a double pinch.
Note that you will need an Apple Watch Series 6 or newer to benefit from the mirroring function.
Analysis: more to come, including new VoiceOver languages and Buddy Controllers
There’s a lot of thoughtful work here, and more is on the way to push accessibility even further.
For example, Apple has also been busy adding support for over 20 new languages in VoiceOver, its screen reader, with dozens of new voices also in the works.
There’s also an incoming Siri Pause Time feature, which lets people with speech impairments extend how long Siri waits before responding to a request, and Buddy Controllers, through which a friend can be invited to help the user play a game, with both controllers directing the action in a single-player title.
As a reminder, all of this will come to Apple devices later this year via software updates. Also, keep in mind that Apple advises against relying on features such as Door Detection and Live Captions in “high risk” or emergency situations, or in circumstances where there could be a risk of injury to the user.