- Apple previewed accessibility features for its devices.
- New options will come “later this year”, but not to all models.
The new features are designed to showcase the integration between hardware and software, using machine learning to power the new tools.
“We’re excited to introduce these new features, which combine innovation and creativity from teams across Apple to give users more options to use our products in ways that best suit their needs and lives,” said Sarah Herrlinger, Apple’s senior director of Accessibility Policy and Initiatives.
Live Captions, Apple style
Taking a cue from Google, Apple previewed its own Live Captions feature for iOS, iPadOS, and macOS. But in its own Cupertino way, captions for social media, streaming video, FaceTime calls, and video-conferencing apps are generated on the device, without sharing data with Apple’s (or Google’s) servers.
Users will be able to customize the font size, and the feature will work even in group calls, indicating which participant said the text transcribed on the screen. Additionally, Mac users will have the option to reply by typing and have their response read aloud to others in the conversation.
Later this year, Live Captions will be available in English on the iPhone 11 or later, Macs with Apple M1 processors, and iPads with the Apple A12 SoC or newer. Apple did not say when the feature will be available in other languages.
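Apple has not published a Live Captions API, but the on-device approach it describes can already be approximated with the public Speech framework. The sketch below is purely illustrative wiring, not Apple's implementation; the APIs shown (`SFSpeechRecognizer`, `requiresOnDeviceRecognition`) are real, while the function itself is a hypothetical example.

```swift
import Speech

// Illustrative sketch: transcribe speech entirely on the device,
// mirroring the privacy approach Apple describes for Live Captions.
func startOnDeviceCaptions() {
    guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
          recognizer.supportsOnDeviceRecognition else {
        print("On-device recognition is not available on this device")
        return
    }

    let request = SFSpeechAudioBufferRecognitionRequest()
    request.requiresOnDeviceRecognition = true  // audio never leaves the device
    request.shouldReportPartialResults = true   // incremental, "live" caption updates

    recognizer.recognitionTask(with: request) { result, _ in
        if let result = result {
            // In a real app this string would be rendered as an on-screen caption.
            print(result.bestTranscription.formattedString)
        }
    }
    // Audio buffers captured with AVAudioEngine would be appended to `request` here.
}
```

Setting `requiresOnDeviceRecognition` makes recognition fail rather than fall back to Apple's servers, which is the key privacy trade-off the article highlights.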
Door Detection
Another welcome feature coming to iPhones and iPads is the new Door Detection tool for users with visual impairments. The feature will use the LiDAR scanner on compatible devices — iPhone 12 Pro/Pro Max, 13 Pro/Pro Max, iPad Pro (2020 and 2021) — to locate doors and tell the user how far away they are.
Apple’s Door Detection will even tell the user whether the door is open, how to open it, and read nearby text such as opening hours and room numbers. The new option will live in the Magnifier tool, complementing options like People Detection and Image Descriptions, besides integrating with Apple Maps.
AirPlay Mirroring for Apple Watch
In a curious twist, Apple Watch users will soon be able to control the smartwatch from their iPhones. More than simply offering a bigger screen, the feature will let users rely on voice commands and even head tracking to control the watch.
watchOS will also include the option to trigger actions like answering a message or playing media using a new double-pinch gesture, without needing to tap the watch display.
Other new features
Apple also highlighted a few other features coming soon to its devices:
- More languages for the VoiceOver tool, including Catalan, Ukrainian, Farsi, Mandarin, Bengali, Tamil, Telugu, and more.
- Buddy Controller, which combines two game controllers into one so a friend can help a player, similar to the Copilot option on Xbox.
- Siri Pause Time will let users adjust how long Siri waits before responding to a command.
- Voice Control Spelling mode will enable letter-by-letter dictation support (initially only in English).
- Sound Recognition will be customizable to unique sounds like alarms and doorbells.
- Apple Books will include more rendering options for a more accessible reading experience.
According to Apple, compatible devices will receive the new features with software updates coming later this year, with no defined schedule as of publishing time.
Did you like Apple’s new features? Do you know or use apps that offer similar options? Feel free to share your tips in the comments below!