Apple has announced a batch of new accessibility features across its operating systems, making its devices easier to use for people with disabilities.
Live Captions is coming to Apple devices, including the iPhone, iPad, and Mac. The feature, which the company will soon begin beta testing, transcribes any audio content on the device, including FaceTime calls, video conferencing in compatible apps (with each speaker identified), video streaming, and even in-person conversations. At launch, Live Captions will be available in English only.
For context, Google has offered a similar feature, Live Caption, since Android 10; it is available on Pixel devices and select other Android phones, and on the Pixel 6 and Pixel 6 Pro it can transcribe some languages beyond English. In that respect, Apple is playing catch-up.
Apple says all transcription happens on the device, meaning your audio is never processed or stored anywhere else. Google's implementation works the same way.
The beta program will start in the US and Canada later this year. Live Captions will be supported on the iPhone 11 and later, iPads with the A12 Bionic chip or newer, and Macs with Apple Silicon.
The Apple Watch is not left out either, as Apple is expanding its gesture-based AssistiveTouch controls. The feature arrived last year, and Apple is now adding a double-pinch gesture that can end a call, dismiss notifications, take a photo, play or pause media, or start a workout session.
Apple has also added features that make the Apple Watch easier to use for people with physical and motor disabilities. With a new mirroring feature, the watch can now be controlled from a paired iPhone. The feature is essentially a modified form of AirPlay, letting you operate the watch from the iPhone's larger screen.
Apple is also updating the Sound Recognition feature it introduced with iOS 14, which can identify sounds such as a smoke alarm or running water, a help to people with hearing disabilities. The update adds customization: users can tune Sound Recognition to listen for sounds specific to their environment, such as their home's doorbell or appliances, and receive notifications when those sounds are detected.
There are also updates for Apple's VoiceOver screen reader, along with Speak Selection and Speak Screen. They will gain support for more than 20 additional languages and locales, including Arabic (World), Basque, Bengali (India), Bhojpuri (India), Bulgarian, Catalan, Croatian, Farsi, French (Belgium), Galician, Kannada, Malay, Mandarin (Liaoning, Shaanxi, Sichuan), Marathi, Shanghainese (China), Spanish (Chile), Slovenian, Tamil, Telugu, Ukrainian, Valencian, and Vietnamese.
Apple also revealed that the iPhone and iPad will use their LiDAR sensors and cameras for Door Detection, with all processing done on the device itself. The feature helps users locate a door when arriving at a new location, and it will also say whether the door is open or closed and whether it has a knob.
Apple says customers can learn more about these and other updates at Apple Stores, where they can also try demos of the new features.