Apple’s Live Captions Is Part of a Larger Accessibility Update
The Executive Headlines
Ahead of Global Accessibility Awareness Day (May 19th), Apple is announcing assistive updates in honor of the occasion. The company is bringing new features across iPhone, iPad, Mac and Apple Watch, and the most intriguing of the lot is a systemwide Live Captions feature.
Apple’s Live Captions
Similar to Google’s implementation on Android, Apple’s Live Captions will transcribe audio playing on your iPhone, iPad or Mac in real time, displaying subtitles onscreen. It will also caption sounds around you, so you can use it to follow along with conversations in the real world. You’ll be able to adjust the size and position of the caption box, as well as choose different font sizes for the words. The transcription is generated on-device, too.
But unlike on Android, Live Captions on FaceTime calls will also clearly distinguish between speakers, using icons and names to attribute what’s being said. Plus, those using Macs will be able to type a response and have it spoken aloud in real time for others in the conversation. Live Captions will be available as a beta in English for users in the US and Canada.
Updating Existing Recognition Tools
Apple is also updating its existing Sound Recognition tool, which lets iPhones continuously listen for noises like alarms, sirens, doorbells or crying babies. With a coming update, users will be able to train their iPhones or iPads to listen for custom sounds, like your washing machine’s “I’m done” chime or, perhaps, your pet duck quacking. A new feature called Siri Pause Time will also let you extend the assistant’s wait time when you’re responding or asking for something, so you can take your time to finish saying what you need.
The company is also updating its Magnifier app, which helps people who are visually impaired better interact with the people and objects around them. Expanding on the earlier People Detection tool, which told users how far away others were, Apple is adding a new Door Detection feature. It will use the iPhone’s LiDAR scanner and camera not only to locate and identify doors, but also to read out text and symbols on display, like hours of operation and signs marking restrooms or accessible entrances.