Apple launched a set of accessibility features on Tuesday aimed at assisting people with disabilities. The new capabilities, which will be available later this year on the iPhone, Apple Watch, and Mac, are said to combine advances in hardware, software, and machine learning to assist users who are blind or have low vision and those with physical and motor disabilities. Door Detection for iPhone and iPad users, Apple Watch Mirroring, and Live Captions are among the highlights. VoiceOver has also gained support for 20 new languages and locales, according to Apple.
Door Detection, which uses the LiDAR sensor on the latest iPhone and iPad models to help users navigate to a door, is one of the most useful accessibility tools Apple announced as part of the updates. According to the company, the feature combines LiDAR, the camera, and on-device machine learning to determine how far users are from a door and to describe its attributes, such as whether it is open or closed.
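Apple has not published implementation details, but conceptually the distance estimate could amount to taking a robust statistic over the depth readings inside the detected door's bounding box. The sketch below is purely illustrative (the function name, data layout, and bounding-box convention are all assumptions, not Apple's API); a real pipeline would consume LiDAR depth frames and a trained object detector.

```python
from statistics import median

def estimate_door_distance(depth_map, bbox):
    """Estimate the distance to a detected door, in metres.

    depth_map: 2-D list of per-pixel depth readings (metres),
               e.g. derived from a LiDAR sensor.
    bbox: (top, left, bottom, right) pixel bounds of the door,
          as produced by an object detector (hypothetical).
    The median makes the estimate robust to noisy pixels and to
    readings that punch through an open doorway.
    """
    top, left, bottom, right = bbox
    samples = [
        depth_map[row][col]
        for row in range(top, bottom)
        for col in range(left, right)
    ]
    return median(samples)

# Toy 4x4 depth map: a door occupies the 2x2 centre at roughly 2 m,
# surrounded by a wall at 5 m.
depth = [
    [5.0, 5.0, 5.0, 5.0],
    [5.0, 2.0, 2.1, 5.0],
    [5.0, 1.9, 2.0, 5.0],
    [5.0, 5.0, 5.0, 5.0],
]
print(estimate_door_distance(depth, (1, 1, 3, 3)))  # → 2.0
```

The same idea extends naturally to the open/closed attribute: depth readings through an open doorway would be markedly deeper than the surrounding wall plane.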
Door Detection can also tell users how a closed door can be opened: by pushing, turning a knob, or pulling a handle. It is said to read signs and symbols near the door, such as the room number, and even to recognise the presence of an accessible entrance symbol.
The Door Detection feature will be part of the pre-installed Magnifier app and will work on the iPhone 13 Pro, iPhone 13 Pro Max, iPhone 12 Pro, iPhone 12 Pro Max, iPad Pro 11-inch (2020 and 2021), and iPad Pro 12.9-inch (2020 and 2021).
The Door Detection feature will be accessible through a new Detection Mode in Apple’s Magnifier app. People Detection and Image Descriptions will be two new capabilities that can function alone or in tandem with Door Detection to aid people who are visually impaired or have low vision.
In addition to the Magnifier enhancements, the company revealed that Apple Maps will offer audible and tactile feedback for users who have enabled VoiceOver, to help them identify the starting point for walking directions.
Apple Watch owners will be able to control their smartwatch remotely from their paired iPhone thanks to the new Apple Watch Mirroring feature. Users will be able to control the Apple Watch with the iPhone's assistive features, such as Voice Control and Switch Control, and, instead of tapping the Apple Watch display, use voice commands, sound actions, head tracking, and even external Made for iPhone switches. All of this will benefit people with physical and motor disabilities.
Apple Watch Mirroring, according to Apple, relies on system-level hardware and software integration, including advances built on AirPlay, to let users benefit from capabilities such as Blood Oxygen and Heart Rate tracking, as well as the Mindfulness app. The mirroring feature will be supported on Apple Watch Series 6 and later.
A double-pinch gesture will also be available to Apple Watch users, building on Apple Watch's AssistiveTouch. Users will be able to use the double-pinch gesture to answer or end a phone call, dismiss a notification, take a photo, play or pause media in the Now Playing app, and start, pause, or resume a workout.
Apple also unveiled Live Captions on the iPhone, iPad, and Mac for users who are deaf or hard of hearing. It will be available in English as a beta later this year for users in the United States and Canada on the iPhone 11 and later, iPad models with the A12 Bionic and later, and Macs with Apple silicon.
According to the company, Live Captions will function with any audio content, including phone and FaceTime calls, video conferencing or social media apps, streaming media content, and even when users are conversing with someone nearby.
Users can adjust the text size to make captions easier to read. FaceTime will also attribute auto-transcribed dialogue to individual call participants, making it easier for users with hearing impairments to follow conversations during video chats.
According to Apple, Live Captions on the Mac will let users type a response and have it spoken aloud in real time to other participants in the conversation. The company also stated that Live Captions are generated on the device, keeping users' information private and secure.
The new languages, locales, and voices will also be available to the Speak Selection and Speak Screen features. Additionally, VoiceOver on the Mac will work with the new Text Checker tool, which flags formatting issues such as duplicated spaces and misplaced capital letters.
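Apple has not described how Text Checker works internally, but the two checks named above are simple to illustrate. The following sketch (function name and return format are hypothetical, not Apple's implementation) flags both issues with regular expressions:

```python
import re

def check_formatting(text):
    """Flag two common formatting slips: duplicated spaces and a
    lowercase letter starting a new sentence (a misplaced capital).
    Returns a list of (character_position, description) issues.
    """
    issues = []
    # Two or more consecutive spaces.
    for m in re.finditer(r"  +", text):
        issues.append((m.start(), "duplicated space"))
    # A sentence should start with a capital after '.', '!' or '?'.
    for m in re.finditer(r"[.!?] +([a-z])", text):
        issues.append((m.start(1), "sentence starts with lowercase letter"))
    return issues

print(check_formatting("Hello  world. this is fine."))
# → [(5, 'duplicated space'), (14, 'sentence starts with lowercase letter')]
```

A screen reader can then announce each flagged position, which is the kind of feedback loop that makes such a tool useful alongside VoiceOver.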
Apple announced the new accessibility features to commemorate Global Accessibility Awareness Day this week. They include Siri Pause Time, which lets users adjust how long the voice assistant waits before responding to a request; Buddy Controller, which lets users ask a care provider or friend to help them play a game; and customisable Sound Recognition, which is said to recognise sounds specific to a person's environment, such as their home's unique alarm, doorbell, or appliances.
The preloaded Apple Books app will gain new themes and customisation options, such as bolding text and adjusting line, character, and word spacing, to give users a more accessible reading experience. In addition, starting this week, a new Accessibility Assistant shortcut in the Shortcuts app on the Mac and Apple Watch will recommend accessibility features based on user preferences.
Park Access for All, a new resource from the National Park Foundation, will be available in Apple Maps to help people discover accessible attractions, programmes, and services in parks across the United States. Apple Maps will also highlight businesses and organisations that value, embrace, and prioritise the Deaf community and sign languages.
Users can also find accessibility-focused apps and stories from developers in the App Store, as well as stories by and about people with disabilities in Apple Books' Transforming Our World collection. Apple Music will also highlight the Saylists playlists, each of which focuses on a different sound.
Similarly, the Apple TV app will feature the latest hit movies and shows with authentic representations of people with disabilities.
Users will also have access to guest-curated collections from notable members of the accessibility community, including Marlee Matlin (“CODA”), Lauren Ridloff (“Eternals”), Selma Blair (“Introducing, Selma Blair”), and Ali Stroker (“Christmas Ever After”).
This week on Apple Fitness+, trainer Bakari Williams will use American Sign Language (ASL) to highlight features such as Audio Hints, short descriptive verbal cues that assist users who are blind or have low vision. In addition, Time to Walk and Time to Run episodes are becoming “Time to Walk or Push” and “Time to Run or Push” for wheelchair users.
Every workout and meditation on Apple Fitness+ will support ASL, and all videos will include closed captioning in six languages. Trainers will also demonstrate adaptations in each workout to help those with disabilities participate.
Apple is also expanding SignTime, which connects Apple Store and Apple Support customers with on-demand sign language interpreters. SignTime is already available to customers in the United States who use ASL, in the United Kingdom who use British Sign Language (BSL), and in France who use French Sign Language (LSF). Furthermore, Apple Store locations around the world have begun offering live sessions throughout the week to help customers learn about iPhone accessibility features, and Apple Support social channels are highlighting how-to content, according to the company.
Aryan Jakhar is pursuing a Bachelor’s degree in Journalism and Mass Communication. He is a passionate blogger who understands the power of words.
Aryan is currently working as editor-in-chief at TheShiningMedia.in and is reachable at [email protected]