Apple has announced that it plans to equip its iPhones and iPads in the coming months with additional features that make them easier to use for people with physical and cognitive disabilities.
Other large tech companies have also been trying for years to make their devices as usable as possible for people with physical disabilities. Apple has long led by example here, and the innovations the company is announcing for the coming months show just how much know-how goes into these accessibility features.
For the first time, Apple is also offering something for people with cognitive disabilities: Assistive Access. This feature provides a highly simplified layout, limiting the user to only the most important functions: music, phone, messages, photos, and the camera. Third-party apps such as YouTube or Candy Crush can also be added.
The apps themselves are also designed to be very simple and easy to use. The messaging app, for example, supports communication through text, large emojis, or short selfie videos.
Live Speech and Personal Voice should also prove very useful for people with speech disabilities. Live Speech converts typed text into natural-sounding speech.
The feature is built into the Phone and FaceTime apps, so audio calls are possible this way even if the person on the call can't talk. Frequently used phrases and sentences can be saved so they can be spoken directly when needed.
People at risk of losing their voice will soon be able to train their iPhone to reproduce the sound of their own voice. The feature is called Personal Voice and requires users to speak predefined phrases into the iPhone for about 15 minutes.
The speech output of Live Speech will then use the trained voice. A demo of the feature we were able to listen to already sounded pretty good, though it was still clearly audible that the voice was artificially generated; it is certainly not suitable for fooling biometric security systems.
Notably, the trained voice remains on the device it was created on and does not end up on an Apple server. It can only be transferred via iCloud to other devices on the same Apple account if the user actively consents.
In addition, Apple is introducing a hugely useful feature in its iOS mobile operating system for people with low or severely limited vision: Detection Mode, an extension of the Magnifier app.
Magnifier is already built into iOS and offers users all kinds of options to enlarge their surroundings or display them with higher contrast or certain color filters. Detection Mode now aims to alleviate one of the biggest daily challenges for visually impaired people: device control panels without tactile buttons.
Touch displays of any kind cannot be operated by blind people without help, or only with great difficulty: while they can memorize the position of individual functions on devices with tactile switches, a touch display is just a large, featureless smooth surface.
Detection Mode works like this: the user stands in front of a control panel, such as a microwave oven with a touch screen, activates Detection Mode, and with one hand aligns the iPhone so that the camera points at the panel. With the other hand, the user runs a finger over the control panel and over what appears to be a button.
The iPhone recognizes which button the finger is over and reads the corresponding label aloud. If the user moves the finger and stops elsewhere, the iPhone reads the label there. In this way, operation should also be possible without sight.
So far there is only one video of the feature; how well it works in everyday life remains to be seen later this year. According to Apple's description, it will only work on the iPhone 12 Pro, 13 Pro, and 14 Pro, as it relies on the LiDAR scanner, which is only installed in these devices.
Apple also announced other minor accessibility features. Exactly when they will be delivered is not known; it is conceivable they will arrive with the new iOS version in the fall.
All information about Apple’s accessibility features can also be found at apple.com/de/accessibility on Apple’s website.
(t-online/dsc)
source: watson

I’m Maxine Reitz, a journalist and news writer at 24 Instant News. I specialize in health-related topics and have written hundreds of articles on the subject. My work has been featured in leading publications such as The New York Times, The Guardian, and Healthline. As an experienced professional in the industry, I have consistently demonstrated an ability to develop compelling stories that engage readers.