New iOS 17 Accessibility features: Assistive Access, Personal Voice, and Live Speech


While we're all pumped for WWDC 2023, Apple has surprised us by sharing a preview of the new accessibility features coming in iOS 17. The announcement was all the more significant because it landed just before Global Accessibility Awareness Day.

Apple has given fans a first look at some of the features that will roll out later this year. So, before we get hands-on with them, let's take a quick tour of the new iOS 17 accessibility features.

Apple previews new iOS 17 accessibility features

Improving people's lives has always been central to Apple's mission, and its accessibility features support a wide range of users with disabilities. The new features focus on cognitive, vision, hearing, and mobility accessibility, incorporating on-device machine learning.

Apple CEO Tim Cook highlighted, “Today, we’re excited to share incredible new features that build on our long history of making technology accessible so that everyone has the opportunity to create, communicate, and do what they love.”

Let’s delve into the upcoming iPhone features and learn what Apple offers.

1. Assistive Access

Assistive Access in iOS 17
Image credit: Apple

Assistive Access is a cognitive accessibility feature that will let users operate the iPhone and iPad more easily and independently. It streamlines app interfaces and highlights essential elements, reducing cognitive load. Apple designed the feature in close collaboration with people with cognitive disabilities.

Essential apps like Camera, Photos, Music, Calls, and Messages will get high-contrast buttons, large text labels, and options that can be tuned to individual preferences. So, whether you prefer a visual, grid-based layout or a text-based interface, Assistive Access offers a personalized experience that makes the device easier to use independently.

Moreover, Apple has combined Phone and FaceTime into a single Calls app for easy access. In Messages, users with cognitive disabilities can communicate visually thanks to an emoji-only keyboard and the option to record a video message to send to loved ones.
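To picture the kind of simplified, grid-based layout with high-contrast buttons and large labels that Apple is describing, here is a minimal SwiftUI sketch. It is purely illustrative and built on my own assumptions about names and styling; Assistive Access itself is a system-level experience, not something third-party code recreates.

```swift
import SwiftUI

// Illustrative only: a simplified, high-contrast grid of large-label buttons,
// loosely inspired by the grid layout described for Assistive Access.
// This is not Apple's Assistive Access code.
struct SimplifiedHomeView: View {
    // Hypothetical list of essential apps to show in the grid.
    let items = ["Calls", "Messages", "Camera", "Photos", "Music"]

    // One adaptive column spec lets the grid wrap large tiles as needed.
    let columns = [GridItem(.adaptive(minimum: 150), spacing: 24)]

    var body: some View {
        LazyVGrid(columns: columns, spacing: 24) {
            ForEach(items, id: \.self) { item in
                Button {
                    print("Open \(item)")                       // placeholder action
                } label: {
                    Text(item)
                        .font(.system(size: 32, weight: .bold)) // large text label
                        .frame(maxWidth: .infinity, minHeight: 140)
                }
                .foregroundColor(.white)                        // high-contrast styling
                .background(Color.black)
                .cornerRadius(16)
            }
        }
        .padding()
    }
}
```

The point of the sketch is simply that fewer, bigger, bolder targets reduce the number of decisions a user has to make on each screen.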

2. Live Speech and Personal Voice Advance Speech Accessibility

Live Speech and Personal Voice Advance Speech Accessibility in iOS 17
Image credit: Apple

For speech accessibility, Apple has announced Live Speech and Personal Voice to help users who are at risk of losing their ability to speak, for instance, people living with ALS (amyotrophic lateral sclerosis).

Using Live Speech, you can type what you want to say during phone and FaceTime calls, and your iPhone will speak it out loud for you. You can also save commonly used phrases for quick access during conversations.
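Conceptually, the "type it and the phone says it" flow rests on text-to-speech, something developers can already reach through Apple's AVSpeechSynthesizer API. The sketch below shows that general building block with a hypothetical saved-phrases list; it is not Live Speech itself, which is a built-in system feature.

```swift
import AVFoundation

// A rough sketch of the "type it, the phone speaks it" idea behind Live Speech.
// Live Speech is a system feature; this only shows the general text-to-speech
// building block (AVSpeechSynthesizer) that any app can already use.
let synthesizer = AVSpeechSynthesizer()

// Hypothetical saved phrases, similar in spirit to Live Speech's quick phrases.
let savedPhrases = ["On my way.", "Can you give me a minute?", "Thank you!"]

func speak(_ text: String) {
    let utterance = AVSpeechUtterance(string: text)
    utterance.voice = AVSpeechSynthesisVoice(language: "en-US") // pick a system voice
    utterance.rate = AVSpeechUtteranceDefaultSpeechRate         // default speaking pace
    synthesizer.speak(utterance)
}

speak(savedPhrases[0])   // the device reads the phrase aloud
```

What the system feature adds beyond a sketch like this is the plumbing: surfacing the typed text during phone and FaceTime calls and keeping your saved phrases one tap away.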

Personal Voice, on the other hand, lets you create a synthesized voice that sounds like you. You read along with a set of text prompts to record 15 minutes of audio on your iPhone, and the resulting voice then plugs into Live Speech. And, as always, Apple stresses that the processing happens on-device to keep this information private.

3. Point and Speak

Magnifier's Detection Mode will get a new vision accessibility feature called Point and Speak, designed for users who are blind or have low vision. When you point at text, say, the buttons on a household appliance, your iPhone recognizes it and reads it aloud, helping you interact with physical objects that carry text labels.

Point and Speak uses input from the Camera app, the LiDAR Scanner, and on-device machine learning. It works with VoiceOver and can be used alongside other Magnifier features like People Detection, Door Detection, and Image Descriptions to help users navigate their physical surroundings.

Apple accessibility Magnifier Point and Speak in iOS 17
Image credit: Apple
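To make the "recognize the text, then read it aloud" idea concrete, here is a small sketch that combines Apple's Vision framework for on-device text recognition with AVSpeechSynthesizer for speech. It is only an illustration of the general technique under my own assumptions; Point and Speak itself lives inside Magnifier and also leans on the LiDAR Scanner, which this snippet does not touch.

```swift
import Vision
import AVFoundation
import CoreGraphics

// Illustrative only: on-device text recognition (Vision) piped into
// text-to-speech (AVSpeechSynthesizer). Point and Speak is a system feature
// that also uses the LiDAR Scanner; this just shows the basic idea.
let synthesizer = AVSpeechSynthesizer()

func readTextAloud(in image: CGImage) {
    let request = VNRecognizeTextRequest { request, error in
        guard error == nil,
              let observations = request.results as? [VNRecognizedTextObservation] else { return }

        // Keep the best candidate string from each detected text region.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        guard !lines.isEmpty else { return }

        let utterance = AVSpeechUtterance(string: lines.joined(separator: ". "))
        synthesizer.speak(utterance)
    }
    request.recognitionLevel = .accurate   // favor accuracy over speed

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}
```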

Cheers to empowerment!

The upcoming features in iOS 17 reflect Apple's commitment to inclusivity and to empowering disabled users worldwide. Apple's collaboration with disability communities helps ensure these features address real-life challenges, while on-device machine learning protects user privacy. I can't wait to test the new features and see the difference they make. What about you?


Author Profile

Ava is an enthusiastic consumer tech writer coming from a technical background. She loves to explore and research new Apple products & accessories and help readers easily decode the tech. Along with studying, her weekend plan includes binge-watching anime.
