Apple announces new accessibility features, including Eye Tracking (2024)


PRESS RELEASE May 15, 2024

Coming later this year, Apple’s new accessibility features include Eye Tracking, a way for users to navigate iPad and iPhone with just their eyes.

CUPERTINO, CALIFORNIA - Apple today announced new accessibility features coming later this year, including Eye Tracking, a way for users with physical disabilities to control iPad or iPhone with their eyes. Additionally, Music Haptics will offer a new way for users who are deaf or hard of hearing to experience music using the Taptic Engine in iPhone; Vocal Shortcuts will allow users to perform tasks by making a custom sound; Vehicle Motion Cues can help reduce motion sickness when using iPhone or iPad in a moving vehicle; and more accessibility features will come to visionOS. These features combine the power of Apple hardware and software, harnessing Apple silicon, artificial intelligence, and machine learning to further Apple’s decades-long commitment to designing products for everyone.

“We believe deeply in the transformative power of innovation to enrich lives,” said Tim Cook, Apple’s CEO. “That’s why for nearly 40 years, Apple has championed inclusive design by embedding accessibility at the core of our hardware and software. We’re continuously pushing the boundaries of technology, and these new features reflect our long-standing commitment to delivering the best possible experience to all of our users.”

“Each year, we break new ground when it comes to accessibility,” said Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives. “These new features will make an impact in the lives of a wide range of users, providing new ways to communicate, control their devices, and move through the world.”

Eye Tracking Comes to iPad and iPhone

Powered by artificial intelligence, Eye Tracking gives users a built-in option for navigating iPad and iPhone with just their eyes. Designed for users with physical disabilities, Eye Tracking uses the front-facing camera to set up and calibrate in seconds, and with on-device machine learning, all data used to set up and control this feature is kept securely on device, and isn’t shared with Apple.

Eye Tracking works across iPadOS and iOS apps, and doesn’t require additional hardware or accessories. With Eye Tracking, users can navigate through the elements of an app and use Dwell Control to activate each element, accessing additional functions such as physical buttons, swipes, and other gestures solely with their eyes.
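Apple has not published implementation details, but the dwell interaction described above amounts to a simple timer: when the gaze point rests within an element's bounds for a set duration, that element is activated. A minimal conceptual sketch in Python (the element names, rectangle model, and half-second dwell threshold are illustrative assumptions, not Apple's values):

```python
from dataclasses import dataclass

@dataclass
class Element:
    """A hypothetical on-screen UI element with a rectangular hit area."""
    name: str
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

class DwellControl:
    """Activate an element once the gaze rests on it for `dwell_s` seconds."""

    def __init__(self, elements, dwell_s=0.5):
        self.elements = elements
        self.dwell_s = dwell_s
        self.current = None   # element the gaze is currently resting on
        self.elapsed = 0.0    # seconds the gaze has stayed on it

    def update(self, gaze_x, gaze_y, dt):
        """Feed one gaze sample; return the activated element, or None."""
        hit = next((e for e in self.elements if e.contains(gaze_x, gaze_y)), None)
        if hit is not self.current:
            # Gaze moved to a different target: restart the dwell timer.
            self.current, self.elapsed = hit, 0.0
            return None
        if hit is None:
            return None
        self.elapsed += dt
        if self.elapsed >= self.dwell_s:
            self.elapsed = 0.0  # reset so the element can be activated again
            return hit
        return None
```

Feeding this 30 Hz gaze samples that stay on one button for half a second would trigger a single activation; glancing away resets the timer.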

Music Haptics Makes Songs More Accessible

Music Haptics is a new way for users who are deaf or hard of hearing to experience music on iPhone. With this accessibility feature turned on, the Taptic Engine in iPhone plays taps, textures, and refined vibrations to the audio of the music. Music Haptics works across millions of songs in the Apple Music catalog, and will be available as an API for developers to make music more accessible in their apps.
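Apple has not yet documented the developer API, but conceptually Music Haptics maps the energy of the audio signal to haptic taps. A hedged Python sketch of that idea, deriving tap events from an amplitude envelope (the frame size, threshold, and mapping are illustrative assumptions, not Apple's algorithm):

```python
def haptic_events(samples, frame=1024, threshold=0.1):
    """Map an audio amplitude envelope to (frame_index, intensity) tap events.

    Each frame's peak amplitude above `threshold` becomes a tap whose
    intensity is that peak; quiet frames produce no event. This is a
    conceptual sketch, not Apple's implementation.
    """
    events = []
    for i in range(0, len(samples), frame):
        peak = max((abs(s) for s in samples[i:i + frame]), default=0.0)
        if peak >= threshold:
            events.append((i // frame, round(peak, 3)))
    return events
```

A real implementation would feed events like these to a haptic engine (on Apple platforms, something in the spirit of Core Haptics transient events), with texture and timing shaped by the music.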


New Features for a Wide Range of Speech

With Vocal Shortcuts, iPhone and iPad users can assign custom utterances that Siri can understand to launch shortcuts and complete complex tasks. Listen for Atypical Speech, another new feature, gives users an option for enhancing speech recognition for a wider range of speech. Listen for Atypical Speech uses on-device machine learning to recognize user speech patterns. Designed for users with acquired or progressive conditions that affect speech, such as cerebral palsy, amyotrophic lateral sclerosis (ALS), or stroke, these features provide a new level of customization and control, building on features introduced in iOS 17 for users who are nonspeaking or at risk of losing their ability to speak.

“Artificial intelligence has the potential to improve speech recognition for millions of people with atypical speech, so we are thrilled that Apple is bringing these new accessibility features to consumers,” said Mark Hasegawa-Johnson, principal investigator of the Speech Accessibility Project at the Beckman Institute for Advanced Science and Technology at the University of Illinois Urbana-Champaign. “The Speech Accessibility Project was designed as a broad-based, community-supported effort to help companies and universities make speech recognition more robust and effective, and Apple is among the accessibility advocates who made the Speech Accessibility Project possible.”

Vehicle Motion Cues Can Help Reduce Motion Sickness

Vehicle Motion Cues is a new experience for iPhone and iPad that can help reduce motion sickness for passengers in moving vehicles. Research shows that motion sickness is commonly caused by a sensory conflict between what a person sees and what they feel, which can prevent some users from comfortably using iPhone or iPad while riding in a moving vehicle. With Vehicle Motion Cues, animated dots on the edges of the screen represent changes in vehicle motion to help reduce sensory conflict without interfering with the main content. Using sensors built into iPhone and iPad, Vehicle Motion Cues recognizes when a user is in a moving vehicle and responds accordingly. The feature can be set to show automatically on iPhone, or can be turned on and off in Control Center.
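The behavior described above, animated dots that mirror changes in vehicle motion, can be sketched as a mapping from accelerometer readings to on-screen dot movement, with a deadband so sensor noise keeps the dots still. A minimal Python sketch (the gain and deadband values are illustrative guesses, not Apple's tuning):

```python
def dot_velocity(accel_x, accel_y, gain=20.0, deadband=0.05):
    """Map device acceleration (in g) to an on-screen dot velocity (points/s).

    Readings inside the deadband are treated as sensor noise and produce no
    dot motion; stronger accelerations move the dots proportionally, so the
    visual cue matches what the passenger's body feels.
    """
    def shape(a):
        if a > deadband:
            return (a - deadband) * gain
        if a < -deadband:
            return (a + deadband) * gain
        return 0.0

    return shape(accel_x), shape(accel_y)
```

On device, the inputs would come from the built-in motion sensors (on Apple platforms, Core Motion's accelerometer data), sampled continuously while the feature detects that the user is in a moving vehicle.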


CarPlay Gets Voice Control, More Accessibility Updates

Accessibility features coming to CarPlay include Voice Control, Color Filters, and Sound Recognition. With Voice Control, users can navigate CarPlay and control apps with just their voice. With Sound Recognition, drivers or passengers who are deaf or hard of hearing can turn on alerts to be notified of car horns and sirens. For users who are colorblind, Color Filters make the CarPlay interface visually easier to use, with additional visual accessibility features including Bold Text.


Accessibility Features Coming to visionOS

This year, accessibility features coming to visionOS will include systemwide Live Captions to help everyone — including users who are deaf or hard of hearing — follow along with spoken dialogue in live conversations and in audio from apps. With Live Captions for FaceTime in visionOS, more users can easily enjoy the unique experience of connecting and collaborating using their Persona. Apple Vision Pro will add the capability to move captions using the window bar during Apple Immersive Video, as well as support for additional Made for iPhone hearing devices and cochlear hearing processors. Updates for vision accessibility will include the addition of Reduce Transparency, Smart Invert, and Dim Flashing Lights for users who have low vision, or those who want to avoid bright lights and frequent flashing.


These features join the dozens of accessibility features already available in Apple Vision Pro, which offers a flexible input system and an intuitive interface designed with a wide range of users in mind. Features such as VoiceOver, Zoom, and Color Filters can also provide users who are blind or have low vision access to spatial computing, while features such as Guided Access can support users with cognitive disabilities. Users can control Vision Pro with any combination of their eyes, hands, or voice, with accessibility features including Switch Control, Sound Actions, and Dwell Control that can also help those with physical disabilities.

“Apple Vision Pro is without a doubt the most accessible technology I’ve ever used,” said Ryan Hudson-Peralta, a Detroit-based product designer, accessibility consultant, and cofounder of Equal Accessibility LLC. “As someone born without hands and unable to walk, I know the world was not designed with me in mind, so it’s been incredible to see that visionOS just works. It’s a testament to the power and importance of accessible and inclusive design.”

Additional Updates

  • For users who are blind or have low vision, VoiceOver will include new voices, a flexible Voice Rotor, custom volume control, and the ability to customize VoiceOver keyboard shortcuts on Mac.
  • Magnifier will offer a new Reader Mode and the option to easily launch Detection Mode with the Action button.
  • Braille users will get a new way to start and stay in Braille Screen Input for faster control and text editing; Japanese language availability for Braille Screen Input; support for multi-line braille with Dot Pad; and the option to choose different input and output tables.
  • For users with low vision, Hover Typing shows larger text when typing in a text field, and in a user’s preferred font and color.
  • For users at risk of losing their ability to speak, Personal Voice will be available in Mandarin Chinese. Users who have difficulty pronouncing or reading full sentences will be able to create a Personal Voice using shortened phrases.
  • For users who are nonspeaking, Live Speech will include categories and simultaneous compatibility with Live Captions.
  • For users with physical disabilities, Virtual Trackpad for AssistiveTouch allows users to control their device using a small region of the screen as a resizable trackpad.
  • Switch Control will include the option to use the cameras in iPhone and iPad to recognize finger-tap gestures as switches.
  • Voice Control will offer support for custom vocabularies and complex words.

Celebrate Global Accessibility Awareness Day with Apple

This week, Apple is introducing new features, curated collections, and more in celebration of Global Accessibility Awareness Day:

  • Throughout the month of May, select Apple Store locations will host free sessions to help customers explore and discover accessibility features built into the products they love. Apple Piazza Liberty in Milan will feature the talent behind “Assume that I can,” the viral campaign for World Down Syndrome Day. And available year-round at Apple Store locations globally, Today at Apple group reservations are a place where friends, families, schools, and community groups can learn about accessibility features together.
  • Shortcuts adds Calming Sounds, which plays ambient soundscapes to minimize distractions, helping users focus or rest.
  • Visit the App Store to discover incredible apps and games that promote access and inclusion for all, including the accessible App Store Award-winning game Unpacking, apps as tools for augmentative and alternative communication (AAC), and more.
  • The Apple TV app will honor trailblazing creators, performers, and activists who passionately share the experiences of people with disabilities. This year’s theme is Remaking the World, and each story invites viewers to envision a reality where everyone is empowered to add their voice to the greater human story.
  • Apple Books will spotlight lived experiences of disability through curated collections of first-person narratives by disabled writers in ebook and audiobook formats.
  • Apple Fitness+ workouts, meditations, and trainer tips welcome users who are deaf or hard of hearing with American Sign Language, and Time to Walk now includes transcripts in the Apple Podcasts app. Fitness+ workouts always include Audio Hints to support users who are blind or have low vision, as well as modifiers so that users of all levels can participate.
  • Users can visit Apple Support to learn how their Apple devices can be customized using built-in accessibility features. From adapting the gestures to customizing how information is presented on a device’s screen, the Apple Accessibility playlist will help users learn how to personalize Apple Vision Pro, iPhone, iPad, Apple Watch, and Mac to work best for them.
About Apple

Apple revolutionized personal technology with the introduction of the Macintosh in 1984. Today, Apple leads the world in innovation with iPhone, iPad, Mac, AirPods, Apple Watch, and Apple Vision Pro. Apple’s six software platforms — iOS, iPadOS, macOS, watchOS, visionOS, and tvOS — provide seamless experiences across all Apple devices and empower people with breakthrough services including the App Store, Apple Music, Apple Pay, iCloud, and Apple TV+. Apple’s more than 150,000 employees are dedicated to making the best products on earth and to leaving the world better than we found it.

Press Contacts

Will Butler

Apple

willbutler@apple.com

Apple Media Helpline

media.help@apple.com

