On Wednesday, Apple announced new accessibility features for the iPhone and iPad, slated to arrive later this year.
The company said the Eye Tracking feature is designed to let users with physical disabilities control their iPad or iPhone with their eyes. "We're continuously pushing the boundaries of technology, and these new features reflect our long-standing commitment to delivering the best possible experience to all of our users," said Tim Cook, Apple's CEO.
Music Haptics will offer a new way for users who are deaf or hard of hearing to experience music using the Taptic Engine in the iPhone, while Vocal Shortcuts will let users perform tasks by making a custom sound. Vehicle Motion Cues can help reduce motion sickness when using an iPhone or iPad in a moving vehicle. Apple also said that more accessibility features are coming to visionOS.
Eye Tracking on iPad and iPhone
Apple said Eye Tracking gives users a built-in option for navigating iPad and iPhone with just their eyes. Designed for users with physical disabilities, Eye Tracking uses the front-facing camera to set up and calibrate quickly. With on-device machine learning, all data used to set up and control the feature is kept securely on the device and isn't shared with anyone.
Music Haptics
Music Haptics is a new way for users who are deaf or hard of hearing to experience music on the iPhone. With the feature turned on, the Taptic Engine in the iPhone plays taps, textures, and refined vibrations in sync with the music. The feature works across the Apple Music catalog and will be available as an API for developers to make music more accessible in their apps.
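Apple hasn't published details of the Music Haptics API yet, so it isn't clear what the developer interface will look like. As a rough illustration of the underlying idea, here is a minimal sketch using the existing Core Haptics framework to play transient Taptic Engine "taps" at beat timestamps. The BeatHaptics class, the beat times, and the intensity values are all hypothetical placeholders, not part of Apple's announced API.

```swift
import CoreHaptics

// Illustrative sketch only: plays a transient tap for each hypothetical
// beat timestamp using Core Haptics. This is not the Music Haptics API,
// just the general idea of mapping musical events to Taptic Engine output.
final class BeatHaptics {
    private var engine: CHHapticEngine?

    init() throws {
        // Core Haptics requires supporting hardware.
        guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }
        engine = try CHHapticEngine()
        try engine?.start()
    }

    /// Plays one transient tap per beat time (in seconds), with a per-beat intensity.
    func play(beatTimes: [TimeInterval], intensities: [Float]) throws {
        guard let engine else { return }
        let events = zip(beatTimes, intensities).map { time, strength in
            CHHapticEvent(
                eventType: .hapticTransient,
                parameters: [
                    CHHapticEventParameter(parameterID: .hapticIntensity, value: strength),
                    CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.6),
                ],
                relativeTime: time
            )
        }
        let pattern = try CHHapticPattern(events: events, parameters: [])
        let player = try engine.makePlayer(with: pattern)
        try player.start(atTime: CHHapticTimeImmediate)
    }
}

// Hypothetical usage, with beat times assumed to come from audio analysis:
// let haptics = try BeatHaptics()
// try haptics.play(beatTimes: [0.0, 0.5, 1.0, 1.5], intensities: [1.0, 0.6, 0.9, 0.6])
```

In a real implementation, the beat times and intensities would presumably be derived from the song itself; the announcement doesn't say how Apple synchronizes haptics with Apple Music playback.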
Other Accessibility Features
With Vocal Shortcuts, users can perform tasks by making a custom sound, while Vehicle Motion Cues can help reduce motion sickness when using an iPhone or iPad in a moving vehicle. Another addition, Listen for Atypical Speech, uses on-device machine learning to enhance speech recognition for a wider range of speech patterns.
These features give users with acquired or progressive conditions that affect speech, such as cerebral palsy, ALS, or stroke, a new level of customization and control, building on features introduced in earlier versions of iOS for users who are nonspeaking or at risk of losing their ability to speak.
Additional Features on visionOS
Apple also announced accessibility features coming to visionOS, including systemwide Live Captions to help users who are deaf or hard of hearing follow spoken dialogue in live conversations and in audio from apps.
With Live Captions for FaceTime in visionOS, more users can easily enjoy the unique experience of connecting and collaborating on the platform. Apple Vision Pro will also add the ability to move captions during immersive video, as well as expanded support for Made for iPhone hearing devices and cochlear implants.