Apple iOS eye tracking will let you control an iPhone with your eyes

New features coming to Apple’s iOS 18 later this year will allow users to control their iPhone or iPad with their eyes and offer a new way for deaf or hard of hearing users to experience music.

According to a blog post from Apple, the new accessibility features will be released in the coming months, but the company didn’t give a specific date when FOX TV Stations inquired.

"These new features will make an impact in the lives of a wide range of users, providing new ways to communicate, control their devices, and move through the world," Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives, said in the blog post. 


Here’s a look at what the new features can do. 

Apple Eye Tracking

Apple’s Eye Tracking is designed for people with physical disabilities. It uses artificial intelligence and the front-facing camera to calibrate in seconds, allowing users to navigate their iPhone and iPad with just their eyes.

Apple’s new accessibility features are designed for people with disabilities (Photo by Silas Stein/picture alliance via Getty Images)

The feature won’t require additional hardware or accessories, Apple says. 
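
Apple hasn’t said how Eye Tracking is built under the hood. As a rough illustration of the kind of gaze signal the front-facing camera already exposes to developers, here is a minimal Swift sketch using ARKit’s existing face tracking, which reports an estimated look-at point on supported devices. This is an assumption for illustration only, not the Eye Tracking feature itself.

```swift
import ARKit

// Hypothetical sketch: reading an estimated gaze point with ARKit face
// tracking on devices with a TrueDepth front camera. Apple has not said
// Eye Tracking uses ARKit; this only shows the kind of signal the
// front-facing camera can provide.
final class GazeReader: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // lookAtPoint estimates where the eyes converge, in face coordinate space.
            let gaze = face.lookAtPoint
            print("Gaze point: \(gaze)")
        }
    }
}
```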

"Users can navigate through the elements of an app and use Dwell Control to activate each element, accessing additional functions such as physical buttons, swipes, and other gestures solely with their eyes," Apple says. 

Apple Music Haptics 

Music Haptics will allow people who are deaf or hard of hearing to experience music. The Taptic Engine in the iPhone will play taps, textures and vibrations tuned to music’s audio. It will work with millions of songs in Apple Music and will also be made available for developers to make music more accessible in their own apps.
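
Apple had not published a Music Haptics developer API at the time of the announcement. For a feel of the taps and textures the Taptic Engine can already produce, here is a minimal sketch using the existing Core Haptics framework; the event timings and intensities are made up for illustration.

```swift
import CoreHaptics

// Minimal sketch: plays a sharp transient "tap" followed by a softer
// continuous "texture" with Core Haptics. It illustrates the kind of output
// the Taptic Engine produces; it is not Apple's Music Haptics API.
final class SampleHaptics {
    private var engine: CHHapticEngine?   // keep the engine alive while playing

    func play() throws {
        guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }

        let engine = try CHHapticEngine()
        try engine.start()
        self.engine = engine

        // A sharp tap at t = 0, then a softer sustained vibration.
        let tap = CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.8)
            ],
            relativeTime: 0
        )
        let texture = CHHapticEvent(
            eventType: .hapticContinuous,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.5),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.3)
            ],
            relativeTime: 0.2,
            duration: 1.0
        )

        let pattern = try CHHapticPattern(events: [tap, texture], parameters: [])
        let player = try engine.makePlayer(with: pattern)
        try player.start(atTime: CHHapticTimeImmediate)
    }
}
```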


New Apple features for people with ALS, cerebral palsy

Apple’s new Vocal Shortcuts will allow users to assign custom utterances that Siri can recognize to launch shortcuts and complete other tasks.

Listen for Atypical Speech uses on-device machine learning to recognize a user’s speech patterns, giving them a new level of control. It’s designed for people with conditions that affect speech, such as cerebral palsy, amyotrophic lateral sclerosis (ALS) or stroke.
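
Apple hasn’t detailed how Listen for Atypical Speech works internally. The on-device aspect the description mentions can be illustrated with the existing Speech framework, which already supports keeping recognition entirely on the device; the function below is a hypothetical sketch, not the new feature’s API.

```swift
import Speech

// Hedged sketch: on-device speech recognition with the existing Speech
// framework. Listen for Atypical Speech is a system-level accessibility
// feature whose API Apple has not published; this only illustrates the
// "on-device" recognition idea the article describes.
func recognizeOnDevice(from url: URL) {
    SFSpeechRecognizer.requestAuthorization { status in
        guard status == .authorized,
              let recognizer = SFSpeechRecognizer(),
              recognizer.supportsOnDeviceRecognition else { return }

        let request = SFSpeechURLRecognitionRequest(url: url)
        request.requiresOnDeviceRecognition = true   // keep audio on the device

        recognizer.recognitionTask(with: request) { result, _ in
            if let result = result, result.isFinal {
                print("Transcription: \(result.bestTranscription.formattedString)")
            }
        }
    }
}
```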
