You’ll soon be able to control iPhone, iPad with your eyes, Apple says. Here’s how


Apple announced new accessibility features that offer a sneak peek into the tech giant’s AI plans. These include Eye Tracking, which uses artificial intelligence (AI) to help users control their iPhone and iPad using only their eyes. The feature has been designed especially for people with physical disabilities, and setting it up is quick and easy: users just need to look at the front-facing camera for a few seconds to calibrate it, Apple said.

The feature works on both iPadOS and iOS, and it doesn’t need any extra hardware or accessories, Apple said.

Using Eye Tracking, users can navigate within apps, activate elements with Dwell Control, press buttons, swipe and perform other gestures.


Apple also announced another feature called Listen for Atypical Speech, which uses on-device machine learning to help Siri understand a wider range of voices.

Apple said, “These features combine the power of Apple hardware and software, harnessing Apple silicon, artificial intelligence, and machine learning to further Apple’s decades-long commitment to designing products for everyone.” The features will be available “later this year,” most likely with the iOS 18 and iPadOS 18 fall updates, the company said.
