Apple never fails to impress, and the most notable of the new accessibility features in iOS 18 is Eye Tracking on the iPhone. Yes, iPhones and iPads can now use one of the Apple Vision Pro's most impressive capabilities: AI-powered eye tracking that lets you navigate and operate your device with just your eyes. How awesome is that! The feature is fantastic in theory, so we tested it extensively in the real world to see how well it actually works. Here's how to enable and use iOS 18 Eye Tracking on your iPhone, whether you need it for accessibility reasons or just want to try out the new feature. Let's get started!
Supported Devices for Eye Tracking
It's important to note that not all iOS 18-compatible devices support Eye Tracking. That's right: Eye Tracking may not work on your iPhone even if you're running the latest version of iOS 18, because Apple limits the feature to the iPhone 12 and later models.
Here is the full list of devices that support iOS 18 Eye Tracking:
- iPhone 12 and iPhone 12 mini
- iPhone 12 Pro and iPhone 12 Pro Max
- iPhone 13 and iPhone 13 mini
- iPhone 13 Pro and iPhone 13 Pro Max
- iPhone 14 and iPhone 14 Plus
- iPhone 14 Pro and iPhone 14 Pro Max
- iPhone 15 and iPhone 15 Plus
- iPhone 15 Pro and iPhone 15 Pro Max
- iPhone 16, iPhone 16 Plus, iPhone 16 Pro, and iPhone 16 Pro Max
- iPad (10th generation)
- iPad Air (3rd generation and later, including the M2 models)
- iPad Pro (M4)
- iPad Pro 12.9-inch (5th generation or later)
- iPad Pro 11-inch (3rd generation or later)
- iPad mini (6th generation)
How to Enable Eye Tracking in iOS 18
Enabling the iOS 18 Eye Tracking feature is easy and only takes a few minutes to set up. Here are the steps to follow:
- Open the Settings app and go to the Accessibility section.
- Scroll down to the Physical and Motor section and choose Eye Tracking.
- On the next screen, turn on the Eye Tracking toggle and follow the on-screen instructions to set up eye tracking on your iPhone. All you need to do is focus on each colored dot as it appears.
- For optimal results, set up Eye Tracking with your iPhone or iPad on a solid surface, about 1.5 feet away from your face. Also, try not to blink while doing this.
When you're finished, remember to do the following:
- Activate Dwell Control. This lets you tap the highlighted item simply by looking at it for a short period of time.
- Don't disable the Snap to Item feature. This crucial setting draws a box around the item you're currently looking at.
- The Auto Hide option lets you set a timer for when the pointer reappears. It is set to 0.50 seconds by default, and you can choose anywhere from 0.10 to 4 seconds to suit your preference.
How to Use iOS 18 Eye Tracking on iPhone
Once you've turned on Eye Tracking, an invisible cursor on your iPhone will begin to follow your eye movements. When you gaze at an interactable item on a page or within an app, the system highlights it with a rectangular box. To tap or select an item, you only need to stare at it for a moment, after which the selected action is carried out.
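The Snap to Item behavior described above can be sketched in a few lines. This is a hypothetical illustration (not Apple's API): given a noisy gaze estimate, snapping to the nearest interactable rectangle hides small tracking errors.

```python
def snap_to_item(gaze, items):
    """Return the id of the item whose center is closest to the gaze point.

    `gaze` is an (x, y) point; `items` maps item ids to (x, y, w, h) rects.
    """
    def center_distance_sq(rect):
        x, y, w, h = rect
        cx, cy = x + w / 2, y + h / 2
        return (gaze[0] - cx) ** 2 + (gaze[1] - cy) ** 2

    # The item with the nearest center gets the highlight box.
    return min(items, key=lambda item_id: center_distance_sq(items[item_id]))
```

So a gaze landing slightly off a toggle still selects it, as long as it is the closest control on screen.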
In iOS 18, Eye Tracking also activates Apple's AssistiveTouch feature, and the two complement each other really well. With AssistiveTouch, you can accomplish more with just a glance at your iPhone. The AssistiveTouch button, which appears as a circle in the lower-right corner, gives you instant access to a number of settings and shortcuts that usually require swiping or other gestures. Using the options that appear, you can scroll the screen up or down, jump to the top or bottom, and slide the screen vertically or horizontally. For the most part, these options work fairly accurately.
By simply glancing at your iPhone, you can access Control Center, lock the screen, activate Siri, rotate the screen, change the volume, and much more thanks to AssistiveTouch and Eye Tracking.
Create Hot Corners Using Eye Tracking
Mac users will be familiar with configuring and using Hot Corners. You can now use Hot Corners on your iPhone and iPad through the AssistiveTouch and Dwell Control settings, and the latest iOS 18 Eye Tracking update improves this experience. With eye tracking and Hot Corners turned on, you can trigger an action instantly just by gazing at that corner of the screen. For example, I have the Home Screen action assigned to the upper-right corner, and by default, the top-left corner holds the Calibrate eye tracking action.
Here’s how to go about it:
- Navigate to Settings -> Accessibility -> Touch on your iPhone or iPad.
- Select AssistiveTouch from this menu, then tap Hot Corners.
- Now you can assign your favorite actions to the Top Left, Top Right, Bottom Left, and Bottom Right corners of the screen.
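The Hot Corners behavior set up above boils down to mapping a gaze point near a screen corner to whatever action you assigned there. Here is a conceptual sketch (hypothetical names and margin size, not Apple's implementation):

```python
def corner_action(gaze, screen_size, assignments, margin=60):
    """Return the assigned action if the gaze lands in a corner region.

    `assignments` maps corner names ("top_left", "top_right",
    "bottom_left", "bottom_right") to action names; `margin` is the
    side length, in points, of the square region at each corner.
    """
    x, y = gaze
    w, h = screen_size
    horizontal = "left" if x <= margin else "right" if x >= w - margin else None
    vertical = "top" if y <= margin else "bottom" if y >= h - margin else None
    if horizontal and vertical:
        # Gaze is inside one of the four corner squares.
        return assignments.get(f"{vertical}_{horizontal}")
    return None  # gaze is somewhere in the middle of the screen
```

With `{"top_right": "home", "top_left": "calibrate"}` as the assignments, a gaze near the upper-right edge of a 390x844-point screen returns `"home"`, while a gaze in the middle returns nothing.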
How Reliable Is iPhone Eye Tracking?
The powerful LEDs and infrared cameras on the Apple Vision Pro enable exceptionally good eye tracking performance. On the iPhone and iPad, Apple's eye tracking isn't as accurate, though, because these devices lack dedicated tracking cameras. Instead, the front camera and on-device intelligence track your eye movements.
Let's start with the positives: eye tracking on an iPhone is quick and simple to set up. iOS 18 Eye Tracking on an iPhone is entertaining, but it's not perfect. Occasionally, the iPhone fails to highlight the item I'm currently viewing, so I frequently have to experiment with alternative eye and head movements before the system highlights my intended target. This usually happens when I'm scrolling through pages with a lot of text, like the Settings app. Viewing images and navigating the Home Screen, on the other hand, were fairly accurate, and most of the time, toggling options like Bluetooth, Wi-Fi, and Airplane mode was quick and precise.
I tried eye tracking on an iPad Pro (M2), an iPhone 12, and an iPhone 14 Pro, and when I visited the Apple Store to examine the iPhone 16 Plus and iPhone 16 Pro Max, I checked out eye tracking on both of them as well. In my experience, eye tracking works a little better on an iPad than on an iPhone, but Apple still has a way to go before iOS 18 eye tracking is a truly remarkable feature.
To get the most out of Eye Tracking on iPhones and iPads, try these few suggestions:
- Keep your iPad or iPhone on a sturdy surface, about 1.5 feet away from your face. Eye tracking won't work properly if your device is too close to your face.
- Keep the device as still as you can if you’re holding it in your hand. You may need to recalibrate if you move the device away from you or if you sit in a different position.
- Turn off the feature and re-configure eye tracking if it doesn’t feel right.
- Make sure to sit in a well-lit location. Eye tracking may not function well in dimly lit or dark environments.
- Avoid sitting close to a bright light source. It can wash out your face and make it hard for the camera to follow your eyes.
Disable iPhone Eye Tracking
There are two ways to disable eye tracking on an iPhone. We've covered both of them below:
- Go to Settings -> Accessibility -> Eye Tracking and turn off the Eye Tracking toggle. To confirm your selection, tap OK on the pop-up.
- Alternatively, iOS 18's customizable Control Center lets you add an Eye Tracking control. After adding it, you can tap it in Control Center to turn Eye Tracking on or off.