Ahead of Global Accessibility Awareness Day, Apple revealed a series of new features coming to iOS and iPadOS in the next few months. The highlight of the announcement is Eye Tracking, which lets users with physical disabilities navigate their devices without touching them. Using artificial intelligence and the front camera of the iPhone or iPad, the system determines where the user is looking, so they can select apps, open menus, and move through the interface with their eyes alone.
Although the technology was developed primarily for people with physical disabilities, it has broader potential uses. It could, for example, be used to control a device from a distance, or to let people whose hands are busy (while driving or washing dishes, say) operate their devices without touching them.
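Apple has not published how Eye Tracking works internally, but the general idea of gaze-driven selection can be sketched with ARKit face tracking, which already exposes an estimated gaze point. The dwell duration, steadiness tolerance, and selection handler below are illustrative assumptions, not Apple's method.

```swift
import ARKit
import simd

// A minimal sketch of dwell-based gaze selection using ARKit face tracking.
// This only illustrates mapping an estimated gaze point to a dwell timer;
// it is not how Apple's Eye Tracking feature is implemented.
final class GazeDwellController: NSObject, ARSessionDelegate {
    private let session = ARSession()
    private var lastGaze: simd_float3?
    private var dwellStart: Date?
    private let dwellDuration: TimeInterval = 1.0   // assumed dwell time before "selecting"
    private let steadinessRadius: Float = 0.02      // assumed gaze tolerance, in metres

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    // Called by ARKit whenever tracked anchors (including the face) update.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
        let gaze = face.lookAtPoint   // estimated gaze target in face-anchor space

        // If the gaze stays within a small radius, keep the dwell timer running;
        // otherwise restart it.
        if let last = lastGaze, simd_distance(last, gaze) < steadinessRadius {
            if dwellStart == nil { dwellStart = Date() }
            if let start = dwellStart, Date().timeIntervalSince(start) >= dwellDuration {
                dwellStart = nil
                performSelection(at: gaze)
            }
        } else {
            dwellStart = nil
        }
        lastGaze = gaze
    }

    private func performSelection(at gaze: simd_float3) {
        // A real implementation would project the gaze into screen coordinates
        // and hit-test UI elements; here we just log it.
        print("Dwell selection at \(gaze)")
    }
}
```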
For users who are deaf or hard of hearing, Apple introduced Music Haptics. The feature uses the iPhone's Taptic Engine to play taps, textures, and refined vibrations that follow the rhythm of the music. Apple is also making it easier to carry out tasks by voice: Vocal Shortcuts let users record a custom phrase that Siri recognizes, whether to launch an app, complete a complex task, or set a reminder.
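To give a sense of how taps and textures can be generated programmatically on the Taptic Engine, here is a small Core Haptics sketch that plays one transient tap per beat, scaled by made-up beat strengths. It illustrates the concept only and is not Apple's Music Haptics pipeline.

```swift
import CoreHaptics

// Sketch: play one transient haptic per beat, with intensity and sharpness scaled
// by a (hypothetical) beat strength. Returns the engine so the caller can keep it
// alive while the pattern plays.
func playBeatHaptics(beatStrengths: [Float]) throws -> CHHapticEngine? {
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return nil }

    let engine = try CHHapticEngine()
    try engine.start()

    // One transient event per beat, spaced 0.5 s apart.
    let events = beatStrengths.enumerated().map { index, strength in
        CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: strength),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: strength)
            ],
            relativeTime: Double(index) * 0.5
        )
    }

    let pattern = try CHHapticPattern(events: events, parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
    return engine
}

// Example: four beats of varying strength (hypothetical values).
// let engine = try? playBeatHaptics(beatStrengths: [1.0, 0.4, 0.7, 0.4])
```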
For those prone to motion sickness when using a device in a moving vehicle, Apple introduced Vehicle Motion Cues. Animated dots at the edges of the display represent the vehicle's movement, helping to reduce the feeling of dizziness without obstructing the content on screen. The feature detects motion automatically and can also be turned on or off in Control Center.
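Apple has not described how Vehicle Motion Cues renders its dots, but the idea of on-screen dots that respond to vehicle movement can be approximated with Core Motion. The view layout, update rate, and the mapping from acceleration to dot offset below are illustrative assumptions.

```swift
import CoreMotion
import UIKit

// Sketch: drive a column of dots from device motion, loosely mirroring the idea
// behind Vehicle Motion Cues. Not Apple's implementation.
final class MotionDotsView: UIView {
    private let motion = CMMotionManager()
    private let dotLayers: [CALayer] = (0..<12).map { _ in
        let dot = CALayer()
        dot.bounds = CGRect(x: 0, y: 0, width: 6, height: 6)
        dot.cornerRadius = 3
        dot.backgroundColor = UIColor.systemGray.cgColor
        return dot
    }

    override init(frame: CGRect) {
        super.init(frame: frame)
        dotLayers.forEach { layer.addSublayer($0) }
        startMotionUpdates()
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) not supported in this sketch") }

    override func layoutSubviews() {
        super.layoutSubviews()
        // Simplified layout: spread the dots down the left edge of the view.
        for (i, dot) in dotLayers.enumerated() {
            dot.position = CGPoint(x: 12,
                                   y: CGFloat(i + 1) * bounds.height / CGFloat(dotLayers.count + 1))
        }
    }

    private func startMotionUpdates() {
        guard motion.isDeviceMotionAvailable else { return }
        motion.deviceMotionUpdateInterval = 1.0 / 60.0
        motion.startDeviceMotionUpdates(to: .main) { [weak self] data, _ in
            guard let self = self, let accel = data?.userAcceleration else { return }
            // Shift the dots with lateral acceleration so they track the vehicle's motion;
            // the 40-point scale factor is an arbitrary choice for this sketch.
            let offset = CGFloat(accel.x) * 40
            self.dotLayers.forEach { $0.transform = CATransform3DMakeTranslation(offset, 0, 0) }
        }
    }
}
```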