
Apple Unveils iOS 18 Accessibility Features, Including AI-Powered Eye Tracking


Apple recently unveiled new and exciting accessibility features coming to iOS 18. These features, including Eye Tracking, Vocal Shortcuts, Music Haptics (powered by the Taptic Engine), and more for iOS and iPadOS, are designed to empower people with disabilities.

The company said these features will hit the market later in 2024 as part of the iOS 18 update, and they are also expected in other upcoming software releases such as iPadOS 18 and visionOS 2. When it comes to accessibility features, Apple rarely fails to impress, and this year's announcements continue that pattern. Among the previewed features, Eye Tracking has grabbed the most attention: previously unique to Apple Vision Pro, it will now be available on iPhone and iPad.

What Did Apple Say?

Apple says it believes in the transformative power of innovation to enrich lives. That is why the company has championed inclusive design and embedded accessibility at the core of its hardware and software. With these new features, Apple aims to push the boundaries of the technology and deliver the best possible experience to its users.

iOS 18 Accessibility Features To Look For:

Eye Tracking:

Eye Tracking is an AI-powered feature that lets iPhone and iPad users navigate their devices with their eyes, similar to what was seen on the Apple Vision Pro. It is designed primarily for people with physical disabilities, allowing them to control their devices just by looking at them.

The front-facing camera determines which element you are viewing. As a user, you simply gaze at a button to highlight it, then hold your gaze for a few seconds to select it with your eyes. All data used to set up and control the feature is processed with on-device machine learning, so it stays on your device and is not shared with Apple.

The feature works across iOS and iPadOS apps, so no extra hardware or accessories are needed. With Dwell Control, you can activate any element with your eyes and access additional functions such as swipes, physical button presses, and other gestures.

Music Haptics:

This feature is designed especially for people who are deaf or hard of hearing, letting them experience music on their iPhones through haptic feedback powered by the Taptic Engine. Once it is enabled, the Taptic Engine plays taps, textures, and refined vibrations matched to the music's audio. It works with many songs in the Apple Music catalogue, and Apple is making it available to developers as an API so music can become more accessible in their apps.

Vocal Shortcuts:

With this feature, iPhone and iPad users can assign custom utterances that Siri understands, letting them launch shortcuts and complete actions. A related feature, Atypical Speech, widens the scope of speech recognition; Apple says Vocal Shortcuts can recognize a user's vocal patterns using machine learning.

Vehicle Motion Cues:

Vehicle Motion Cues is a new experience for iPhone and iPad users that can help reduce motion sickness for people in moving vehicles. According to research, motion sickness is caused by a sensory conflict between what you see and what you feel, which can make it uncomfortable to use an iPhone or iPad in a moving vehicle.

With this feature, animated dots on the edges of the screen represent changes in the vehicle's motion, reducing sensory conflict without interfering with the actual content. It uses the built-in sensors of iPhone and iPad to recognize when you are in a moving vehicle and responds accordingly. You can set it to appear automatically on iPhone, or turn it on and off in Control Center.

Enhanced CarPlay Accessibility:

CarPlay is gaining three new accessibility features: Voice Control, Sound Recognition, and Color Filters.

  • Voice Control:

Voice Control lets you navigate CarPlay and control apps using only your voice.

  • Sound Recognition:

This feature alerts drivers who are deaf or hard of hearing to nearby sirens.

  • Color Filters:

This feature improves the CarPlay interface for color-blind users.

These features, expected to launch with iOS 18, hint at a major focus on machine learning and artificial intelligence at Apple's upcoming WWDC event.

VisionOS Accessibility Updates:

visionOS is expected to gain new accessibility features such as Smart Invert, Live Captions for FaceTime, and a Dim Flashing Lights option, boosting usability for all users.

  • Live Captions: Helps users follow spoken dialogue in live conversations and in audio from apps.
  • Apple Vision Pro Updates: Enhanced caption controls and support for more hearing devices.
  • Additional Vision Accessibility Updates: Smart Invert and Reduce Transparency, useful for people with low vision.

Apple Vision Pro already offers a flexible input system and an intuitive interface designed to serve the widest possible range of users. With VoiceOver, Zoom, and Color Filters, blind users and users with low vision can access spatial computing. You can also control Vision Pro with your hands, eyes, or voice using accessibility features such as Switch Control and Sound Actions, and Dwell Control is available for people with physical disabilities.

Additional Updates:

  • VoiceOver: New voices, a flexible Voice Rotor, custom volume control, and more for blind users and users with low vision. Mac users will also be able to customize VoiceOver keyboard shortcuts.
  • Magnifier: A new Reader Mode, plus an option to quickly launch Detection Mode with the Action button.
  • Braille Screen Input: A new way to start and stay in Braille Screen Input for faster control and text editing, support for the Japanese language, multi-line braille support with Dot Pad, and an option to choose different input and output tables.
  • Hover Typing: Shows larger text, in a user's preferred font and color, when typing in a text field; useful for people with low vision.
  • Personal Voice: Now available in Mandarin Chinese, for users at risk of losing their ability to speak. Users who find it hard to pronounce or read full sentences will be able to create a Personal Voice using shortened phrases.
  • Live Speech: Adds categories and simultaneous compatibility with Live Captions for non-speaking users.

More iOS 18 Accessibility Features:

  • Virtual Trackpad for AssistiveTouch: Lets people with physical disabilities control their device by using a small region of the screen as a resizable trackpad.
  • Switch Control: Adds the option to use the cameras in iPhone and iPad to recognize finger-tap gestures as switches.
  • Voice Control: Gains support for custom vocabularies and complex words.

Celebrate Global Accessibility Awareness Day With Apple:

Apple is marking Global Accessibility Awareness Day with new features along with some curated collections:

  • Select Apple Store locations will host free sessions that let customers explore these accessibility features.
  • You can visit Apple Support to learn how to customize your device using the built-in accessibility features. The Apple Accessibility playlist walks users through personalizing Apple Vision Pro, iPhone, iPad, Apple Watch, and Mac, from adapting gestures to customizing how information is presented on screen.

The Bottom Line:

Apple has previewed several accessibility features expected to arrive with iOS 18, iPadOS 18, and visionOS 2 later in 2024. Highlights for iOS 18 and iPadOS 18 include Eye Tracking, Vocal Shortcuts, Vehicle Motion Cues, Music Haptics, and CarPlay improvements. With Eye Tracking, iPhone and iPad users will be able to navigate their devices with their eyes.

Frequently Asked Questions:

  1. Do iPhones track your eyes?

Yes. The Eye Tracking feature, designed for people with physical disabilities, uses artificial intelligence and the front-facing camera. It calibrates in seconds and lets users navigate their iPhone and iPad with their eyes.

  2. Which iPhone is best for blind users?

The iPhone 15 Plus is often considered the best model for visually impaired users, thanks to its large screen and extensive accessibility features.

  3. Can you use Apple Vision with one eye?

Yes, you can adjust the settings to use Apple Vision with only one eye.



Author: FileEdge