In case you missed it yesterday, Apple just gave us our first official look at iOS 16. Of course, it didn’t actually name it. As it did last year, the company highlighted several upcoming accessibility improvements to its operating systems, saying only that they will arrive “later this year with software updates across Apple platforms.” That’s essentially code for iOS 16, iPadOS 16, watchOS 9, and macOS 13.
While the features announced by Apple are great improvements for those with various visual, speech, or motor disabilities, they also point to some broader advances, particularly in artificial intelligence and machine learning, that we’re likely to see across future generations of Apple’s operating systems. Reading between the lines of the announcements, here are some of the major developments we can expect to see in iOS 16:
Live Captions = Better Speech Recognition
Android has had live captions since version 10, and Apple will finally get them three years later. With this setting enabled, your iPhone or Mac (if it has Apple silicon) will automatically produce real-time captions for any audio content, including videos, FaceTime calls, phone calls, and more. It’s a natural extension of the on-device speech processing introduced last year in iOS 15, but it also signals a significant leap in how far that processing has come.
Hopefully this will mean an improvement in Siri’s ability to understand your commands and dictation, but it’s easy to imagine these capabilities showing up elsewhere. Take the Notes app, where one could imagine a transcription feature that turns any audio or video recording into text. For Apple to bill it as an accessibility feature, Live Captions will need to be very accurate, and that accuracy opens up a world of possibilities for the rest of iOS 16.
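Developers can already get a taste of this pipeline through the Speech framework, which has supported on-device recognition for several releases. The sketch below is a minimal, hypothetical example of transcribing a recording entirely on-device; it isn’t how Apple builds Live Captions, just an illustration of the kind of speech-to-text machinery the feature presumably rests on (the locale and the audio URL are assumptions).

```swift
import Speech

/// A minimal sketch: transcribe an audio file entirely on-device with the
/// Speech framework. Illustrative only; this is not Apple's Live Captions code.
func transcribeOnDevice(audioURL: URL) {
    SFSpeechRecognizer.requestAuthorization { status in
        guard status == .authorized else { return }

        // The locale is an assumption for this example.
        guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
              recognizer.supportsOnDeviceRecognition else {
            print("On-device recognition isn't available on this device")
            return
        }

        let request = SFSpeechURLRecognitionRequest(url: audioURL)
        request.requiresOnDeviceRecognition = true // keep the audio off Apple's servers

        recognizer.recognitionTask(with: request) { result, error in
            if let result = result, result.isFinal {
                print(result.bestTranscription.formattedString)
            } else if let error = error {
                print("Recognition failed: \(error)")
            }
        }
    }
}
```

The point isn’t the API details; it’s that recognition is now fast and private enough to run locally, which is exactly what a system-wide captioning feature requires.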
Apple Watch Mirroring = AirPlay Improvements
Another accessibility feature coming later this year will let you mirror your Apple Watch to your iPhone and use your iPhone’s screen to control your watch. It’s designed to make the watch easier to operate for people with physical and motor disabilities and to let disabled users take advantage of all of their iPhone’s additional accessibility features.

Apple will let you mirror your Apple Watch to your iPhone later this year, thanks to new developments in AirPlay.
However, Apple Watch mirroring also has some interesting implications. Apple says the feature “uses hardware and software integration, including advances built on AirPlay.” That doesn’t necessarily mean we’ll see something like AirPlay 3, but it does suggest improvements are coming to AirPlay, possibly in the form of new frameworks for developers.
Notably, this appears to let one device control another in a way that AirPlay currently doesn’t allow. Today, AirPlay pushes audio and video to devices and permits simple remote controls (play/pause, volume, and so on), but letting AirPlay-compatible devices pass through advanced touch controls looks new and could lead to some amazing features.
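For context, here’s roughly what AirPlay looks like from an app developer’s point of view today: you opt your player into external playback and wire up basic transport commands, and that’s about the extent of the remote control on offer. The snippet below is a simplified sketch using existing AVKit and MediaPlayer APIs (the video URL is a placeholder), not a preview of whatever new framework Apple might announce.

```swift
import UIKit
import AVKit
import MediaPlayer

final class PlayerViewController: UIViewController {
    // A placeholder URL; any streamable video would do.
    private let player = AVPlayer(url: URL(string: "https://example.com/video.m3u8")!)

    override func viewDidLoad() {
        super.viewDidLoad()

        // Allow the video to be pushed to an AirPlay receiver such as an Apple TV.
        player.allowsExternalPlayback = true

        // The system route picker is the standard way to start AirPlay from an app.
        let routePicker = AVRoutePickerView(frame: CGRect(x: 20, y: 60, width: 44, height: 44))
        view.addSubview(routePicker)

        // Simple transport commands are the kind of remote control exposed today.
        let commands = MPRemoteCommandCenter.shared()
        commands.playCommand.addTarget { [weak self] _ in
            self?.player.play()
            return .success
        }
        commands.pauseCommand.addTarget { [weak self] _ in
            self?.player.pause()
            return .success
        }
    }
}
```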
Here’s a killer scenario: if Apple can mirror your Apple Watch to your iPhone and let you fully interact with it, it could presumably mirror your iPhone to your Mac or iPad and do the same. That alone would be a game-changing feature.
Door detection = Real-world AR object recognition
Apple has been quietly improving object recognition for some time now. For example, you can search for all kinds of things in the Photos app and get photos that contain them, and iOS 15 added a neat Visual Look Up feature that uses the camera to identify plants, animals, famous landmarks, artwork, and other objects.
Apple has now announced that the Magnifier app will be able to detect doors in real time, including judging how far away they are and reading any text on them. The feature will only work on devices with LiDAR (which is how the distance is measured), but it speaks to a broader improvement in object recognition.

The iPhone’s camera will soon be able to detect doors and whether they’re open or closed.
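Some of the building blocks are already public: on LiDAR-equipped devices, ARKit can reconstruct the surrounding scene as a mesh and classify pieces of it, and one of the existing classification categories is, in fact, a door. The sketch below is a speculative illustration using those current APIs, not how Magnifier’s Door Detection actually works.

```swift
import ARKit

/// A sketch of LiDAR-based scene understanding with existing ARKit APIs.
/// Devices with a LiDAR scanner can reconstruct surrounding geometry and
/// classify parts of the mesh (wall, window, door, and so on).
final class SceneScanner: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARWorldTrackingConfiguration.supportsSceneReconstruction(.meshWithClassification) else {
            print("Requires a LiDAR-equipped device")
            return
        }
        let configuration = ARWorldTrackingConfiguration()
        configuration.sceneReconstruction = .meshWithClassification
        session.delegate = self
        session.run(configuration)
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        // Each mesh anchor carries per-face classifications, including ARMeshClassification.door.
        let meshAnchors = anchors.compactMap { $0 as? ARMeshAnchor }
        for anchor in meshAnchors where anchor.geometry.classification != nil {
            // Walking the classification buffer to find door faces is omitted for
            // brevity; distance can be derived from the anchor's transform.
            print("Updated classified mesh anchor at \(anchor.transform.columns.3)")
        }
    }
}
```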
The most obvious use case is an augmented reality headset or glasses, which aren’t expected to be released until next year at the earliest. But Apple already has a powerful ARKit framework that developers use for AR applications, and it includes the ability to recognize and track certain everyday objects. And it wouldn’t be out of character for Apple to lay the groundwork for technology that won’t ship for a while.
It seems reasonable to assume that Door Detection is a natural extension of the work Apple is already doing in augmented reality and object detection. So don’t be surprised if you see new ARKit framework features for developers demoed at WWDC. It may start in iOS 16 with new AR apps, but it’s also bound to show up in much bigger projects as Apple continues to push its AR software tools toward eventual integration with AR glasses.
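ARKit’s existing object detection gives a rough sense of what such a demo might build on. The snippet below uses the current public API to recognize previously scanned reference objects in a session; the resource group name is made up, and any new WWDC-era framework would presumably go well beyond this.

```swift
import UIKit
import ARKit
import SceneKit

final class ObjectDetectionViewController: UIViewController, ARSCNViewDelegate {
    private let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)

        let configuration = ARWorldTrackingConfiguration()
        // "EverydayObjects" is a hypothetical AR resource group of scanned reference objects.
        if let objects = ARReferenceObject.referenceObjects(inGroupNamed: "EverydayObjects", bundle: nil) {
            configuration.detectionObjects = objects
        }
        sceneView.session.run(configuration)
    }

    // Called when ARKit recognizes one of the reference objects in the scene.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let objectAnchor = anchor as? ARObjectAnchor else { return }
        print("Detected \(objectAnchor.referenceObject.name ?? "an object") in the real world")
    }
}
```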