iOS 16 Lets Apps Trigger Real-World Actions Hands-Free

New functionality in iOS 16 allows apps to trigger real-world actions hands-free.

This means users can do things like start playing music simply by entering a room, or turn on an exercise e-bike just by getting on it.

In a session hosted during the Worldwide Developers Conference, Apple told developers that these actions can run hands-free even when the user is not actively using the app at the time.

The update, which takes advantage of Apple’s Nearby Interaction framework, could lead to some important use cases where the iPhone becomes a way to interact with objects in the real world, if developers and accessory makers choose to adopt the technology.

During the session, Apple demonstrated how apps today can connect and exchange data with Bluetooth Low Energy accessories even while running in the background.

In iOS 16, however, apps can also start a Nearby Interaction session in the background with a Bluetooth Low Energy accessory that additionally supports Ultra Wideband.

To make this possible, Apple has updated its specification for accessory manufacturers so they can support these new background sessions.
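As a rough illustration of how such a background session might be set up, the sketch below uses the accessory-focused Nearby Interaction API Apple described at WWDC22, including the bluetoothPeerIdentifier initializer said to enable background operation; the class name, the one-meter threshold, and the triggered action are all hypothetical.

```swift
import Foundation
import NearbyInteraction

/// Minimal sketch of ranging with a UWB-capable Bluetooth LE accessory (iOS 16).
final class AccessorySessionManager: NSObject, NISessionDelegate {
    private let niSession = NISession()

    override init() {
        super.init()
        niSession.delegate = self
    }

    /// `accessoryConfigData` is the configuration payload the accessory sends over
    /// its Bluetooth LE service; `peripheralID` is that accessory's CoreBluetooth identifier.
    func startRanging(accessoryConfigData: Data, peripheralID: UUID) throws {
        // Tying the session to the paired Bluetooth peer (new in iOS 16) is what
        // allows ranging to continue while the app is in the background.
        let config = try NINearbyAccessoryConfiguration(accessoryData: accessoryConfigData,
                                                        bluetoothPeerIdentifier: peripheralID)
        niSession.run(config)
    }

    // The framework generates configuration data that must be sent back to the
    // accessory (for example over Bluetooth LE) so both sides agree on the UWB parameters.
    func session(_ session: NISession,
                 didGenerateShareableConfigurationData shareableConfigurationData: Data,
                 for object: NINearbyObject) {
        // write shareableConfigurationData to the accessory's configuration characteristic here
    }

    // Distance updates arrive here; an app could trigger a hands-free action
    // (start the music, wake the bike) once the user is close enough.
    func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
        guard let distance = nearbyObjects.first?.distance else { return }
        if distance < 1.0 {
            // hypothetical: tell the accessory to turn on
        }
    }
}
```

The key point is that the session is bound to the paired Bluetooth accessory, which is what lets the ranging, and therefore the hands-free trigger, keep working while the app is not in the foreground.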

This paves the way for a future where the line between apps and the physical world is blurred. But it remains to be seen whether third-party device and app makers choose to use the functionality.

The new feature is part of a broader update to Apple’s Nearby Interaction framework, which was the focus of the developer session.

The framework, introduced at WWDC 2020 with iOS 14, allows third-party app developers to take advantage of the U1 chip in Apple devices and in compatible third-party accessories.

iOS 16 arrives later in the year

The framework powers AirTag’s precision finding capability, which lets iPhone users open the Find My app and be guided to an AirTag’s exact location with on-screen directional arrows and other cues that show how far away the AirTag is, or whether it might be on a different floor.

With iOS 16, third-party developers can build similar experiences, thanks to a new ability to integrate ARKit, Apple’s augmented reality developer toolkit, with the Nearby Interaction framework.

This allows developers to take advantage of the device’s trajectory as computed by ARKit, so their apps can direct the user to a misplaced item or another object the user might want to interact with.

By using ARKit, developers get more consistent distance and direction information than with Nearby Interaction alone.
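The sketch below illustrates the idea, assuming the camera-assistance additions Apple presented at WWDC22 (sharing an ARSession with the Nearby Interaction session and enabling camera assistance on the configuration); the class name and the way the peer token is obtained are placeholders.

```swift
import ARKit
import NearbyInteraction

/// Minimal sketch of camera-assisted ranging with another device (iOS 16).
final class CameraAssistedRanging: NSObject, NISessionDelegate {
    private let niSession = NISession()
    private let arSession = ARSession()

    override init() {
        super.init()
        niSession.delegate = self
        // Share the app's own ARSession so Nearby Interaction can fuse UWB
        // measurements with the device trajectory computed by ARKit.
        niSession.setARSession(arSession)
    }

    /// `peerToken` would come from the other device, e.g. exchanged over the network.
    func startRanging(with peerToken: NIDiscoveryToken) {
        let config = NINearbyPeerConfiguration(peerToken: peerToken)
        // Camera assistance (new in iOS 16) yields steadier distance and
        // direction estimates than UWB ranging alone.
        config.isCameraAssistanceEnabled = true
        niSession.run(config)
    }

    func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
        guard let object = nearbyObjects.first else { return }
        // Distance in meters and a unit direction vector, when available.
        print("distance:", object.distance ?? .nan,
              "direction:", object.direction ?? SIMD3<Float>(repeating: .nan))
    }
}
```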

The functionality is not limited to third-party, AirTag-like accessories, however.

Apple has also experimented with another use case, in which a museum could use Ultra Wideband accessories to guide visitors through its exhibits.

In addition, the feature can be used to overlay directional arrows or other AR objects on top of the camera’s view of the real world, to help direct users to an Ultra Wideband object or accessory.

Apple also briefly showed how red AR bubbles can be displayed in an app over the camera view to point the way.
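One way such overlays could be built, assuming the world-transform addition Apple described for camera assistance, is to convert the Nearby Interaction estimate into an ARKit anchor and let a renderer draw the arrow or bubble there; the helper function and anchor name below are purely illustrative.

```swift
import ARKit
import NearbyInteraction

/// Hypothetical helper: drop an ARAnchor at a ranged object's estimated position
/// so a renderer (RealityKit, SceneKit, ...) can draw an arrow or bubble there.
func placeGuidanceAnchor(for object: NINearbyObject,
                         niSession: NISession,
                         arSession: ARSession) {
    // worldTransform(for:) (iOS 16) maps the Nearby Interaction estimate into
    // ARKit's world coordinate space; it returns nil until camera assistance
    // has converged on a sufficiently confident estimate.
    guard let transform = niSession.worldTransform(for: object) else { return }
    arSession.add(anchor: ARAnchor(name: "guidanceBubble", transform: transform))
}
```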

In the long run, this functionality lays the groundwork for Apple’s mixed reality smart glasses, where augmented reality apps are expected to be the core of the experience.

The updated functionality is rolling out to beta testers of iOS 16, which reaches everyone later this year.
