Tyler Lee* says that users of Apple's devices will soon be able to interact with them without necessarily having to touch them.
This is thanks to Apple's updated Vision framework: starting in iOS 14 and macOS Big Sur, developers will be able to update their apps to detect things like gestures, hand poses, and human body poses.
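On the developer side, this capability is exposed through new Vision request types such as VNDetectHumanHandPoseRequest. Here is a minimal sketch, assuming a CGImage supplied by the app (say, a camera frame), of detecting a hand and reading one fingertip position; it illustrates the shape of the API rather than a production implementation.

```swift
import Vision
import CoreGraphics

// Minimal sketch: detect a hand pose in a still image (iOS 14 / macOS Big Sur).
// `cgImage` is an assumed input, e.g. a frame from the camera.
func detectHandPose(in cgImage: CGImage) {
    let request = VNDetectHumanHandPoseRequest()
    request.maximumHandCount = 1  // track a single hand for simplicity

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    do {
        try handler.perform([request])
        guard let hand = request.results?.first as? VNHumanHandPoseObservation else {
            return  // no hand detected
        }
        // Joint locations are normalized (0-1) with the origin at the lower left.
        let thumbTip = try hand.recognizedPoint(.thumbTip)
        if thumbTip.confidence > 0.3 {
            print("Thumb tip at \(thumbTip.location)")
        }
    } catch {
        print("Hand pose detection failed: \(error)")
    }
}
```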
So what does this mean for end users?
As we said, it helps bring touchless gestures to Apple's devices.
One example Apple has given is exercise apps: with the updated Vision framework, an app can track the exercises the user is performing and then help them correct their form.
It can also be used in safety-training apps to encourage correct ergonomics, as well as in media-editing apps, where it can help users find photos featuring a specific pose they have in mind.
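Body-based use cases like these rely on the sibling request, VNDetectHumanBodyPoseRequest. Here is a minimal sketch, again assuming a CGImage frame from the app's camera feed, of reading back the detected joints that an exercise or ergonomics app could compare against a reference pose:

```swift
import Vision
import CoreGraphics

// Minimal sketch: detect a human body pose in one frame (iOS 14 / macOS Big Sur).
// `frame` is an assumed CGImage, e.g. grabbed from a workout app's video feed.
func detectBodyPose(in frame: CGImage) throws
        -> [VNHumanBodyPoseObservation.JointName: VNRecognizedPoint]? {
    let request = VNDetectHumanBodyPoseRequest()
    let handler = VNImageRequestHandler(cgImage: frame, options: [:])
    try handler.perform([request])

    guard let body = request.results?.first as? VNHumanBodyPoseObservation else {
        return nil  // no person detected in this frame
    }
    // All joints (shoulders, elbows, knees, ...) in normalized image coordinates;
    // an app could compare these against a stored reference pose.
    return try body.recognizedPoints(.all)
}
```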
Users will also, in theory, be able to interact with their mobile devices by holding a hand pose to trigger the camera's shutter.
It can also be used in more novel ways: you could hold a hand pose and the app could pick an appropriate emoji to overlay on it (like the OK gesture or the peace sign).
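Neither the shutter trigger nor the emoji overlay ships as a ready-made API; it is up to the app to classify a gesture from the detected joints. As a rough illustration, here is one possible heuristic built on the observation returned above, treating a thumb-index pinch as the shutter gesture; the 0.05 distance threshold and the triggerShutter() callback are made-up assumptions, not part of Apple's framework.

```swift
import Vision
import CoreGraphics

// Hypothetical heuristic: treat a thumb-index pinch as a "take photo" gesture.
// The threshold and callback are illustrative assumptions, not part of Vision.
func handlePinchGesture(from observation: VNHumanHandPoseObservation,
                        triggerShutter: () -> Void) throws {
    let thumbTip = try observation.recognizedPoint(.thumbTip)
    let indexTip = try observation.recognizedPoint(.indexTip)

    // Ignore low-confidence detections to avoid spurious triggers.
    guard thumbTip.confidence > 0.5, indexTip.confidence > 0.5 else { return }

    // Points are normalized (0-1), so this is a distance in image-relative units.
    let dx = thumbTip.location.x - indexTip.location.x
    let dy = thumbTip.location.y - indexTip.location.y
    let distance = (dx * dx + dy * dy).squareRoot()

    if distance < 0.05 {  // fingertips close together: assume a pinch
        triggerShutter()
    }
}
```

A real app would likely smooth this over several consecutive frames before firing, so a momentary misdetection doesn't take a photo by accident.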
The onus here is largely on developers to take advantage of the Vision framework, so your mileage may vary depending on the apps you use.
*Tyler Lee is a writer at Ubergizmo.
This article first appeared at ubergizmo.com.