Apple may have just baked a new accessibility feature into the latest beta version of iOS, one that is said to help visually impaired users maintain social distancing.
TechCrunch first reported on the new feature, noting that it can measure the distance to people in the view of the iPhone’s camera. The feature emerged from “people occlusion”, a capability in Apple’s ARKit 4 that detects the shapes of people. To recall, ARKit is Apple’s AR platform for iOS devices.
ARKit 4 also introduced a brand-new depth API (application programming interface), giving developers access to the detailed depth information gathered by the LiDAR scanner on the iPhone 12 Pro, iPhone 12 Pro Max, and iPad Pro.
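For developers, the following Swift sketch (not taken from the report, and assuming a LiDAR-equipped device running iOS 14) shows how ARKit 4’s scene-depth frame semantics expose that per-pixel depth data:

```swift
import ARKit

// Minimal sketch: run an AR session that requests ARKit 4's scene depth,
// which is backed by the LiDAR scanner on supported devices.
final class DepthSessionDelegate: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Scene depth is only available on LiDAR-equipped hardware.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else {
            print("LiDAR scene depth is not available on this device")
            return
        }
        let configuration = ARWorldTrackingConfiguration()
        configuration.frameSemantics = .sceneDepth
        session.delegate = self
        session.run(configuration)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // frame.sceneDepth is an ARDepthData whose depthMap is a CVPixelBuffer
        // of per-pixel distances (in metres) from the camera.
        guard let depth = frame.sceneDepth else { return }
        let width = CVPixelBufferGetWidth(depth.depthMap)
        let height = CVPixelBufferGetHeight(depth.depthMap)
        print("Received a \(width)x\(height) depth map")
    }
}
```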
The accessibility team discovered that people occlusion, when combined with the accurate distance measurements provided by the LiDAR scanner, could be extremely useful for people with visual impairments. The report suggests that the feature should be available on the iPhone 12 Pro and iPhone 12 Pro Max running iOS 14.2.
“Here’s how people detection works in iOS 14.2 beta - the voiceover support is a tiny bit buggy but still super cool” https://t.co/vCyX2wYfx3 pic.twitter.com/e8V4zMeC5C - Matthew Panzarino (@panzer), October 31, 2020
The report said that the feature is part of the Magnifier app and uses the LiDAR scanner and wide-angle camera of the iPhone 12 Pro and Pro Max, giving feedback to users in a variety of ways. It noted that this is not the first time such a tool has been introduced on phones or dedicated devices, but it rarely comes baked in as a standard feature.
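Apple has not published how the Magnifier feature is implemented. As a purely hypothetical illustration of the idea described above, a developer could combine ARKit’s person-segmentation mask with its depth estimates to find the distance to the nearest person, along these lines in Swift:

```swift
import ARKit

// Hypothetical sketch (not Apple's Magnifier code): estimate the distance to
// the nearest detected person using ARKit's person segmentation plus depth.
final class PersonDistanceEstimator: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) else {
            print("Person segmentation with depth is not supported on this device")
            return
        }
        let configuration = ARWorldTrackingConfiguration()
        configuration.frameSemantics = .personSegmentationWithDepth
        session.delegate = self
        session.run(configuration)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let mask = frame.segmentationBuffer,          // 8-bit person mask
              let depth = frame.estimatedDepthData else {   // per-pixel depth, in metres
            return
        }
        // Bail out unless the buffers match in size and the depth format is Float32,
        // so the pointer arithmetic below stays valid.
        let width = CVPixelBufferGetWidth(mask)
        let height = CVPixelBufferGetHeight(mask)
        guard CVPixelBufferGetWidth(depth) == width,
              CVPixelBufferGetHeight(depth) == height,
              CVPixelBufferGetPixelFormatType(depth) == kCVPixelFormatType_DepthFloat32 else {
            return
        }

        CVPixelBufferLockBaseAddress(mask, .readOnly)
        CVPixelBufferLockBaseAddress(depth, .readOnly)
        defer {
            CVPixelBufferUnlockBaseAddress(mask, .readOnly)
            CVPixelBufferUnlockBaseAddress(depth, .readOnly)
        }
        guard let maskBase = CVPixelBufferGetBaseAddress(mask),
              let depthBase = CVPixelBufferGetBaseAddress(depth) else { return }
        let maskRowBytes = CVPixelBufferGetBytesPerRow(mask)
        let depthRowBytes = CVPixelBufferGetBytesPerRow(depth)

        // Scan every pixel classified as "person" and keep the smallest depth.
        var nearest = Float.greatestFiniteMagnitude
        for y in 0..<height {
            let maskRow = maskBase.advanced(by: y * maskRowBytes).assumingMemoryBound(to: UInt8.self)
            let depthRow = depthBase.advanced(by: y * depthRowBytes).assumingMemoryBound(to: Float32.self)
            for x in 0..<width where maskRow[x] == ARFrame.SegmentationClass.person.rawValue {
                let d = depthRow[x]
                if d > 0, d < nearest { nearest = d }
            }
        }
        if nearest < .greatestFiniteMagnitude {
            print(String(format: "Nearest person is roughly %.1f m away", nearest))
        }
    }
}
```

In a real accessibility tool, that distance would then be surfaced through VoiceOver, sounds, or haptics rather than printed to the console.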