With the arrival of iOS 14.2, Apple has rolled out a new feature that will delight visually impaired users of the iPhone 12 Pro and 12 Pro Max. People Detection lets them know when someone is nearby and how far away that person is.
If you were wondering what the LiDAR sensor of the iPhone 12 Pro, 12 Pro Max, and 2020 iPad Pro could be used for beyond photography, visually impaired users can give you the best answer.
A game-changing innovation
With the rollout of iOS 14.2, Apple launched People Detection within its Magnifier app. The feature helps visually impaired people feel less isolated in environments they do not fully control. “Person detection is a game changer for people like me in the blind community,” explains Dave Steele, known as the Blind Poet.
— Dave Steele (@BlindPoetRP) November 19, 2020
Visually impaired himself, the dynamic Briton does not hide that moving around is not always easy when you cannot see what surrounds you. Smartphones have evolved and can now provide voice or haptic assistance to help users navigate and understand their environment, but until now they could not really anticipate situations.
The LiDAR sensor will thus, in a way, stand in for the white cane or guide dog by providing information and a better understanding of the surroundings. The VoiceOver feature, which narrates whatever the camera captures, can be used alongside it. The iPhone can measure the distance between the user and a nearby person (up to about 4.5 meters) and report it, with detection happening in real time.
A technology originally created for video games
This technology (measuring the round-trip travel time of a light pulse between the iPhone and the targeted object) was originally designed for video games and integrated into ARKit for that purpose. By working together, the LiDAR, Accessibility, and ARKit teams compared their approaches and arrived at this feature. People Detection works in any situation, whether you are moving or stationary.
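The round-trip calculation described above can be sketched in a few lines. The function name and the 30-nanosecond sample value below are illustrative assumptions, not Apple's implementation:

```python
# Time-of-flight distance: a LiDAR scanner emits a light pulse and measures
# the round-trip time t; the distance is d = c * t / 2 (half the round trip).

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def distance_from_round_trip(t_seconds: float) -> float:
    """Distance to the target given the pulse's round-trip travel time."""
    return SPEED_OF_LIGHT * t_seconds / 2.0

# A round trip of roughly 30 nanoseconds corresponds to about 4.5 meters,
# the approximate range quoted for person detection.
print(distance_from_round_trip(30e-9))  # ~4.497 meters
```

The division by two matters: the pulse travels to the person and back, so only half the measured time corresponds to the one-way distance.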
The iPhone informs the user via haptic feedback (adjustable for each person) or by voice. The closer the detected person is, the stronger the feedback. Audio cues can also report the distance between you and the closest person, and you can set a safety distance with a different audio tone depending on the situation. The result is a measure of security, reacting to an environment that is scanned continuously.
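The inverse relationship between distance and feedback strength could be modeled along these lines. The linear mapping and the 4.5-meter maximum range are assumptions for illustration, not Apple's actual tuning:

```python
# Sketch of feedback strength scaling with proximity, as described in the
# article: the closer the detected person, the stronger the feedback.

MAX_RANGE_M = 4.5  # approximate detection range in meters (assumed)

def feedback_intensity(distance_m: float) -> float:
    """Return an intensity in [0, 1]: 1 at zero distance, 0 at max range."""
    clamped = min(max(distance_m, 0.0), MAX_RANGE_M)
    return 1.0 - clamped / MAX_RANGE_M

print(feedback_intensity(0.0))   # 1.0, strongest feedback
print(feedback_intensity(2.25))  # 0.5, halfway to the range limit
print(feedback_intensity(6.0))   # 0.0, beyond detection range
```

A real implementation might use a non-linear curve or discrete steps tied to the user's chosen safety distance, but the principle is the same: intensity rises as distance falls.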
And the first tests carried out by the people concerned show benefits that are as much psychological as practical. “It really takes the anxiety out of a lot of the things most people take for granted, like social distancing and navigating places that are normally difficult,” adds Dave Steele. “It helps me in a queue, in particular, to know when I have to move forward, where the person in front of me is, and whom to face when I arrive at the register. We can judge distance better this way.”
And for him, the times are more than ripe for this kind of innovation. With Covid-19 raging, People Detection makes it possible not to fear other people or going out. “It helps avoid isolation while offering a certain independence,” he says approvingly.
How do I activate person detection?
- Update to iOS 14.2
- Go to Settings / Accessibility / Magnifier
- Activate Magnifier
- The Magnifier icon appears on the Home screen, most often next to the Camera app
- Open the app to use it
- A simplified camera menu appears; tap the People Detection icon on the right
- Point the device in front of you
From the gear icon, you can configure the settings: choose the safety distance as well as the type of feedback (sound, speech, haptics).