(Pocket-lint) - Apple is one notable tech company to utilise LIDAR - which stands for Light Detection and Ranging - in its products, with the iPhone 13 Pro and Pro Max being the second generation of its phones to feature such a scanner. But why?
Is LIDAR for Face ID?
No, the LIDAR sensor isn't there to advance Face ID face-scanning login; rather, it's for use in Augmented Reality (AR) applications.
Ultimately, the LIDAR scanner features in the iPhone 13 Pro series and iPad Pro models to enhance the accuracy of distance and depth measurement - something camera sensors alone are renowned for struggling with.
How does LIDAR work?
LIDAR is a pulsed laser that records the time it takes - at nano-second speeds - for that signal to return to its source, enabling it to generate a 3D model with greater accuracy than a simple camera ever could.
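The underlying arithmetic is simple: the pulse travels out and back at the speed of light, so halving the round-trip distance gives the range. A minimal sketch (the function name and timing value are illustrative, not Apple's implementation):

```python
# Sketch of LIDAR ranging from round-trip pulse time.
C = 299_792_458.0  # speed of light in m/s

def distance_from_round_trip(t_seconds):
    # The pulse covers the out-and-back path, so halve it for range.
    return C * t_seconds / 2.0

# A return after roughly 33 nanoseconds corresponds to about 5 metres -
# the quoted maximum range of Apple's scanner.
d = distance_from_round_trip(33.356e-9)
```

This is why nano-second timing matters: at light speed, every nanosecond of delay is only about 15 cm of extra distance.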
Think of it like the way a bat 'sees' the world around it; those pulses create an image. In the case of the iPad Pro and iPhone 123 Pro, the LIDAR image is one part of the puzzle, using data from motion sensors and cameras to help with accuracy.
Is it like Time-of-Flight?
Some camera systems have a similar Time-of-Flight (ToF) sensor to assist with depth information. Typically these take a more generalised snapshot with a wider beam, whereas more advanced LIDAR systems offer multiple beams for greater accuracy.
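The difference between the two approaches can be sketched with toy numbers. Here a flat wall sits at 2 m with a nearer box in part of the view; a single wide beam blurs the two into one averaged distance, while multiple narrow beams keep per-point depth (the scene values are invented for illustration):

```python
# Toy depth scene: a wall at 2.0 m, with a box at 1.0 m covering part of it.
scene = [
    [2.0, 2.0, 2.0],
    [2.0, 1.0, 1.0],
    [2.0, 1.0, 1.0],
]

def wide_beam_depth(scene):
    # One broad pulse returns a single blended distance for the whole view.
    samples = [d for row in scene for d in row]
    return sum(samples) / len(samples)

def multi_beam_depth(scene):
    # Many narrow pulses preserve per-point depth, keeping the box's edges.
    return [row[:] for row in scene]

wide_beam_depth(scene)   # ~1.56 m: matches neither the wall nor the box
multi_beam_depth(scene)  # full grid: wall and box remain distinguishable
```

The averaged reading lands between the wall and the box, which is exactly the kind of ambiguity a multi-beam system avoids.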
Will LIDAR improve ARKit applications?
That's the idea. Every ARKit app will "automatically get AR placement, improved motion capture and people occlusion", says Apple.
How will Apple apps benefit?
It's all about accuracy. Take the Measure app, for example, which will be able to calculate, say, a person's height accurately - something that wasn't reliably possible before. There's also a new Ruler View that will arrive in an update.
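Conceptually, once LIDAR gives each on-screen point a reliable 3D position, a measurement reduces to the distance between two such points. A minimal sketch (the coordinates are assumed example values, not Measure app internals):

```python
import math

# With depth data, each tapped point maps to a 3D position in metres;
# a measurement is then just the distance between two such points.
def measure(p1, p2):
    return math.dist(p1, p2)

# Assumed camera-space coordinates for a person standing 2 m away.
feet = (0.0, 0.0, 2.0)
head = (0.0, 1.8, 2.0)
height = measure(feet, head)  # about 1.8 m
```

Without LIDAR, the depth component of those points has to be estimated from camera parallax alone, which is where the old inaccuracy crept in.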
Developers will be able to use the LIDAR data as best suits their application, via a new Scene Geometry API. We're sure there will be various examples in gaming, science, fitness and other applications in the future.
Where can I use Apple's LIDAR system?
Apple's LIDAR scanner is said to be able to scan objects up to five metres away. As the laser works at a photon level, it's able to function night or day, inside or out. However, it can't 'see' through objects, so rain, passing people and the like could confuse the system.
What else uses LIDAR?
We've seen LIDAR used in various applications before, the most prominent being automotive systems designed to assist self-driving cars or automatic steering safety features. There are also construction applications and more.