What is LiDAR, and why do I want it in my iPad?
While there are many changes coming to Apple's next iPad Pro lineup, you only need to look at one for a moment to spot one of the more obvious differences. The cameras on this new tablet series are different, and not just from the previous generation. These new iPads pack tech you won't find in the latest iPhone either, which is why the camera cutout looks like nothing else you've seen from Apple before.
A big part of this change is the inclusion of a new sensor, called a LiDAR Scanner. It sits alongside the camera array to help make a handful of things more useful. Need to know more? Keep reading!
What is LiDAR?
You might have heard of SONAR and RADAR, so it might seem like LiDAR is just another one of those detection mechanisms. While it's true that systems like self-driving cars use RADAR and LiDAR together, these technologies function very differently from one another. LiDAR uses light to measure distance, typically pulses from an infrared laser that are invisible to the human eye.
When a LiDAR Scanner is used, it fires pulses of that invisible laser light out at the scene and measures how long each pulse takes to bounce off a surface and return to the sensor. Because light travels at a known, constant speed, that round-trip time translates directly into distance, and repeating the measurement thousands of times every second builds up a "picture" of the environment in front of the scanner.
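To make that time-of-flight idea concrete, here's a minimal Swift sketch (the function name and the example timing are hypothetical, purely for illustration) showing how a round-trip pulse time converts into a distance. The pulse travels out and back, so the total path gets halved:

```swift
import Foundation

/// Speed of light in meters per second.
let speedOfLight = 299_792_458.0

/// Hypothetical helper: converts a round-trip pulse time into a distance.
/// The light travels to the surface and back, so we halve the total path.
func distanceFromPulse(roundTripSeconds: Double) -> Double {
    (speedOfLight * roundTripSeconds) / 2
}

// A pulse that returns after 10 nanoseconds has travelled about 3 meters
// round trip, putting the surface roughly 1.5 meters away.
let distance = distanceFromPulse(roundTripSeconds: 10e-9)
print(String(format: "Surface is about %.2f m away", distance)) // ~1.50 m
```

Repeat that calculation across thousands of pulses aimed at different points, and you get the depth "picture" described above.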
In larger deployments, like cars and airplanes, LiDAR systems can offer a living snapshot of objects at multiple depths, giving a computer something like the depth perception we get from human eyes. This makes it possible for the computer to "see" far enough away that it's safe to use while the vehicle is in motion, which is great. But its effective range is determined by power output and size, so the smaller the sensor, the shorter the distance it can measure.
Why do I want LiDAR in my iPad Pro?
As with USB-C in the previous generation and stereo speakers in 2015, Apple's iPad Pro line is a testing ground for new tech that's useful to professionals. LiDAR is incredibly useful for developers who want to build for the future. In particular, Apple's push into Augmented Reality over the last couple of years has needed specialized hardware to improve accuracy.
Developers want to be able to quickly map a table and build an environment on top of it, but as you can see in the AR Plus mode in Pokémon Go, building an AR environment on the fly still takes a moment and some action on the part of the user to set everything up. Based on what Apple has promised with more accurate measurement tools in its iPad Pro demo app, it should be possible to simply open an app and have more accurate depth tools that start working instantly. No more waving your device around to detect a surface!
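For a sense of what this looks like on the developer side, here's a short, hedged Swift sketch using ARKit's scene reconstruction API, which Apple introduced alongside the LiDAR Scanner. This isn't Apple's demo app, just a minimal configuration assuming a LiDAR-equipped device; on older hardware the capability check simply skips the mesh option:

```swift
import ARKit
import RealityKit

/// Starts an AR session that takes advantage of the LiDAR Scanner when present.
func startLiDARSession(on arView: ARView) {
    let config = ARWorldTrackingConfiguration()

    // Scene reconstruction (a live 3D mesh of the room) is only supported
    // on LiDAR-equipped devices, so check before opting in.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        config.sceneReconstruction = .mesh
    }

    // Plane detection works without LiDAR too, but with the scanner on board
    // surfaces are found almost instantly instead of after a scanning sweep.
    config.planeDetection = [.horizontal, .vertical]

    arView.session.run(config)
}
```

The appeal for developers is that the opt-in is this small: the same session setup they already write gets instant surface detection and a room mesh when the hardware is there.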
The bottom line? If you want to live on the bleeding edge of AR development, this new feature is going to be a very cool thing to have while developers explore what is possible over the next couple of months. If you're not as excited by this, you're probably fine waiting until this tech comes to your iPhone.
Or, who knows, maybe a slick set of glasses?
Russell is a Contributing Editor at iMore. He's a passionate futurist whose trusty iPad mini is never far from reach. You can usually find him chasing the next tech trend, much to his wallet's dismay. Reach out on Twitter!