A full year has passed since Apple unveiled its own platform for developers to expand what is possible with Augmented Reality, and the most popular example of the technology is a game that was already popular before it existed. As cool as it can be to explore things in AR, the experiences are almost always fleeting; by their very nature, most AR apps aren't meant to be relied on as daily apps on your phone. But ARKit wasn't built to bring the next Pokémon Go into existence. It was built to lay the groundwork for a future where we use something other than our phones for computing.
Apple has worked tirelessly to improve ARKit over the last year, and developers have embraced those improvements in big ways. While the rest of us wait for a set of glasses that would let us appreciate AR all around us at all times, the iPhone in your hand right now is a nearly unlimited playground of ideas for when that seemingly inevitable future arrives. For now you still have to observe this future world through the window that is your phone, but that view is getting cooler every day thanks to the work in ARKit 2.
Explaining Augmented Reality through ARKit 2
In a weird way, this "review" of ARKit is a little self-defeating. AR isn't cool or useful simply because it's AR, and it's not successful just because it exists. AR is most successful when it not only exists but works. The most popular examples of this in recent memory are the Niantic games Ingress and Pokémon Go. These games aren't popular because they're made with AR technologies; they're popular because each experience encourages you to use your phone as more than just a screen with buttons on it.
ARKit as a collection of technologies encourages developers to think about the world around the phone. Instead of a big game map you have to pinch and swipe around on, what happens when the whole dining room table is the map, and you see it all by moving yourself around the table? What if you didn't have to go looking for a tape measure to size up a broken piece of fence, because your phone can measure real-world objects for you? Neither of these concepts is unique in the world of AR, but through ARKit, developers know exactly what the iPhone and iPad are capable of and can build within those boundaries instead of testing around them.
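To make that measuring idea concrete, here's a minimal sketch of how an ARKit 2-era app could turn two taps on the screen into a real-world distance. The `sceneView` parameter and the function name are illustrative, not from any shipping app:

```swift
import ARKit
import simd

// A minimal sketch: hit-test two screen points against detected feature
// points, then compute the straight-line distance between them in meters.
// Assumes `sceneView` is an ARSCNView already running a world-tracking session.
func distanceBetween(_ pointA: CGPoint, _ pointB: CGPoint, in sceneView: ARSCNView) -> Float? {
    guard let hitA = sceneView.hitTest(pointA, types: .featurePoint).first,
          let hitB = sceneView.hitTest(pointB, types: .featurePoint).first else {
        return nil // ARKit found no real-world geometry under one of the taps.
    }
    // The last column of worldTransform holds the hit's position in world space.
    let a = hitA.worldTransform.columns.3
    let b = hitB.worldTransform.columns.3
    return simd_distance(simd_float3(a.x, a.y, a.z), simd_float3(b.x, b.y, b.z))
}
```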
The whole idea here is for AR to be the How instead of the Why, and currently that means showing users the benefits of using their phone in new ways. ARKit apps largely fall into four basic interaction methods:
- Point me at something: Apple's software uses the camera to find a flat surface, and once one has been discovered you can "put" something on that surface (see the sketch after this list).
- See something deeper: Pointing your camera at things like movie posters, business cards, and artwork can turn those fixtures into living works. Videos, animations, and more can appear and give you a much more dynamic experience.
- Leave something in the real world: You can "put" something in the world, and have it stay right where you told it to stay. You can now walk around this thing, and see every angle of it as though it were really there.
- Add to the real world: Apps, even in your browser, can add or replace things in the real world for you to see, encouraging you to move around and create something new.
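For the curious, that first pattern takes surprisingly little code. Here's a minimal sketch assuming an ARSCNView wired up in a view controller; the class name is made up, and ARKit does the actual surface-finding:

```swift
import UIKit
import ARKit
import SceneKit

// A minimal sketch of the "point me at something" pattern: ask ARKit to
// detect horizontal surfaces, then react when it discovers one.
class PlaneFinderViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        sceneView.delegate = self
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal] // Tables, floors, and so on.
        sceneView.session.run(configuration)
    }

    // Called once ARKit has discovered a flat surface.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let plane = anchor as? ARPlaneAnchor else { return }
        // `extent` describes the surface's size; this is where an app
        // would "put" its content on the table.
        print("Found a surface roughly \(plane.extent.x) by \(plane.extent.z) meters")
    }
}
```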
These options create a vibrant, healthy mix of apps on the App Store. Some apps go all in on ARKit, building new experiences that blend all of these ideas together. Others take a single concept and add it to an existing app to offer a new way to use familiar features.
The whole point is that ARKit, as a name, shouldn't be something users actively look for much longer, because these features will simply become ubiquitous. In fact, you may not be aware of this, but if you've used an iPhone 8 Plus camera, you've already used AR without knowing it. Portrait Lighting, especially the versions of it that remove the world around you and replace it with a flat black background, is a great example of invisible AR at play.
Depth, Sensors, and iPhones
Not all iPhones are created equal, which raises a lot of questions about how different the AR experience is on the iPhone X series versus other iPhones. Almost every other model has a single camera on the back and none of the depth-sensing goodies like Portrait Lighting. Many iPhones also lack the front-facing sensor array that enables Face ID. How much of the AR experience do you really gain or lose with each of these phones?
Right now, the answer is: not very much. Apple has done a lot to create depth with a single sensor, as we've seen with the software-based portrait mode coming to the iPhone XR. Instead of losing those features entirely, you'll simply notice a performance difference. It takes longer for an older iPhone to see which surfaces are best for an AR object, and judging which real-world objects are closer or farther away may not be fast enough to be useful.
This is especially true of accuracy in AR. You won't be able to measure something better with the dual cameras, and if something goes wrong and the phone "forgets" the position of an AR object, it goes wrong the same way on an iPhone 8 as on an iPhone 8 Plus. Apple said the iPhone X had been custom-tuned for ARKit, and that tuning continues with the new generation of phones. The iPhone XS and XS Max deliver the same AR experiences faster, and the new features in ARKit 2 make everything feel just a little more real as you peer through the camera.
The biggest differences you're going to see moving forward are in eye tracking and face mapping, features enabled in ARKit 2 but not used much yet. Developers can use the same sensors that let you unlock your phone with your face to see exactly where you're looking. This could enable a lot of really interesting things, but only on phones with this camera array. The facial-mapping tech used for security also powers Animoji, Memoji, and the depth features in Apple's Clips app. If you have a regular sensor on the front of your phone, as on the iPhone 8 and earlier, these features won't ever be available to you.
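If you're wondering what that gaze data looks like to a developer, here's a minimal sketch using the face-tracking side of ARKit 2. It assumes a device with the TrueDepth camera, and the class name is made up for illustration:

```swift
import ARKit

// A minimal sketch of ARKit 2's eye-tracking data. ARFaceAnchor's
// lookAtPoint estimates where the eyes converge, in face-anchor space.
class GazeWatcher: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else {
            return // No TrueDepth sensor array: iPhone 8 and earlier stop here.
        }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // New in ARKit 2: per-eye transforms and an estimated gaze point.
            print("Looking toward \(face.lookAtPoint)")
        }
    }
}
```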
A marathon, not a sprint
How do you measure the future? Is it through the excitement of those you allow to peek at what might come, or is it by accomplishing something previously considered impossible? Does the latter actually matter if you don't have the former? These are a small sample of the questions flying through my mind when I think about AR.
The most important parts of ARKit 2 are the things you can't really see and use every day. Apple has created a file format, USDZ, that lets 3D objects be used easily all over the web and in AR or VR apps. It's now possible for people to share a single AR experience, which is great for multiplayer games or shared education and art apps. Apple even made its own measuring app, Measure, so you can reliably use your phone as an AR tool instead of just a toy. These features, and especially the criminally underused eye-tracking feature, all come together to make a bright future. But, to be clear, a lot of the best AR experiences are yet to come.
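That shared-experience piece rests on ARKit 2's world maps, which a session can capture, serialize, and hand to another device. Here's a minimal sketch of the round trip; how the bytes travel between devices (Multipeer Connectivity, a server, a file) is left abstract:

```swift
import ARKit
import Foundation

// A minimal sketch of ARKit 2 world-map sharing: capture the session's
// map, archive it to Data, and let another device resume from it.
func shareWorldMap(from session: ARSession, send: @escaping (Data) -> Void) {
    session.getCurrentWorldMap { worldMap, _ in
        guard let map = worldMap else { return } // Mapping may not be ready yet.
        if let data = try? NSKeyedArchiver.archivedData(withRootObject: map,
                                                        requiringSecureCoding: true) {
            send(data) // e.g. over Multipeer Connectivity in a shared game.
        }
    }
}

func resumeSession(on session: ARSession, with data: Data) {
    guard let map = try? NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self,
                                                            from: data) else { return }
    let configuration = ARWorldTrackingConfiguration()
    configuration.initialWorldMap = map // Relocalize into the shared space.
    session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}
```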
When it does work, AR on the iPhone is a portal to another world. It's pretty similar to your world, but there are some extras that really make you wish you could move there. And while I sit here wishing a lot of what we have now was available in something I could wear instead of hold in my hand, the next step for AR on iPhone is going to be huge.
Updated Sept. 2018: This review has been updated with new information on the iPhone XS and XS Max.
Russell is a Contributing Editor at iMore. He's a passionate futurist whose trusty iPad mini is never far from reach. You can usually find him chasing the next tech trend, much to the pain of his wallet. Reach out on Twitter!