How Apple can detect objects in your photos without creeping on your data
As part of the live episode of The Talk Show at WWDC 2016, Apple's senior vice president of software engineering, Craig Federighi, and senior vice president of worldwide marketing, Phil Schiller, provided more details on how deep learning and artificial intelligence are being used in iOS 10 to surface search results without requiring you to share your data with Apple.
How will Apple index my existing photos?
When you first download iOS 10, if you have existing photos in your library, your iPhone or iPad will begin to process them in the background at night when you're plugged in. That way you won't see any performance degradation or excessive power drain during the day when you're trying to use your iPhone or iPad.
Once complete, all your old photos will be indexed for the new, better search.
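Apple hasn't published how Photos schedules this work internally, but the general pattern, deferring heavy processing until the device is on external power, is something third-party apps can approximate with the public BackgroundTasks framework (iOS 13 and later). A rough sketch under those assumptions, with the task identifier and `indexNextBatchOfPhotos` as hypothetical stand-ins:

```swift
import BackgroundTasks

// Hypothetical identifier; a real app also lists it in Info.plist under
// "Permitted background task scheduler identifiers".
let indexingTaskID = "com.example.photo-indexing"

// Stand-in for the real work: process one batch of the library, then report
// whether everything has been indexed.
func indexNextBatchOfPhotos(completion: @escaping (Bool) -> Void) {
    // ... run scene classification over a chunk of photos here ...
    completion(true)
}

// Ask the system to run indexing only while the device is plugged in.
func scheduleOvernightIndexing() {
    let request = BGProcessingTaskRequest(identifier: indexingTaskID)
    request.requiresExternalPower = true        // only while charging
    request.requiresNetworkConnectivity = false // everything stays on-device
    try? BGTaskScheduler.shared.submit(request)
}

// At app launch, register the handler that performs the work when the
// system decides it's a good time (typically overnight, on the charger).
func registerIndexingTask() {
    _ = BGTaskScheduler.shared.register(forTaskWithIdentifier: indexingTaskID, using: nil) { task in
        indexNextBatchOfPhotos { finished in
            if !finished { scheduleOvernightIndexing() } // resume on the next charge
            task.setTaskCompleted(success: finished)
        }
    }
}
```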
What about on macOS?
Same thing. iOS typically gets released a few weeks before macOS (formerly OS X), though, and not everyone with an iPhone or iPad has a Mac, so Apple wants you to be able to enjoy the new search benefits immediately.
When the macOS Sierra update arrives later this fall, it'll index your Mac Photos library the same way.
Wait, won't the search index just sync between devices?
Not right now, but maybe one day. Apple would first need to build a system that securely and privately shares metadata and index information between iPhone, iPad, Mac, and other products.
What about new photos? Do I have to wait for them to index?
Nope! Apple enjoys a tremendous lead when it comes to mobile chipset architecture, and they're "spending" some of it on deep learning and AI processing the moment photos are captured.
The image signal processor (ISP) inside the Apple A9 already handles an incredible number of calculations for everything from white balance to burst selection. Deep learning and AI add a couple billion more, but the A9 GPU can still handle those near-instantly.
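Apple's internal pipeline and models aren't something developers can call directly, but the same "classify on the device, upload nothing" idea is visible in the Vision framework Apple later made available to developers. A minimal sketch, assuming you already have a `CGImage` in hand:

```swift
import Vision
import CoreGraphics

// Classify a single image entirely on-device with the public Vision API
// (iOS 13 / macOS 10.15 and later). No image data leaves the device.
func sceneLabels(for image: CGImage) throws -> [(label: String, confidence: Float)] {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])

    let observations = (request.results ?? []).compactMap { $0 as? VNClassificationObservation }
    // Keep only reasonably confident labels ("mountain", "beach", "dog", ...).
    return observations
        .filter { $0.confidence > 0.3 }
        .map { ($0.identifier, $0.confidence) }
}
```

Running something like that on each new capture as it comes in, rather than in a nightly batch, is what "spending" silicon headroom at capture time looks like in practice.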
Why doesn't Apple need me to send my photo data to its servers to get the indexing done?
According to Federighi, Apple doesn't need our photos to figure out what a mountain looks like in a photo. Their "detectives" were able to figure that out from public domain images instead.
But will it work as well as services that do require photo data sharing?
To be determined. We'll have to wait for iOS 10 to launch this fall and really put it through its paces. Personally, I value privacy as much as money, time, or attention, so having a private option is great for customers.
I don't use Google Photos or Facebook's photo services today, so for me any upgrade will be great. If you aren't sure yet, you'll need to see how much functionality it gives you, weigh the options, and make the best choice for you.