No, Apple's Machine Learning Engine can't surface your iPhone's secrets
Core ML is Apple's framework for machine learning. It lets developers easily integrate artificial intelligence models from a wide variety of formats and use them for tasks like computer vision, natural language processing, and pattern recognition. It does all this on-device, so your data doesn't have to be harvested and stored on someone else's cloud first. That's great for privacy and security, but it doesn't prevent sensationalism:
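To make the on-device point concrete, here's a minimal sketch of what using Core ML looks like. The `ImageClassifier` class name is hypothetical; Xcode generates a class like it from whatever `.mlmodel` file a developer bundles with the app.

```swift
import CoreML

// Sketch only: "ImageClassifier" stands in for any compiled Core ML model
// bundled with an app. Xcode generates the class from the .mlmodel file.
do {
    let configuration = MLModelConfiguration()
    let model = try ImageClassifier(configuration: configuration).model

    // Inference runs entirely on-device. The model describes its own
    // inputs and outputs; nothing is sent to a server.
    for (name, description) in model.modelDescription.inputDescriptionsByName {
        print("input \(name): \(description.type)")
    }
} catch {
    print("Failed to load model: \(error)")
}
```

The key design point: the model file ships inside the app bundle and runs locally, which is exactly why Core ML itself doesn't create a new channel for data to leave the device.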
Wired, in an article I'd argue should never have made it into publication:
It's less likely some people worry and more likely they saw a new technology and figured they could stick it and Apple in a headline and get some attention — at the expense of consumers and readers.
There's no data that an app can access through Core ML that it couldn't already access directly. From a privacy perspective, nothing about the App Store review process changes either. The app has to declare the entitlements and permissions it wants, Core ML or no Core ML.
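As a rough illustration of that permission model: before any code, Core ML-powered or not, can read a user's photos, the app has to declare a usage description (`NSPhotoLibraryUsageDescription`) in its Info.plist and then ask the user at runtime. A minimal sketch:

```swift
import Photos

// Core ML or not, an app only sees the photos the user grants it.
// NSPhotoLibraryUsageDescription must be declared in Info.plist first,
// then the app asks explicitly at runtime:
PHPhotoLibrary.requestAuthorization(for: .readWrite) { status in
    switch status {
    case .authorized, .limited:
        print("App can read the photo library")
    default:
        print("No photo access — nothing for any ML model to scan")
    }
}
```

The gate is the same whether the app wants to run a neural network over the photos or simply upload them, which is the article's point.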
This reads like complete FUD to me: Fear, uncertainty, and doubt designed to get attention and without any factual basis.
It could be everything. Core ML could make it more efficient for an app to find very specific data patterns to extract but, at that point, an app could extract that data and all data anyway.
Theoretically, finding and extracting a few photos might be easier to hide than simply pulling a large number or all photos. So could trickle uploading over time. Or based on specific metadata. Or any other sorting vector.
Just as theoretically, ML and neural networks could be used to detect and combat these kinds of attacks as well.
Also nothing unique to Core ML. Smart spyware would try to convince you to give it all your photos right up front. That way it wouldn't be limited to preconceived models or be at risk of removal or restriction. It would simply harvest all your data and then run whatever server-side ML it wanted to, whenever it wanted to.
That's the way Google, Facebook, Instagram, and similar photo services that run targeted ads against your content already work.
I get that putting Apple in a headline garners more attention, but mentioning Google's TensorFlow Mobile only once, and only as an aside, is curious.
Will is smart. It's great that Wired went to him for a quote and that it was included. It's disappointing that Will's quote was included so far down and unfortunate for all involved that it didn't get Wired to reconsider the piece entirely.
The bottom line here is that, while machine learning could theoretically be used to target specific data, it could only be used in situations where all data is already vulnerable.
Beyond that, Core ML is an enabling technology that can help make computing better and more accessible for everyone, including and especially those who need it the most.
Sensationalizing Core ML, and machine learning in general, makes people who are already fearful or worried about new technologies even less likely to use and benefit from them. And that's a real shame.
Rene Ritchie is one of the most respected Apple analysts in the business, reaching a combined audience of over 40 million readers a month. His YouTube channel, Vector, has over 90 thousand subscribers and 14 million views and his podcasts, including Debug, have been downloaded over 20 million times. He also regularly co-hosts MacBreak Weekly for the TWiT network and co-hosted CES Live! and Talk Mobile. Based in Montreal, Rene is a former director of product marketing, web developer, and graphic designer. He's authored several books and appeared on numerous television and radio segments to discuss Apple and the technology industry. When not working, he likes to cook, grapple, and spend time with his friends and family.