Apple has always put a big emphasis on privacy, easily setting itself apart from data-hungry rivals like Google, Facebook and Amazon. But now the Cupertino company is dipping its toe into data gathering, and it’s already raising concerns among security experts.
Apple is trying something new in iOS 10 called “differential privacy.” The experimental technology promises to gather accurate aggregate usage data without revealing any individual’s private information. Here’s how Apple explains it in the official iOS 10 preview guide:
To obscure an individual’s identity, Differential Privacy adds mathematical noise to a small sample of the individual’s usage pattern. As more people share the same pattern, general patterns begin to emerge, which can inform and enhance the user experience. In iOS 10, this technology will help improve QuickType and emoji suggestions, Spotlight deep link suggestions and Lookup Hints in Notes.
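Apple hasn’t published the details of its implementation, but the core idea behind “adding mathematical noise” can be illustrated with a classic technique called randomized response. In this minimal Python sketch (an illustration of the general concept, not Apple’s actual algorithm), each user randomly flips their answer often enough that no single report reveals the truth, yet the overall rate can still be recovered from a large sample:

```python
import random

def randomized_response(truth: bool) -> bool:
    """Report the true bit half the time; otherwise report a random coin flip.
    Any individual report is deniable, but the aggregate remains recoverable."""
    if random.random() < 0.5:
        return truth
    return random.random() < 0.5

def estimate_true_fraction(reports) -> float:
    """Invert the noise: P(report True) = 0.5*p + 0.25, so p = 2*observed - 0.5."""
    observed = sum(reports) / len(reports)
    return 2 * observed - 0.5

# Simulate 100,000 users, 30% of whom actually share some usage pattern
# (say, a particular emoji substitution).
random.seed(42)
true_bits = [random.random() < 0.30 for _ in range(100_000)]
reports = [randomized_response(b) for b in true_bits]
print(round(estimate_true_fraction(reports), 2))  # close to 0.30
```

The trade-off Apple is navigating is visible even in this toy version: the noise that protects each user also adds statistical error, which only washes out once enough people share the same pattern.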
Apple’s software engineering chief Craig Federighi also mentioned it briefly during WWDC 2016, claiming the company has made big strides in developing the mostly theoretical technology. “Apple has been doing some super-important work in this area to enable differential privacy to be deployed at scale,” he said.
It’s possible Apple has made some huge advancements in differential privacy, but not everyone is convinced. Matthew Green, a cryptography professor at Johns Hopkins University, quickly chimed in with his opinion.
Speaking to Gizmodo, Green added that differential privacy is still very experimental. “It’s a really neat idea,” he said, “but I’ve never really seen it deployed.”
If Apple can pull this off, it may score a huge win against Google, offering smarter data-driven suggestions without compromising on privacy. But the alternative could mean inaccurate predictions or, even worse, a lapse in security. We won’t know either way until iOS 10 gets into enough hands for differential privacy to start working, and that’s not exactly a comforting thought.