How will “privacy” limit Apple?  

Apple’s been creeping towards privacy and encryption as a differentiator for some time, but last week’s address to EPIC was explicit. Tim Cook accused their neighbors of “lulling their customers into complacency about their personal information” and the digerati have taken note.

The result is some long-overdue public analysis of Google’s learn-everything-and-personalize strategy versus Apple’s. A false dilemma has emerged, driving concern that Apple will pursue privacy at the expense of product quality. Dustin Curtis offers a thoughtful example of this narrative, in which he wisely reframes the question from privacy to security1 and points out that what Google explicitly sells is user attention rather than data. Representative of the zeitgeist, he worries about “vast improvements in user experience” that Google’s aggregation of user data enables or soon will.

So how might Apple’s stance prevent it from providing users the best experience?

Personalized content selection is the most obvious use of user profile and behavior data, but Apple is clearly not against this, given the “over 400 targeting options” they tout for iAd.2 Google can likely target more precisely by sniffing communications, but at the risk of crossing the creepy line. As Netflix and Amazon have shown, users tend to prefer a clear, explicit relationship between observation and recommendation.

Usage data for product development is increasingly important. Third-party developers and Apple itself benefit when users opt in to diagnostic sharing, but a privacy-steward filter on what they’re willing to collect could slow the discovery of product flaws and opportunities. So far, Apple’s designers have been savvy enough, but I often find myself hoping they’re paying close attention when I give up on Apple Maps and switch to Google’s.

Contextual interactions like those of Google Now are certainly easier to design if you assume a social contract that lets you feed users’ every move into any algorithm, but Proactive may be a more nuanced approach, one that makes the sharing contract explicit. Google’s all-knowing-cloud approach commands respect, but software agents don’t have to be monolithic. Even if Apple doesn’t know where I have been today, my instance of Siri might.
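
To make that distinction concrete, here’s a minimal sketch of what a non-monolithic agent might look like. Everything here is my own illustration — `LocalAgent` and its methods are hypothetical, not any Apple API — but it shows how an assistant could keep location history entirely on the device and still answer contextual questions, without the vendor ever seeing the data:

```swift
import Foundation
import CoreLocation

// Hypothetical on-device agent: location history never leaves the phone.
// `LocalAgent` is illustrative only, not a real framework type.
final class LocalAgent {
    // History lives in process memory (or on-device storage);
    // nothing is synced to a server.
    private var visits: [CLLocation] = []

    func record(_ location: CLLocation) {
        visits.append(location)
    }

    // Contextual questions are answered locally, so the agent can
    // know where I’ve been today even though the vendor does not.
    func wasNear(_ place: CLLocation, within meters: CLLocationDistance) -> Bool {
        visits.contains { $0.distance(from: place) < meters }
    }
}
```

The design choice the sketch implies is the interesting part: the intelligence stays with the user’s instance of the agent, and any sharing beyond the device becomes an explicit, opt-in step rather than a default.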

Population-scale learning simply requires mass data collection. As Nick Heer points out, Apple’s stance will certainly make it more difficult for them to develop, say, a cat-recognition algorithm. But ResearchKit shows that they understand the power of collecting data at scale. For now, at least, it seems they would rather mediate a trust relationship between users and third parties for sensitive services like health and money, even if that means passing up the chance to keep what’s learned as their own secret. It’s conceivable they could do the same for communications. (It could be argued they already have, by introducing a popular phone and allowing third parties to host messaging platforms on it.)

Each of these is certainly easier with Google’s monolithic approach, but I haven’t yet seen a compelling argument that Apple is passing up any UX opportunities by declining to aggregate my private data.


  1. Curtis glosses over a major issue: “So as long as I trust Google’s employees, the only two potential breaches of my privacy are from the government or from a hacker.” At least in Europe, this doesn’t feel like only two little potentialities. Even in the US, there is some evidence that consumer sentiment on privacy is shifting, and if Apple has decided it wants to hasten this shift, their influence should not be underestimated. 

  2. At launch in 2010, they listed specifically “demographics, application preferences, music passions, movie genre interests, television genre interests, location.” 

 