To the Editor:
Facebook founder and CEO Mark Zuckerberg spent 10 hours, over the course of two days, testifying before both houses of Congress last week. That news rightly took a back seat to reports of a grisly chemical weapons attack against people in the Syrian town of Douma, and of President Trump’s promises, now partly carried out, of a “very tough” military response. In that context, it might seem offensive to discuss the travails of a social media company and service that billions of us use and that many of those same billions disdain. But given Facebook’s size, its reach and what it represents, last week’s hearings, even if more performance than substance, are also worth caring about. The insightful students in “Ethics In A Digital World,” a philosophy course I’m teaching this semester, identified three lessons we should take from last week.
First, telling people to “just quit Facebook” is unhelpful. For many people in the U.S. and beyond, Facebook is “the internet,” or at least the web. Not everyone has the option to “just leave” a site that they rely on to remain connected to, and supported by, communities they’ve formed there.
Second, telling people to “just quit Facebook” is oblivious. Quitting Facebook doesn’t remove “you” or your information from Facebook’s shadow profiles, or from the various third parties that have already collected and aggregated it. Facebook’s repeated reassurances that “users can control their data” are empty. We can’t control what we don’t know is happening. The advice is further insulting because it puts the onus on Facebook’s users rather than on the company and its business model.
Third, it’s not just Facebook. The kinds of regulation, or even basic restraint, that many are demanding require a fundamental rethinking of what companies like Facebook, Google and Amazon are, and of whether the internet that they have constructed really is the one that we and our societies need.
Vance Ricks, Associate Professor of Philosophy