
1. Boo, Facebook
People are starting to dig through the trove of documents known as the “Facebook Papers” and what they are finding is both shocking and totally unsurprising.
Let’s start with the shocking stuff.
Facebook is used to facilitate human trafficking:
Facebook has for years struggled to crack down on content related to what it calls domestic servitude: "a form of trafficking of people for the purpose of working inside private homes through the use of force, fraud, coercion or deception," according to internal Facebook documents reviewed by CNN.
Facebook is used by parties stoking political violence:
Facebook employees repeatedly sounded the alarm on the company's failure to curb the spread of posts inciting violence in "at risk" countries like Ethiopia, where a civil war has raged for the past year, internal documents seen by CNN show.
The social media giant ranks Ethiopia in its highest priority tier for countries at risk of conflict, but the documents reveal that Facebook's moderation efforts were no match for the flood of inflammatory content on its platform.
Facebook has a business interest in pushing American users toward radicalizing content:
In summer 2019, a new Facebook user named Carol Smith signed up for the platform, describing herself as a politically conservative mother from Wilmington, North Carolina. Smith’s account indicated an interest in politics, parenting and Christianity and followed a few of her favorite brands, including Fox News and then-President Donald Trump.
Though Smith had never expressed interest in conspiracy theories, in just two days Facebook was recommending she join groups dedicated to QAnon, a sprawling and baseless conspiracy theory and movement that claimed Trump was secretly saving the world from a cabal of pedophiles and Satanists.
Smith didn’t follow the recommended QAnon groups, but whatever algorithm Facebook was using to determine how she should engage with the platform pushed ahead just the same. Within one week, Smith’s feed was full of groups and pages that had violated Facebook’s own rules, including those against hate speech and disinformation.
Smith wasn’t a real person. A researcher employed by Facebook invented the account, along with those of other fictitious “test users” in 2019 and 2020, as part of an experiment in studying the platform’s role in misinforming and polarizing users through its recommendations systems.
That researcher said Smith’s Facebook experience was “a barrage of extreme, conspiratorial, and graphic content.”
In India, it’s even worse:
On Feb. 4, 2019, a Facebook researcher created a new user account to see what it was like to experience the social media site as a person living in Kerala, India.
For the next three weeks, the account operated by a simple rule: Follow all the recommendations generated by Facebook’s algorithms to join groups, watch videos and explore new pages on the site.
The result was an inundation of hate speech, misinformation and celebrations of violence, which were documented in an internal Facebook report published later that month.
“Following this test user’s News Feed, I’ve seen more images of dead people in the past three weeks than I’ve seen in my entire life total,” the Facebook researcher wrote.
Much of what is spreading on Facebook is not organic content, but organized disinformation:
Meet SUMAs: a smattering of accounts run by a single person using their real identity, known internally at Facebook as Single User Multiple Accounts. And a significant swath of them spread so many divisive political posts that they’ve mushroomed into a massive source of the platform’s toxic politics, according to internal company documents and interviews with former employees.
Company research from March 2018 said accounts that could be SUMAs were reaching about 11 million viewers daily, or about 14 percent of the total U.S. political audience.
Are you shocked? You should be shocked. This is bad. Very bad. Very, super-double bad.
But also, you should not be surprised.
Here’s the obvious, not-even-hiding-in-plain-sight explanation:
Growth über alles. Obviously. Of course. A publicly traded company whose income relies on increasing growth did everything possible to increase growth, irrespective of the external consequences to the world around it.
This isn’t even a dog-bites-man story. It’s a sky-is-blue, water-is-wet story. Facebook literally told us this was their prime directive. Please go back to this Buzzfeed piece from 2018 about Andrew Bosworth, who has since been elevated to CTO at Facebook. In the piece, Buzzfeed published a memo Bosworth sent to employees explaining exactly what Facebook’s guiding principle was. I’m going to excerpt the memo heavily, but it’s not long and you should read the whole thing:
We connect people.
That can be good if they make it positive. Maybe someone finds love. Maybe it even saves the life of someone on the brink of suicide.
So we connect more people.
That can be bad if they make it negative. Maybe it costs a life by exposing someone to bullies. Maybe someone dies in a terrorist attack coordinated on our tools.
And still we connect people.
The ugly truth is that we believe in connecting people so deeply that anything that allows us to connect more people more often is *de facto* good. . . .
That isn’t something we are doing for ourselves. Or for our stock price (ha!). It is literally just what we do. We connect people. Period.
Thatās why all the work we do in growth is justified. All the questionable contact importing practices. All the subtle language that helps people stay searchable by friends. All of the work we do to bring more communication in. . . .
The natural state of the world is not connected. It is not unified. It is fragmented by borders, languages, and increasingly by different products. The best products don’t win. The ones everyone use win.
I know a lot of people don’t want to hear this. Most of us have the luxury of working in the warm glow of building products consumers love. But make no mistake, growth tactics are how we got here. . . .
In almost all of our work, we have to answer hard questions about what we believe. . . . But connecting people. That’s our imperative. Because that’s what we do. We connect people.
As I said: The details are shocking. But the story is completely unsurprising. This is what Facebook is and they have never pretended otherwise.
2. Should We Fix Facebook? Could We?
Neither of these questions is rhetorical.
Lots of people get nervous about regulating a business that deals in information in ways they wouldn’t give a second thought to for a business that made or sold car seats, or gasoline, or handguns.
If Facebook were a coal mine making a lot of money by selling coal, but with toxic runoff from the mine harming neighboring public or private properties or waterways, there would be a host of regulations brought to bear against it.
That’s because in the physical world we recognized that economic transactions have externalities which can’t be captured organically by market forces. So we use government regulation to price in those externalities. Or to impose some societal constraints on transactions which create consequences for third parties.
Why shouldnāt that be true of an information business? Why donāt we see these downstream effects of Facebookās business as the equivalent of toxic runoff from a coal mine?
The answer is time: Society did not always believe that coal mines were responsible for their toxic runoff. In fact, part of the reason we had coal barons is that capital owners were able to get rich precisely because the externalities of their businesses were not priced in. They were getting free lunches.
But over time, society developed a pretty solid consensus that a business canāt just do whatever it wants.
And I donāt mean to pick on coal mines, because this is true of lots of businesses: Pharmaceuticals, distilling, manufacturing, waste disposal, transportation.
A company can’t bottle a chemical substance and sell it to whoever it likes as “medicine.” It can’t just buy any plot of land and build a power plant.
Information technology is not materially different just because it deals in 1s and 0s.
So yes, if Facebook can be made to price in the cost of its external effects, then we should do it. And if it cannot, then we should regulate it.
But how?
No one really knows.
Will Oremus has some ideas for governing the use of algorithms, but these are small-ball and also only address what Facebook is doing in America.
Charlie Warzel points out how hard it would be to “fix” Facebook:
I’m concerned that the “fixes” that could come from this momentum are going to be extremely treacherous, too. I’m also concerned that we’re late (not too late . . . just late). It strikes me as noteworthy that we’ve caught up to what Facebook hath wrought (2012-2020) and Mark Zuckerberg and executive leadership seem to regard that version of Facebook as almost an outdated node of the company. They’ve got a new digital realm to colonize: The Metaverse!
Which brings me to the final point: Maybe we should view the current incarnation of Facebook as a teachable moment and do what we can to triage the problems it has created, but keep our real focus on establishing a framework for the next iteration of social media while it is still in its embryonic stage.
3. No, Really: Let’s Try to Fix It Now.
That’s what Nicholas Carr argues in a monster piece in the New Atlantis:
The mass media age began quietly on Christmas Eve of 1906, when the inventor Reginald Fessenden aired a program of carols and Bible readings from a makeshift radio studio he’d built on the Massachusetts coast a few miles north of Plymouth. Radio had been invented a decade earlier by the Italian electrical engineer Guglielmo Marconi, but its use had been restricted to sending Morse signals to places telegraph lines couldn’t reach, like ships at sea. Fessenden was the first to use “the wireless” to broadcast what we would now call a show. . . .
When the telegraph and the telephone arrived, they may have been new things in the world, but they had an important precedent in the mail system. Radio broadcasting had no such precedent. For the first time, a large, dispersed audience could receive the same information simultaneously and without delay from a single source. As would be the case with the Internet nearly a century later, the exotic new medium remained in the hands of tinkerers and hobbyists during its early years. Every evening, the airwaves buzzed with the transmissions of tens of thousands of amateur operators . . .
The amateurs – adolescent boys, many of them – played a crucial role in the development of radio technology, and most used their sets responsibly. But some, in another foreshadowing of the net, were bent on mischief and mayhem. Shielded by anonymity, they would transmit rumors and lies, slurs and slanders. The U.S. Navy, which relied on radio to manage its fleet, was a prime target. Officers “complained bitterly,” Slotten reports, “about amateurs sending out fake distress calls or posing as naval commanders and sending ships on fraudulent missions.”
The nuisance became a crisis in the early morning hours of April 15, 1912, when the Titanic sank after its fateful collision with an iceberg. Efforts to rescue the passengers were hindered by a barrage of amateur radio messages. The messages clogged the airwaves, making it hard for official transmissions to get through. Worse, some of the amateurs sent out what we would today call fake news, including a widely circulated rumor that the Titanic remained seaworthy and was being towed to a nearby port for repairs. . . .
Although European countries had begun imposing government controls on wireless traffic as early as 1903, radio had been left largely unregulated in the United States. The public and the press, dazzled by the magical new technology, feared that bureaucratic meddling would stifle progress. Government intervention “would hamper the development of a great modern enterprise,” the New York Times opined in an editorial just three weeks before the Titanic’s sinking. “The pathways of the ether should not be involved in red tape.”
The Titanic tragedy changed everything . . . Four months later, Congress passed the Radio Act of 1912.
Politics isn’t just politics. It’s technology. It’s society. It’s everything around us. If you want to get smart about it, sign up for my daily newsletter at Bulwark+.