Why these Facebook research scandals are different

How the company found itself in its biggest crisis since Cambridge Analytica

(Bernard Hermant / Unsplash)

I.

A week ago, the Wall Street Journal began to publish a series of stories about Facebook based on the internal findings of the company’s researchers. The Facebook Files, as they are known, lay out a dizzying number of problems unfolding on the world’s biggest social network.

The stories detail an opaque, separate system of governance for elite users known as XCheck; provide evidence that Instagram can be harmful to a significant percentage of teenage girls; and reveal that entire political parties have changed their policies in response to changes in the News Feed algorithm. The stories also uncovered a massive gap between the resources Facebook devotes to moderating content in the United States and those it devotes to other countries.

The stories have galvanized public attention, and members of Congress have announced a probe. And scrutiny is growing as reporters at other outlets contribute material of their own.

For instance: MIT Technology Review found that despite Facebook’s significant investment in security, by October 2019, Eastern European troll farms reached 140 million people a month with propaganda — and 75 percent of those users saw it not because they followed a page but because Facebook’s recommendation engine served it to them. ProPublica investigated Facebook Marketplace and found thousands of fake accounts participating in a wide variety of scams. The New York Times revealed that Facebook has sought to improve its reputation in part by pumping pro-Facebook stories into the News Feed, an effort known as “Project Amplify.” (To date this has only been tested in three cities, and it’s not clear whether it will continue.)

Most Facebook scandals come and go. But this one feels different from the scandals of the past, because it has been led by Facebook’s own workforce.

II.

The last time Facebook found itself under this much public scrutiny was 2018, when the Cambridge Analytica data privacy scandal rocked the company. It was a strange scandal for many reasons, not least of which was the fact that most of its details had been reported years previously. What turned it into an international story was the idea that political operatives had sought to use Facebook’s vast trove of demographic data in an effort to manipulate Americans into voting for Donald Trump.

Today nearly everyone agrees that what Cambridge Analytica called “psychographic targeting” was overblown marketing spin. But the idea that Facebook and other social networks are gradually reshaping whole societies with their data collection, advertising practices, ranking algorithms and engagement metrics has largely stuck. Facebook is an all-time great business because its ads are so effective in getting people to buy things. And the company wants us to believe it isn’t similarly effective at getting people to change their politics?

There’s a disconnect there, one that the company has never really resolved.

Still, it plowed $13 billion into safety and security. It hired 40,000 people to police the network. It developed a real aptitude for disrupting networks of fake accounts. It got more comfortable inserting high-quality information into the News Feed, whether about COVID-19 or climate change. When the 2020 US presidential election was over, Facebook was barely a footnote in the story.

But basic questions lingered. How is the network policed, exactly? Are different countries policed equitably? And what does looking at a personalized feed every day do to a person, or to a country and its politics?

As always, there’s a risk of being a technological determinist here: to assume that Facebook’s algorithms are more powerful than they are, or operate in a vacuum. Research that I’ve highlighted in this column has shown that often, other forces can be even more powerful — Fox News, for example, can inspire a much greater shift in a person’s politics.

For a lot of reasons, we would all stand to benefit if we could better isolate the effect of Facebook — or YouTube, or TikTok, or Twitter — on the larger world. But because the companies keep their data private, for reasons both good and bad, we spend a lot of time arguing about subjects for which we have little empirical grounding. We talk about what Facebook is based on how Facebook makes us feel. And so Facebook and the world wind up talking past each other.

At the same time, and to its credit, Facebook did allocate some resources to investigating some of the questions on our minds. Questions like, what is Instagram doing to teenage girls?

In doing so, Facebook planted the seeds of the current moment. The most pressing questions in the recent reporting come down to the one Cambridge Analytica raised — what is this social network doing to us? But unlike with that story, this time we have real data to look at — data that Facebook itself produced.

III.

When I talk to some people at Facebook about some of this, they bristle. They’ll say: reporters have had it out for us forever; the recent stories all bear more than a faint trace of confirmation bias. They’ll say: just because one researcher at the company says something doesn’t mean it’s true. They’ll ask: why isn’t anyone demanding to see internal research from YouTube, or Twitter, or TikTok?

Perhaps this explains the company’s generally dismissive response to all this reporting. The emotional, scattered Nick Clegg blog post. The CEO joking around about it. The mainstream media — there they go again.

To me, though, the past week has felt like a turning point.

By now, most of the Facebook researchers who have spoken publicly about the company have said that their research was largely stymied or ignored by their superiors. And what we have read of their research suggests that the company has often acted irresponsibly.

Sometimes this is unintentional — Facebook appears to have been genuinely surprised by the finding that Instagram seems to be responsible for a rise in anxiety and depression among teenage girls.

Other times, the company acted irresponsibly with full knowledge of what it was doing, as when it allocated vastly more resources to removing misleading content in the United States than it did in the rest of the world.

And even in the United States, it arguably under-invested in safety and security. As Samidh Chakrabarti, who ran Facebook’s civic integrity team until this year, put it, the company’s much-ballyhooed $13 billion investment represents about 4 percent of revenue.
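
(A rough check, assuming the $13 billion covers spending since 2016, which is how the company has framed the figure: Facebook’s reported revenue from the start of 2016 through mid-2021 totals roughly $335 billion, and 13 divided by 335 comes out to just under 4 percent.)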

Despite all this, of course, Facebook is thriving. Daily users are up 7 percent year over year. Profits are up. The post-pandemic ad business is booming so hard that even digital ad also-rans like Pinterest and Twitter are having a banner year. And Facebook’s hardware business is quietly turning into a success, potentially paving a road from here all the way to the metaverse.

But still that question nags: what is this social network doing to us? It now seems apparent that no one at the company, or in the world at large, has really gotten their arms around it. And so the company’s reputation is once again in free fall.

One natural reaction to this state of affairs, if you were running the company, would be to do less research: no more negative studies, no more negative headlines! What’s Congress going to do, hold a hearing? Who cares. Pass a law? Not this year.

When Facebook moved this week to make it harder for people to volunteer their own News Feed data to an external research program, it signaled that this is the direction it is heading.

But what if it did the reverse? What if it invested dramatically more in research, and publicly pressured its peers to join it? What if Facebook routinely published its findings and allowed its data to be audited? What if the company made it dramatically easier for qualified researchers to study the platform independently?

This would be unprecedented in the history of American business, but Facebook is an unprecedented thing in the world. The company can’t rebuild trust with the larger world through blog posts and tweet storms. But it could start by helping us understand its effects on human behavior, politics, and society.

That doesn’t seem to be the way things are going, though. Instead, the company is doing different kinds of research — research like, what happens if we show people good news about Facebook? I’m told one story that appeared in the recent test informed users of an incident in which the social network helped a woman find her lost horse. Maybe that will move the needle.

But I shouldn’t joke. There’s a real idea embedded in that test: that over time, you can reshape perception through the narratives you promote. That what appears in the News Feed may be able to shift public opinion, gradually, toward the views of whoever runs the feed.

It is the suspicion that the News Feed can drive such changes that has motivated so much of the company’s own research, and so many of the fears about its influence, even as Facebook’s PR machine has relentlessly downplayed the possibility.

But now the company has decided to see for itself. To the public, it will insist it can’t possibly be as powerful as its apostate researchers say it is.

And then, with Project Amplify, Facebook will attempt to see if they might actually be right.


Elsewhere: Facebook CTO Mike “Schrep” Schroepfer announced today that he’s stepping down next year after 13 years at the company. He’ll be replaced by Andrew “Boz” Bosworth, who will continue to manage Facebook Reality Labs as well.


Governing

Apple won’t let Epic Games back into the App Store until all its appeals are exhausted, Epic CEO Tim Sweeney said. That process is expected to take years. (Russell Brandom and Adi Robertson / The Verge)

The next version of iOS will let you add your vaccination card to the Wallet app. Hopefully you’ll be able to find it amid all your leftover boarding passes from 2019. (Juli Clover / MacRumors)

Microsoft uncovered a giant phishing-as-a-service operation. For just $800, these criminals will set up an entire phishing scam for you. I blame Moore’s Law! (Catalin Cimpanu / The Record)

The Justice Department is investigating Zoom’s acquisition of American customer service company Five9 on national security grounds. Zoom faces multiple ongoing investigations regarding data usage and its ties to China. (Kate O’Keeffe, Aaron Tilley and Dawn Lim / Wall Street Journal)

A profile of Johannes Caspar, who spent 12 years as the data protection commissioner for the German city-state of Hamburg. While in the role, he developed a reputation as one of Europe’s foremost privacy hawks, leading investigations into Facebook and others. (Cathrin Schaer / Wired)


Industry

Facebook says Apple’s App Tracking Transparency feature is hurting its business. A rare mid-quarter update to the business, though Facebook has issued this warning before. (Sara Fischer / Axios)

“Tim Cook says employees who leak memos do not belong at Apple, according to leaked memo.” No leaked memo is ever as satisfying as the please-don’t-leak leaked memo. (Zoe Schiffer / The Verge)

“Most assets sold on OpenSea in the last 90 days haven't seen another deal since.” While NFT mania has swept the globe, the vast majority of buyers and sellers aren’t making money. (Justina Lee and Madeline Campbell / Bloomberg)

But: Dapper Labs, the company behind NBA Top Shot, raised $250 million. The company is expanding into soccer-related NFTs through a partnership with La Liga. (Vildana Hajric / Bloomberg)

Twitch reached a partnership with music publishers but stopped short of a licensing deal. “The agreement includes a financial settlement to account for Twitch's past usage of music on the platform, according to a source familiar with the matter, as well as a time window during which the two parties will negotiate an arrangement for handling music use on Twitch going forward.” (Tatiana Cirisano / Billboard)

Kids today don’t know what files are. And STEM education is being rewritten as a result. Hilarious story on a previously hidden generational divide. (Monica Chin / The Verge)



Talk to me

Send me tips, comments, questions, and Facebook opinions: casey@platformer.news.