Facebook rethinks COVID misinformation
The company has banned anti-vaxx content for two years. Now it wants a second opinion
Today, let’s talk about a settled question that Meta has decided to re-open: what should the company do about misinformation related to COVID-19?
Since the earliest days of the pandemic, Meta has sought to remove false claims about the disease from Facebook and Instagram. And for just as long, the company has faced criticism that it hasn’t done a very good job. A year ago this month, asked about the role “platforms like Facebook” played in spreading misinformation about the disease, President Biden said “they’re killing people” — though he walked his statement back a day later.
Still, Biden voiced a fear that is deeply held among Meta critics: that the platform’s huge user base and algorithmic recommendations often combine to help fringe conspiracy theories reach huge mainstream audiences, promoting vaccine hesitancy, resistance to wearing masks, and other public health harms.
The pandemic is not close to over — an estimated 439 people died of COVID in the past day, up 34 percent in the past two weeks. And highly infectious Omicron subvariants continue to tear through the country, raising fears of a surge in cases of long COVID — a condition that experts say has already been “a mass disabling event.” An estimated 1 in 13 American adults reported having long COVID symptoms earlier this month, according to the U.S. Centers for Disease Control and Prevention.
Despite that, Meta is now considering whether to relax some of the restrictions it has placed on COVID-related misinformation, including whether to continue removing posts containing false claims about vaccines, masks, social distancing, and related subjects. It has asked the Oversight Board — an independent group funded by Meta to help it make difficult calls relating to speech — for an advisory opinion on how to proceed.
Nick Clegg, the company’s president of global affairs, explained Tuesday in a blog post:
In many countries, where vaccination rates are relatively high, life is increasingly returning to normal. But this isn’t the case everywhere and the course of the pandemic will continue to vary significantly around the globe — especially in countries with low vaccination rates and less developed healthcare systems. It is important that any policy Meta implements be appropriate for the full range of circumstances countries find themselves in.
Meta is fundamentally committed to free expression and we believe our apps are an important way for people to make their voices heard. But some misinformation can lead to an imminent risk of physical harm, and we have a responsibility not to let this content proliferate. The policies in our Community Standards seek to protect free expression while preventing this dangerous content. But resolving the inherent tensions between free expression and safety isn’t easy, especially when confronted with unprecedented and fast-moving challenges, as we have been in the pandemic. That’s why we are seeking the advice of the Oversight Board in this case. Its guidance will also help us respond to future public health emergencies.
For all the criticism Meta has received over its enforcement of health misinformation, by some measures the steps it took clearly had a positive effect on the platform. The company estimates it has taken down more than 25 million posts under its stricter policies, which now require the removal of posts making any of 80 separate false claims about the disease and its vaccines.
At the same time, the platform arguably has at times overreached. In May 2021, I wrote about Meta’s decision to reverse an earlier ban on discussing the possibility that COVID-19 leaked from a Chinese lab. The company made that decision amidst a spike in hateful violence against Asian people, fearing that conspiracy theories related to the disease’s origin could be used to justify further attacks.
But as debate about the virus’ origin intensified, Meta began allowing people to speculate again. (To date, no consensus on the issue has emerged.) I wrote at the time that the company probably should not have taken a position on the issue in the first place, instead using its existing hate-speech policies to moderate racist posts:
I generally favor an interventionist approach when it comes to conspiracy theories on social networks: given the harm done by adherents to QAnon, Boogaloo, and other extremist movements, I see real value in platforms reducing their reach and even removing them entirely.
On some questions, though, platform intervention may do more harm than good. Banning the lab-leak hypothesis gave it the appearance of forbidden knowledge, when acknowledging the reality — that it is unlikely, but an open question — may have been just dull enough to prevent it from catching fire in those fever swamps.
Last week, I asked Clegg why the company had decided to ask the board for a second opinion on health misinformation now. One, he said, Meta assumes there will be future pandemics that bring with them their own set of policy issues. The company wants to get some expert guidance now so it can act more thoughtfully the next time around. And two, he said, the Oversight Board can take months to produce an opinion. Meta wanted to get that process started now.
But more than anything, he said, the company wanted a check on its power — to have the board, with which it signed a new three-year, $150 million operating deal this month, weigh in on what have been some fairly stringent policies.
“This was a very dramatic extension of our most exacting sanction,” Clegg told me. “We haven't done it on this scale in such a short period of time before. … If you have awesome power, it is all the more important that you exercise that awesome power thoughtfully, accountably, and transparently. It would be curious and eccentric, in my view, not to refer this to the Oversight Board.”
Indeed, weighing in on policies like this is one of the board’s two core duties. The primary duty is to hear appeals from users who believe their posts were wrongly removed and should be restored, or that posts wrongly left up should be taken down. When the board takes those cases, its decisions are binding, and Meta has so far always honored its findings.
The board’s other key duty is to offer opinions on how Meta ought to change its policies. Sometimes it attaches those opinions to decisions in individual cases; other times, as with the COVID policies, Meta asks the board about something. Unlike cases about single posts, the board’s opinions here aren’t binding — but to date, Meta has adopted roughly two-thirds of the changes the board has proposed.
Some people continue to write the board off anyway. Since even before it began hearing cases in 2020, the board has been subject to withering complaints from critics who argue that it serves as little more than a public-relations exercise for a company so beleaguered it had to change its name last year.
And yet it’s also clear that Meta and other social platforms have a profound need for the kind of rudimentary justice system a board like this can provide. In its first year, the board received 1.1 million appeals from Meta’s users. Before the board existed, those users had no recourse beyond some limited automated systems when Facebook made a mistake. And every tough call about speech was ultimately made by one person — Mark Zuckerberg — with no room for appeal.
It seems obvious to me that a system where these cases are heard by an expert panel, rather than a lone CEO, is superior, even if it still leaves much to be desired.
So what happens now?
One possibility is that Meta’s policy teams want to relax restrictions on speech related to COVID policy, but want the cover that a decision from the Oversight Board would give them. They have reason to believe the board might come to that conclusion: it was stocked with free-speech advocates, and when it has ruled against Meta, it has generally been in the name of restoring posts it believes were wrongfully removed.
That said, the company will also likely be in for a drubbing from left-leaning politicians and journalists, along with some number of users, if the board gives it the go-ahead to relax its policies and the company does so. Clegg told me that, should that happen, Facebook and Instagram would use other measures to reduce the spread of misinformation — adding fact-checks, for example, or reducing the distribution of false posts in feeds. But the mere existence of anti-vaxx content on Meta’s platforms will lead to new criticism — and possibly new harms.
Another possibility is that the board won’t take the bait. Members could argue that removing health misinformation, while a drastic step, continues to be a necessary one — at least for now. The board remains relatively new, and mostly unknown to the general public, and I wonder what appetite members have to stand up for people’s right to spread lies about vaccines.
Whatever the board decides, Clegg said, Meta will move cautiously with any changes. At the same time, he said, the company wants to be judicious in how it deletes user posts.
“I think you should deploy the removal sanction very carefully,” he said. “You should set the bar really high. You don't want private-sector companies to be removing stuff unless it really is demonstrably related to imminent, real-world harm.”
Elsewhere at the Oversight Board: The board accepted a case from a US-based couple who identify as transgender and non-binary and who discussed upcoming surgery; Meta removed the posts for allegedly violating its policy against sexual solicitation. The board will also hear a case involving the common and problematic practice of law enforcement agencies requesting that posts be removed from Facebook, not for breaking the law but for violating Facebook’s policies.
Meta prepares for war
On Tuesday, a new Meta legend was born. His name is Gary from Chicago, and after CEO Mark Zuckerberg recently delivered an impassioned address to employees telling them they would need to work harder to rebuild their business and catch up with competitors, Gary asked the first prerecorded question.
Alex Heath and David Pierce tell the rest in The Verge:
“Hi there,” the first prerecorded employee question started. “I’m Gary, and I’m located in Chicago.” His question: would Meta Days — extra days off introduced during the pandemic — continue in 2023?
Zuckerberg appeared visibly frustrated. “Um… all right,” he stammered. He’d just explained that he thought the economy was headed for one of the “worst downturns that we’ve seen in recent history.” He’d already frozen hiring in many areas. TikTok was eating their lunch, and it would take over a year and a half before they had “line of sight” to overtaking it.
And Gary from Chicago was asking about extra vacation days?
“Given my tone in the rest of the Q&A, you can probably imagine what my reaction to this is,” Zuckerberg said. After this year, Meta Days were canceled.
Meta Days are canceled, Gary, and the news gets worse from there. “Realistically, there are probably a bunch of people at the company who shouldn’t be here,” Zuckerberg said at the June 30 meeting. Later, he convened his top deputies for an extended “work-a-thon” to develop a response to the company’s recent reversal of fortune, Mike Isaac reported at the New York Times.
Meta reports earnings Wednesday, and many are predicting the company’s first-ever decline in revenue. Ahead of that news, Insider reports that employees expect head count could be cut by up to 10 percent. Meta also said Tuesday it would raise the price of the 2-year-old Quest 2 virtual reality headset by $100, to $399, in an effort to lose less money on the device.
For his part, Zuckerberg also sold his San Francisco house.
Meanwhile, Instagram chief Adam Mosseri posted a video in which he acknowledged that the company’s video recommendations aren’t very good yet. But while he sought to reassure users in the wake of Kylie Jenner’s criticisms earlier in the week, Mosseri reiterated that the future of Instagram is video — and algorithmically recommended video — like it or not.
In Axios, Scott Rosenberg wondered if all this represents “the sunset of the social network.” It certainly feels like the sunset of something.
Governing
- ByteDance’s now-defunct English language news app, TopBuzz, pushed pro-China content on American users, according to four former employees. The company strongly denied the charges, but expect this to fuel a new debate over the future of TikTok. (Emily Baker-White / BuzzFeed)
- The Securities and Exchange Commission is investigating whether Coinbase “improperly let Americans trade digital assets that should have been registered as securities,” causing its stock to sink 9.2 percent. (Allyson Versprille and Lydia Beyoud / Bloomberg)
- Elon Musk’s lawyers sent a letter to the Delaware Chancery Court saying that Twitter is making his trial preparations impossible. It’s not a persuasive letter. But they did send it! (David Pierce / The Verge)
- Here’s a nice, graphics-driven piece on the potential impact of the American Innovation and Choice Online Act, which would reshape competition among big tech platforms. (Leah Nylen / Bloomberg)
- Congress is considering a bill that would ban contracts with foreign companies that make spyware and allow the president to impose sanctions on firms that target intelligence agencies with it. (Suzanne Smalley / CyberScoop)
- A bill that would regulate stablecoins has been delayed “for at least several weeks.” (Andrew Ackerman / Wall Street Journal)
- The United Kingdom’s Competition and Markets Authority expressed general satisfaction with the state of competition in the music streaming market, which … what? (Katharine Gemmell and Hugo Miller / Bloomberg)
- Apple, Google, Microsoft, and Amazon have been buying gold illegally mined in Brazil, according to a new report. (Filipe Espósito / 9to5Mac)
- A college student whose nudes were stolen got revenge on her attacker by tricking him into revealing his IP address, allowing police to make an arrest. He had targeted 300 women this way; Snapchat still hasn’t helped this woman recover her account from her attacker. (Jeff Stone / Bloomberg)
Industry
- Google reported its slowest quarterly revenue growth in two years amid a pullback in spending by some advertisers. (Miles Kruppa / Wall Street Journal)
- YouTube revenue was up 4.8 percent, also reflecting the slowest growth in two years. (Todd Spangler / Variety)
- Microsoft missed analyst estimates for the first time since 2016 and posted its slowest earnings growth since 2020. (Jordan Novet / CNBC)
- “Twitter said that it would hold a shareholder meeting to vote on the company’s $44 billion acquisition by Elon Musk on September 13.” (Jonathan Vanian / CNBC)
- Shopify said it would lay off about 1,000 workers, representing 10 percent of its workforce, as e-commerce growth slows dramatically. (Vipal Monga / Wall Street Journal)
- Amazon will raise Prime prices in Europe for the first time, increasing the annual fee by up to 43 percent. (Jeffrey Dastin / Reuters)
- Helium, a highly touted web3 company building a network of wireless devices incentivized with tokens, may be generating as little as $6,500 a month in revenue, according to this tweeted analysis. (Liron Shapira)
- Meta is exploring a potential acquisition of AdHawk Microsystems, which makes eye-tracking technology for augmented and virtual reality headsets. (Mark Gurman and Liana Baker / Bloomberg)
Those good tweets
Talk to me
Send me tips, comments, questions, and COVID misinformation: casey@platformer.news.