Q&A with WhatsApp's Will Cathcart
After a contentious year, candid talk on encryption, privacy, and ProPublica
It has been a contentious year for WhatsApp.
In January, a simple effort to update its terms of service to enable some commerce features triggered a massive backlash in India, helping its rival Signal to double its user base in a month. In May, the Facebook-owned messaging app sued India over new rules issued by the country’s IT ministry that could break end-to-end encryption around the globe. And just last week, a widely read report in ProPublica drew attention to the service’s use of human reviewers to investigate potential violations of WhatsApp’s terms of service, in part by reading the five most recent messages in any reported exchange.
Writing about ProPublica’s story, I took exception to the idea that allowing users to report bad actors is necessarily bad for privacy. (After publication, ProPublica added an editor’s note saying it had altered some language in the story to clarify that enabling users to report abuse does not break encryption.)
A few days later, WhatsApp announced that it would begin letting you encrypt a backup of your messages, preventing anyone who doesn’t have your encryption key (or, alternatively, a password that you set) from reading the contents of any of your messages. (Alex Heath has a nice technical overview of how this works in The Verge.)
All these issues are the purview of Will Cathcart, who took over WhatsApp in March 2019. Cathcart, who joined Facebook in 2010 after a stint at Google, previously oversaw Facebook’s almighty News Feed.
The jobs are very different, but both have involved high-stakes global political battles about speech and democracy. On Monday morning, I caught up with Cathcart over Zoom about privacy, encryption, and the future of messaging. (I also asked him if he ever wished he had an easier job. “I love my job,” he assured me.)
Highlights of our conversation follow. This interview has been lightly edited for clarity and length.
Casey Newton: On Friday you announced that WhatsApp is introducing encrypted backups for its users on Android and iOS. Why?
Will Cathcart: We're always focused on what we can do to protect the privacy of people's messages. People's messages are very sensitive. And the reality is that over time there are growing threats to people's privacy — hackers, hostile governments, criminals. And so we're always looking at how we can add more privacy, especially around the theme of protecting what you say.
We've had end-to-end encryption for five years, which means if you send a message to someone on WhatsApp, we can't see what you sent as it passes through all of our servers. It's completely secure. We think that's really important. But the reality is there are other things we can do to protect people's messages. One is to actually help people's messages not live forever. We launched disappearing messages late last year, because when I talk to you in person, we don't usually keep a transcript of the conversation.
Another area we've been looking at for a while is backups. Many people don't back up their messages, but a lot of people do. And you can opt into backing up to Google Drive or iCloud if you have an Android or an iPhone. And we wanted to see if we could find a way to add to those backups the same level of end-to-end encrypted security that you get when you send a message across WhatsApp.
How do you do it?
That is a hard problem, because if you back up your messages in a truly end-to-end encrypted way, you have to have a password for them, and if you lose your password or your phone, you really have no way to get them back. We don't have a way to help you if you lose them.
So we took a long time figuring out how to do this in a way that we felt would be accessible enough for a lot of people to use. And what we've come up with is two options you can choose. One is you can keep a 64-digit key yourself — you can print it out, you can write it down, or you can try to remember it, but I wouldn't recommend it. Or if that's too intimidating or too hard, which we think it will be for a lot of people, we've come up with a system where we'll store the key for you using hardware security modules, which means we don't have a way to access it. And you can come up with a shorter, easier-to-remember passphrase to get access to it. That, I think, will help make this more accessible for a lot of people.
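To make those two options concrete, here is a minimal sketch. This is not WhatsApp's actual implementation (the real escrow design keeps the key behind hardware security modules on WhatsApp's side); the function names and the toy XOR wrap are invented for illustration:

```python
import os
import hashlib

def make_backup_key() -> bytes:
    # The random 256-bit key that actually encrypts the backup contents.
    return os.urandom(32)

def as_64_digit_key(key: bytes) -> str:
    # Option 1: hand the user the raw key (rendered here as 64 hex
    # characters, standing in for the 64-digit key described above) to
    # print out, write down, or try to remember.
    return key.hex()

def wrap_with_passphrase(key: bytes, passphrase: str) -> dict:
    # Option 2: derive a wrapping key from a shorter passphrase and use
    # it to protect the backup key. In the real design, the escrowed key
    # sits behind hardware security modules that WhatsApp says it cannot
    # read; the XOR wrap below is a toy stand-in for that vault.
    salt = os.urandom(16)
    wrapping_key = hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 600_000)
    return {"salt": salt, "wrapped": bytes(a ^ b for a, b in zip(key, wrapping_key))}

def unwrap_with_passphrase(blob: dict, passphrase: str) -> bytes:
    # Recover the backup key from the passphrase and stored salt.
    wrapping_key = hashlib.pbkdf2_hmac("sha256", passphrase.encode(), blob["salt"], 600_000)
    return bytes(a ^ b for a, b in zip(blob["wrapped"], wrapping_key))
```

The tradeoff Cathcart describes falls out of the pairing: lose both the passphrase and the 64-digit key, and the backup is unrecoverable by design.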
As you mentioned, in recent years we’ve seen stories about state-sponsored hackers attempting to access the WhatsApp messages of government officials, diplomats, journalists, and human rights activists, among others. Are backups part of that story? Are they in the threat model?
Yes, absolutely.
In some of the stories around spyware companies, the most worrying version of this is where they get full access to your phone. But it's absolutely a threat that people could try to get access to your backups. There was just a story in the LA Times a few weeks ago about a predator who was using social engineering to get access to women's backups just to try to look through their photos. There was some horrifying number of people affected by that. The reality is, people have really sensitive stuff in what they say and what they send. We think we've got to look at all the ways that there could be a threat to them, and if we can find an interesting or novel way to protect it, add it.
So on one hand, WhatsApp now offers a stronger degree of protection to users here than some other encrypted messengers, like Apple’s iMessage, which doesn’t encrypt its backups. But WhatsApp came in for criticism last week over the fact that it allows users to report each other, and to include recent messages in the reports they submit. And those reports — and messages — are reviewed by humans. How did that system come about?
We've had the ability for people to report for a long time. And look, we just disagree with the criticism here. If you and I have a private conversation, that's private — [but] that doesn't mean you don't have the right to go complain to someone if I say something harassing, offensive, or dangerous. That's how the real world works: two people can have a private conversation, and then one of them can go ask for help and relay what they were told if they need to. I just think that matches how normal people communicate.
I feel like here we've really hit on how “privacy” seems to be a word that is understood differently by every person using it.
For what it's worth, in this area — I haven't heard people who use WhatsApp tell me they think the idea that we let people report is a problem. I do think there's some really hard questions around privacy and technology and where are the lines, and things like that. But this one isn't something I've seen actual people who use WhatsApp have a lot of concern about.
What are some of the ways that you feel that user reporting benefits WhatsApp?
The clearest high-level way it benefits us is it helps us run the service with less spam. This is a service with 2 billion users. That is a big global system. Unfortunately, there are gonna be some people who try to abuse it — send out spam, send out phishing messages, send out things that make the experience less safe for people. And the fact that people can report is one of the most powerful techniques we have to catch that stuff. We're able to ban millions of accounts a month based on [those reports].
Again, we can't see the messages people send, but we can see when someone reports to us. We think it's okay for you to report a spammer. And then we can use that to ban people and help keep the service safer.
And then there are other, more rare but very meaningful problems to try to work on — for example, the sharing of child exploitative imagery. We think we've found a way to have an end-to-end encrypted system that has the level of security people need for their private messages — but use things like reports, and some of the metadata we have, to ban people who appear to be sharing child exploitative imagery. And in some cases, make actual referrals to the National Center for Missing and Exploited Children. We made something like 400,000 referrals last year, and that's powered by reports. I think that's very consistent with people's model of privacy: if I send you something and you think it's a problem and you want to ask for help, you should be able to.
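For readers puzzling over how reporting coexists with end-to-end encryption, here is a hypothetical sketch of the idea. The names and report format below are mine, not WhatsApp's, and the five-message figure comes from ProPublica's reporting: the reporter's own device already holds decrypted copies of everything it received, so it can forward those copies, and only those, along with the report.

```python
from dataclasses import dataclass

@dataclass
class Message:
    sender: str
    plaintext: str  # already decrypted locally on the recipient's device

def build_report(reported_user: str, local_history: list[Message],
                 n_recent: int = 5) -> dict:
    # The reporter forwards their own plaintext copies of the most recent
    # messages from the reported account. The server sees only what one
    # participant deliberately handed over; no one else's encryption changes.
    recent = [m for m in local_history if m.sender == reported_user][-n_recent:]
    return {
        "reported_user": reported_user,
        "messages": [m.plaintext for m in recent],
    }
```

Nothing in this flow gives the server a way to decrypt traffic it was never sent, which is the distinction Cathcart is drawing.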
When I talked to ProPublica’s president about all this, he said look: at the end of the day, this company is saying that WhatsApp messages are totally private, when in fact in some cases they’re reviewed by humans. Do you think most of your users understand that dynamic, or could you do more there?
I think people get this. There’s not confusion or concern from the people who actually use WhatsApp. Anyone who uses WhatsApp can go in and hit the report button, and it gets used a lot. It's really transparent when you do that, that it's gonna send messages to us. So this whole particular criticism did surprise me.
I wrote last week that WhatsApp’s encryption-plus-reporting approach seemed to be trying to find a workable middle ground in a world where encryption is under threat. The services that provide it are probably going to have to make some sort of concessions to the government. And so how do you maximally protect encryption while also enabling at least some kind of interfacing with law enforcement to catch the worst actors? Is this how you see it?
I think about it a little differently. End-to-end encryption protects all of our users. It protects them by keeping their messages secure, while on top of that, letting people tell us if someone's spamming protects our users. It's usually framed as like, “are you picking privacy or are you picking safety?” I see this as the same thing — end-to-end encryption is one of the most powerful technologies we have to protect people's safety all around the world.
What is making you comfortable that on balance, the benefits of the encryption you provide outweigh any harms that may be caused by people sort of having access to these protected systems?
I would say two things. One is, I just see the trends on all the threats going on around the world. And I think through, years from now, what are the consequences if we don't have secure protection for our data? Especially in liberal societies, in a world where there's hostile governments with a very different worldview about privacy and information?
And two, one thing I find helpful is thinking through real-world analogs. A lot of stuff feels so new that the debates feel very new, but if you actually ask yourself for the real-world equivalents, they're not new.
People have been able to meet in private in person and talk privately for hundreds and hundreds of years, and there's no automatic system keeping a backup of it. There's no automatic system relaying it to a company. And I think that's been a good thing. Sometimes when you look at some of the proposals on breaking encryption, or traceability in India, or scanning every private photo against a database, and you just apply it to “hey, how would you feel about doing this in people's living rooms?” — most people have an instinctive horrified reaction. I think that's telling.
Let’s talk about the current global situation around end-to-end encryption. You’re suing the Indian government over new regulations that would require you to trace the originator of individual messages, and to filter messages based on what they contain. Presumably this would apply to encrypted backups as well. What’s at stake here?
With the IT rules in India, the specific thing those rules would require is for us to build some system [to comply] if someone comes to us and says "Hey, someone said the words 'XYZ.' Tell us who the first person is who said the words XYZ." That's not private. And it undermines the security that end-to-end encryption provides.
I think 10 years from now even more of our lives will be online. Even more of our sensitive data will be online. There will be even more sophisticated hackers, spyware companies, hostile governments, criminals trying to get access to it. And not having the best security means that information is stolen. I think that has real consequences for free society. If journalists' information is being stolen, which we saw in some of the reporting around NSO Group, I think that undermines the free press. If people who want to organize can't communicate in private, I think that undermines their ability to advocate for change.
I think there's a lot of core tenets of democracy and liberalism that actually rely on people being able to have private information.
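For context on what is being asked, here is a hedged sketch of one way traceability mandates are commonly imagined to work, emphatically not something WhatsApp has built. Answering “who first said XYZ” on demand implies keeping a fingerprint of every message ever sent, mapped to its first sender:

```python
import hashlib

# Fingerprint of each exact message text, mapped to its first sender.
first_sender: dict[str, str] = {}

def record(message_text: str, sender: str) -> None:
    digest = hashlib.sha256(message_text.encode()).hexdigest()
    first_sender.setdefault(digest, sender)  # keep only the earliest sender

def trace(message_text: str) -> str | None:
    # The mandated query: tell us who first said these exact words.
    digest = hashlib.sha256(message_text.encode()).hexdigest()
    return first_sender.get(digest)
```

Because an end-to-end encrypted server never sees message text, a table like this could only be populated by weakening the encryption or by making every client report a fingerprint of everything its user sends, which is the objection in a nutshell.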
Say you lose in India. Does that break encryption in WhatsApp globally, or can you contain the fallout to India somehow — and, eventually, to other countries that might adopt similar rules?
You know, I don't have a crystal ball. My hope is that over the next few years, increasingly governments will realize that on balance, the more important thing for them to do is protect their citizens’ data. That the threats are growing, and so their interest in protecting people's security is higher, and therefore they'll be dismissive of what some other countries are asking for. But I don't know.
I want to try to ask it again, though. If India says, “Sorry, Will, you lose on this one, you have to build this awful system” — can the damage be contained to India?
I think that there's a political question and there's a technical question. The way they wrote the rules, and what they've said, is that they only want it to apply to people in India. But I think there's a broader political question.
The more some countries see other countries do it, or push for it, the more that then they want to push for it, too.
Do you ever long for the days when you had an easier job, like running the Facebook News Feed?
(Laughs) I love my job. I get that there are gonna be questions. I get that when we launch things like end-to-end encrypted backups, there are gonna be some people who criticize it. But at the end of the day I just feel so lucky to get to work on something that so many people love and use for stuff that's so important.
The Ratio
Today in news that could change public perception of the big tech companies.
⬆️ Trending up: Twitch is suing two anonymous users for “targeting black and LGBTQIA+ streamers with racist, homophobic, sexist and other harassing content” in so-called “hate raids.” It’s the latest example of Twitch using the legal system to take action against people who try to exploit its service to spread hatred and harass its users. (Cecelia D’Anastasio / Wired)
🔃 Trending sideways: Apple fired senior engineering program manager Ashley Gjøvik for allegedly violating rules against leaking confidential information. Her Twitter account is really something to see. (Zoe Schiffer / The Verge)
⬇️ Trending down: Google may owe more than $100 million in back pay to its contractor work force in 16 countries with pay parity laws. A whistleblower reported the company to the Securities and Exchange Commission after discovering it had been underpaying some workers for years, often in violation of local laws. (Daisuke Wakabayashi / New York Times)
⬇️ Trending down: Facebook inadvertently sent flawed data to misinformation researchers, sullying years of work and potentially putting graduate degrees at risk. A bad mistake that will cause further mistrust between academics and Facebook. Also a black eye for Social Science One, the organization set up to facilitate these data transfers. Up to 30 percent of Social Science One projects were relying on the flawed data. (Davey Alba / New York Times)
⬇️ Trending down: Facebook’s “cross-check” system, designed to provide an additional layer of review to high-profile accounts, effectively created a two-tiered system of justice that regularly allowed VIPs to escape its content moderation regime. More on this tomorrow, but for now read this damning investigation and what the company’s just-departed former head of civic integrity had to say about it. (Jeff Horwitz / Wall Street Journal)
Governing
⭐ Epic appealed Friday’s ruling in its lawsuit against Apple. While the judge ruled that Apple has to let developers link to third-party payment systems, Epic lost on most of its claims. Here’s Kim Lyons at The Verge:
At trial, Epic argued Apple had a monopoly because of how it requires developers to use its payments system for in-game purchases. But Judge Yvonne Gonzalez Rogers ruled Friday that Epic should pay damages to Apple for violating rules around its in-app purchasing system, while undoing Apple’s most restrictive rules on steering customers to alternate payment systems.
Most notably, the judge found that Epic failed to make the case for Apple as a monopoly in the mobile gaming marketplace, which she ultimately found was the relevant market for the company’s claims. “The evidence does suggest that Apple is near the precipice of substantial market power, or monopoly power, with its considerable market share,” Judge Rogers wrote — but said the antitrust claims failed in part “because [Epic] did not focus on this topic.”
More links from the trial:
- The judge’s ruling could cost Apple billions of dollars, but that’s not even 1 percent of its revenue. (Mark Gurman / Bloomberg)
- And on the flip side: it could have huge benefits for a wide range of developers. (Jack Nicas and Kellen Browning / New York Times)
- Apple won’t let Epic Games back into its App Store unless it agrees to play by the rules. (Sam Byford / The Verge)
- Apple’s begrudging, piecemeal changes to the App Store are only further stoking regulatory and customer backlash against the company. (MG Siegler / 500ish)
- Apple should take this opportunity to require in-app purchases for games for security reasons but allow steering to outside payments for other kinds of apps. (Ben Thompson / Stratechery)
President Biden is expected to nominate Georgetown University law professor Alvaro Bedoya to the Federal Trade Commission. Bedoya is known as a privacy hawk who has worked on issues related to mobile location data and facial recognition technology. (Margaret Harding McGill / Axios)
The Federal Election Commission dismissed a complaint against Twitter for temporarily restricting links to a New York Post article about Hunter Biden during the election. “The election commission determined that Twitter’s actions regarding the Hunter Biden article had been undertaken for a valid commercial reason, not a political purpose, and were thus allowable.” (Shane Goldmacher / New York Times)
A Dutch court ruled that Uber drivers are employees, not contractors. Uber says it will appeal. (Anthony Deutsch and Toby Sterling / Reuters)
D.C. Attorney General Karl Racine expanded his antitrust lawsuit against Amazon to include agreements with wholesalers. The amended suit “alleges that Amazon has illegally stifled competition by requiring that first-party sellers, or wholesalers, guarantee the tech giant will make a minimum profit when it buys and resells their goods as its own.” (Cristiano Lima / Washington Post)
US and European Union officials are reportedly making progress on a deal that would allow Facebook and others to continue storing and transferring data across international borders. If no deal comes together, it could represent a further, maybe irreversible splintering of the internet. (Sam Schechner and Daniel Michaels / Wall Street Journal)
US officials are discussing a formal review of whether so-called stablecoins threaten the stability of the US financial system. More good news for the Diem team! (Jesse Hamilton and Saleha Mohsin / Bloomberg)
Google appears to have resumed compliance with at least some government requests in Hong Kong, despite saying it would not cooperate after the passage of a strict national security law there last year. The company reported providing unspecified data to the government in at least three cases. (Selina Cheng / Hong Kong Free Press)
The United Kingdom is considering removing a provision from its data protection regime that grants citizens the right to request human review of some decisions made by algorithms. Supporters of the move say it will make the UK more competitive in the development of AI. (Peter Foster, Madhumita Murgia and Javier Espinoza / Financial Times)
California put the code for its digital vaccine record on GitHub. It’s a great product; here’s hoping other states find it useful. (Rick Klau / California Department of Technology)
Industry
⭐ Apple issued an emergency security update to address a flaw exploited by spyware. A dreaded “zero-click” exploit created by NSO Group and discovered by Citizen Lab could grant an attacker total access to a user’s phone. Update all your Apple devices immediately. Here’s Nicole Perlroth at the New York Times:
Apple’s security team had worked around the clock to develop a fix since Tuesday, after researchers at Citizen Lab, a cybersecurity watchdog organization at the University of Toronto, discovered that a Saudi activist’s iPhone had been infected with an advanced form of spyware from NSO.
The spyware, called Pegasus, used a novel method to invisibly infect Apple devices without victims’ knowledge. Known as a “zero click remote exploit,” it is considered the Holy Grail of surveillance because it allows governments, mercenaries and criminals to secretly break into someone’s device without tipping the victim off.
Apple intends to significantly increase its TV output next year, and will spend more than $500 million marketing it. Anything to erase the memory of Ted Lasso season two. (Jessica Toonkel / The Information)
Related: An analysis of Apple TV+ finds that its original shows are laden with product placement for Apple devices. (Kenny Wassus / Wall Street Journal)
Match Group named Renate Nyborg CEO of Tinder after Jim Lanzone left to run Yahoo. Nyborg is the first woman to run Tinder; among other qualifications, she met her husband on the app. (Jackie Davalos / Bloomberg)
A peer-reviewed study of more than 60,000 Microsoft workers during the first six months of 2020 found that pandemic-related work-from-home demands reduced collaboration and made communication more difficult. I’m surprised it was published in Nature and not the esteemed journal Duh. (Todd Bishop / GeekWire)
YouTube forced a second popular music-playing bot off Discord for violating its agreements with record labels. The real question is how it stayed up long enough to be used by (reportedly!) 560 million people. (Tom Warren / The Verge)
You can now spend $475 for an NFT ticket to an in-real-life party in which you can see the first authorized showing of the NFT that Beeple sold for $69 million earlier this year. “The ticket also includes one drink.” (Jacob Kastrenakes / The Verge)
Talk to me
Send me tips, comments, questions, and encrypted backups: casey@platformer.news.