An overdue idea for making the internet safer just got the funding it needs

How a new initiative from Google, OpenAI, Discord and others could transform trust and safety


Recently, when I’ve written about efforts to make platforms safer, it has been in the context of those efforts seeming to fail. Starting in 2023, platforms laid off so many people working on safety that I wondered whether we had passed peak trust and safety. More recently, Meta’s decision to stop using automated systems that once identified harassment, misinformation, and other harms served as further evidence that platforms are retreating from safety efforts.

As someone who believes that effective content moderation is a prerequisite for building good social products and promoting actual free expression, I’ve been depressed by platforms’ reactionary turn away from investing in it. But then last week, I learned of an initiative that might begin to reverse this slide, and some big platforms are actually supporting it.

It’s called ROOST, which stands for Robust Open Online Safety Tools, and it launched at the Artificial Intelligence Action Summit in Paris today with $27 million in funding from Google, OpenAI, Discord, Roblox, and others. It’s a bid to revolutionize trust and safety with open-source tools, in the same way that cybersecurity was transformed in the 1980s and ’90s, and the commitments it has received from platforms to date suggest that it could well succeed.

I.

In 2018, Twitter acquired a four-year-old startup named Smyte for an undisclosed sum.

Smyte, which was founded by engineers who worked at Google and Instagram, pitched itself as “trust and safety as a service.” Clients like Zendesk, Indiegogo, and GoFundMe plugged into its API, and Smyte scanned the content uploaded to their services for abuse, harassment, spam, and other harms.

Around that time, Silicon Valley was learning that any platform that allows users to upload text or images has a need for something like Smyte. But few third-party tools were available, and companies were often reluctant to build these services themselves: they typically lie outside platforms’ core expertise, and require engaging with the worst that humanity has to offer.

And so it made all the sense in the world for Twitter to acquire Smyte. But the move was terrible for Smyte’s customers: Twitter shut off clients’ access to the service 30 minutes after announcing the acquisition, according to TechCrunch. And those companies had to once again scramble to find a solution.

Big platforms like Google and Meta had more money and talent to throw at the problem. But the industry had not yet decided on a standard set of tools and services to build. The result was that trust and safety engineers would often find themselves asked to build the same tool over and over again at different jobs, often only half-finishing it before being reassigned.

“When we interviewed trust and safety engineers and product managers for this project, most shared their frustration with having to build crucial safety tools from scratch at every company, as there are no reliable open source components to start from,” said Juliet Shen, a former product leader at Snap, Grindr, and Google who is now ROOST’s head of product. “We've heard this story many times, and I've lived it: whether it's home brewed or a custom-tuned vendor solution, teams … often end up at the bottom of the metaphorical hill again if they change teams or move companies.”

The cybersecurity field followed a similar trajectory. In the 1980s and early 1990s, most companies built their own security solutions from scratch. But the ’90s also saw the rise of Linux: an open-source operating system that companies could deploy and modify however they wanted, for free.

Within a few years, a series of popular open-source security tools had been released, like Snort for detecting network intrusions and OpenSSL for encrypting data in transit.

At first, it seemed counterintuitive for companies to adopt open-source security tools: hackers could just as easily read the source code as anyone else. But in practice, having lots of eyes on security software has mostly made us safer: vulnerabilities are identified faster, patches are applied sooner, and the low cost means that more companies use it.

A handful of trust and safety tools have been available across the industry, such as PhotoDNA, a tool from Microsoft that helps platforms identify child sexual abuse material (CSAM); or Meta’s Hasher-Matcher-Actioner tool for identifying copies of violating posts. For the most part, though, platforms have been on their own.
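Tools in this family generally work by computing a perceptual hash of each upload and comparing it against a database of hashes of known violating content. Here is a minimal sketch of that hash-match-action pattern, using the open-source imagehash library’s pHash as a stand-in for the production hashes these tools actually rely on; the hash list, threshold, and function name below are illustrative assumptions, not part of either tool.

```python
# Sketch of the "hasher-matcher-actioner" pattern using the imagehash library.
import imagehash
from PIL import Image

# Hypothetical hashes of known violating images; in practice these would
# come from a shared industry hash database, not be hardcoded.
KNOWN_BAD_HASHES = [imagehash.hex_to_hash("d879f8f89b1bbbb2")]

MATCH_THRESHOLD = 8  # max Hamming distance to count as a match (assumption)

def check_upload(path: str) -> bool:
    """Hash an uploaded image and compare it against known-bad hashes."""
    upload_hash = imagehash.phash(Image.open(path))
    for bad_hash in KNOWN_BAD_HASHES:
        # Subtracting two ImageHash objects returns their Hamming distance,
        # so near-duplicates (crops, re-encodes) can still match.
        if upload_hash - bad_hash <= MATCH_THRESHOLD:
            return True  # the "actioner" step: remove, report, or queue for review
    return False
```

The appeal of the pattern is that platforms never need to store the violating content itself, only its hashes, and fuzzy matching catches lightly modified copies that exact checksums would miss.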

Clint Smith felt that need acutely. Smith is the chief legal officer of Discord, which had been a Smyte customer before Twitter shut the service down. Suddenly, the company had to build its own safety stack from scratch.

“The things we needed to build had already been built by Snap, Reddit, and other companies,” Smith said. “But all of them were proprietary implementations done by those companies’ engineering teams. We asked ourselves, why can’t we have a common set of tools?”

II.

ROOST represents the most significant effort to date to build that common set of tools. 

It’s a nonprofit organization that will build trust and safety tools and make them freely available to its partners. Its partners, in turn, have agreed to use them and contribute code back into the ecosystem. 

To start with, ROOST is tackling three categories of tools. One set will identify, remove, and report CSAM. Another will build classifiers to identify harms in various mediums: Roblox has already contributed an open-source model for detecting abuse, racism, and sexual content in audio clips. The final category is pure infrastructure: consoles for content moderators to review posts, for example, complete with wellness features for reducing the mental health burden of doing the work.
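For a sense of what plugging into one of those open-source classifiers could look like, here is a brief hypothetical sketch using Hugging Face’s transformers library; the model identifier, labels, and threshold are placeholders rather than real ROOST artifacts.

```python
# Sketch of running an open-source audio-abuse classifier over a clip.
from transformers import pipeline

classifier = pipeline(
    "audio-classification",
    model="example-org/audio-safety-classifier",  # hypothetical model name
)

# Score a short clip; a real integration would chunk longer audio streams.
results = classifier("voice_chat_clip.wav")
for result in results:
    # Flag the clip for human review if any harm label scores highly.
    if result["label"] != "safe" and result["score"] > 0.8:
        print(f"flag for review: {result['label']} ({result['score']:.2f})")
```

The point of ROOST contributing models like Roblox’s is that a small platform could wire up something like this in an afternoon, instead of spending months collecting training data and building a classifier of its own.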

ROOST “addresses a critical need to accelerate innovation in online child safety and AI by providing small companies and nonprofits access to technologies they currently lack,” said Eric Schmidt, the former Google CEO and founding partner of the organization, in a statement. 

And at a time when even the notion of content moderation is under deep suspicion — and legal threat — ROOST is not creating a set of open-source content guidelines. The nonprofit is only making tools; platforms still have to come up with the rules themselves. 

It will take time for this effort to begin bearing fruit. And as impressive as ROOST’s list of partners is, it’s notable how many social platforms aren’t on it: Meta, Reddit, and Snap among them. 

But open source took time to transform cybersecurity as well — and did so largely without the benefit of multimillion-dollar industry partnerships. 

Smith told me ROOST is an effort to get beyond white papers and start getting tools into people’s hands.  

“ROOST is here to develop and deliver tangible products, to produce and deploy software, and to keep real people safe,” he said. “In a world full of PDFs, we want to be shipping code into production.”


Elsewhere in content moderation: Discord introduced a feature to let users ‘ignore’ other users. (Anna Washenko / Engadget)


Those good posts

For more good posts every day, follow Casey’s Instagram stories.


Talk to us

Send us tips, comments, questions, and open-source safety tools: casey@platformer.news. Read our ethics policy here.