An overdue idea for making the internet safer just got the funding it needs
How a new initiative from Google, OpenAI, Discord and others could transform trust and safety
![An overdue idea for making the internet safer just got the funding it needs](/content/images/size/w1200/2025/02/image.webp)
Recently when I’ve written about efforts to make platforms safer, it has been in the context of those efforts seeming to fail. Starting in 2023, platforms laid off so many people working on safety that I wondered whether we had passed peak trust and safety. More recently, Meta’s decision to stop using automated systems that once identified harassment, misinformation, and other harms served as further evidence that platforms are retreating from safety efforts.
As someone who believes that effective content moderation is a prerequisite for building good social products and promoting actual free expression, I’ve been depressed by platforms’ reactionary turn away from moderation. But then last week, I learned of an initiative that might actually begin to reverse this slide — and some big platforms are actually supporting it.
It’s called ROOST — that stands for Robust Open Online Safety Tools — and it launched at the Artificial Intelligence Action Summit in Paris today with $27 million in funding from Google, OpenAI, Discord, Roblox and others. It’s a bid to revolutionize trust and safety with open-source tools, much as open source transformed cybersecurity in the 1980s and ’90s — and the commitments it has received from platforms to date suggest that it could well succeed.
I.
In 2018, Twitter acquired a four-year-old startup named Smyte for an undisclosed sum.
Smyte, which was founded by engineers who worked at Google and Instagram, pitched itself as “trust and safety as a service.” Clients like ZenDesk, IndieGogo, and GoFundMe plugged into its API, and Smyte scanned the content uploaded to their services for abuse, harassment, spam, and other harms.
Around that time, Silicon Valley was learning that any platform that allows users to upload text or images has a need for something like Smyte. But few third-party tools were available, and companies were often reluctant to build these services themselves: they typically lay outside platforms’ core expertise, and require engaging with the worst that humanity has to offer.
And so it made all the sense in the world for Twitter to acquire Smyte. But the move was terrible for Smyte’s customers: Twitter shut off clients’ access to the service 30 minutes after announcing the acquisition, according to TechCrunch. And those companies had to once again scramble to find a solution.
Big platforms like Google and Meta had more money and talent to throw at the problem. But the industry had not yet decided on a standard set of tools and services to build. The result was that trust and safety engineers would often find themselves asked to build the same tool over and over again at different jobs, often only half-finishing it before being reassigned.
“When we interviewed trust and safety engineers and product managers for this project, most shared their frustration with having to build crucial safety tools from scratch at every company, as there are no reliable open source components to start from,” said Juliet Shen, a former product leader at Snap, Grindr, and Google who is now ROOST’s head of product. “We've heard this story many times, and I've lived it: whether it's home brewed or a custom-tuned vendor solution, teams … often end up at the bottom of the metaphorical hill again if they change teams or move companies.”
The cybersecurity field followed a similar trajectory. In the 1980s and early 1990s, most companies built their own security solutions from scratch. But the ‘90s also saw the rise of Linux: an open-source operating system that companies could implement and modify however they wanted to, for free.
Within a few years, a series of popular open-source security tools had been released, like Snort, for detecting network intrusions, and OpenSSL, for encrypting data in transit.
At first, it seemed counterintuitive for companies to adopt open-source security tools: hackers could just as easily read the source code as anyone else. But in practice, having lots of eyes on security software has mostly made us safer: vulnerabilities are identified more quickly, patches can be applied more quickly, and the low cost means that more companies use it.
A handful of trust and safety tools have been available across the industry, such as PhotoDNA, a tool from Microsoft that helps platforms identify child sexual abuse material; or Meta’s Hasher-Matcher-Actioner tool for identifying copies of violating posts. For the most part, though, platforms have been on their own.
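The core idea behind tools like PhotoDNA and Hasher-Matcher-Actioner is hash matching: hash known violating content once, then check new uploads against that list instead of re-reviewing them. Here is a minimal sketch of that pattern — note that PhotoDNA and similar systems use perceptual hashes that survive resizing and re-encoding, while this illustration uses a plain cryptographic hash, which only catches exact copies:

```python
import hashlib

# Hashes of content a moderator has already identified as violating.
# (Illustrative only: real systems use perceptual hashes, not SHA-256.)
known_bad_hashes = {
    hashlib.sha256(b"example violating content").hexdigest(),
}

def is_known_violation(content: bytes) -> bool:
    """Return True if this upload exactly matches previously flagged content."""
    return hashlib.sha256(content).hexdigest() in known_bad_hashes

# An exact re-upload is caught; any modification evades an exact hash,
# which is why perceptual hashing matters in practice.
assert is_known_violation(b"example violating content")
assert not is_known_violation(b"example violating content, slightly edited")
```

The appeal of sharing these hash lists across platforms is obvious: once one company identifies a piece of CSAM or terrorist content, every participating platform can block re-uploads without ever hosting or re-reviewing it.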
Clint Smith felt that need acutely. Smith is the chief legal officer of Discord, which had been a Smyte customer before Twitter shut the service down. Suddenly, the company had to build its own safety stack from scratch.
“The things we needed to build had already been built by Snap, Reddit, and other companies,” Smith said. “But all of them were proprietary implementations done by those companies’ engineering teams. We asked ourselves, why can’t we have a common set of tools?”
II.
ROOST represents the most significant effort to date to build that common set of tools.
It’s a nonprofit organization that will build trust and safety tools and make them freely available to its partners. Its partners, in turn, have agreed to use them and contribute code back into the ecosystem.
To start with, ROOST is tackling three categories of tools. One set will identify, remove, and report CSAM. Another will build classifiers to identify harms in various mediums: Roblox has already contributed an open-source model for detecting abuse, racism, and sexual content in audio clips. The final category is pure infrastructure: for example, consoles for content moderators to review posts, with wellness features that reduce the mental health burden of doing the work.
ROOST “addresses a critical need to accelerate innovation in online child safety and AI by providing small companies and nonprofits access to technologies they currently lack,” said Eric Schmidt, the former Google CEO and founding partner of the organization, in a statement.
And at a time when even the notion of content moderation is under deep suspicion — and legal threat — ROOST is not creating a set of open-source content guidelines. The nonprofit is only making tools; platforms still have to come up with the rules themselves.
It will take time for this effort to begin bearing fruit. And as impressive as ROOST’s list of partners is, it’s notable how many social platforms aren’t on it: Meta, Reddit, and Snap among them.
But open source took time to transform cybersecurity as well — and did so largely without the benefit of multimillion-dollar industry partnerships.
Smith told me ROOST is an effort to get beyond white papers and start getting tools into people’s hands.
“ROOST is here to develop and deliver tangible products, to produce and deploy software, and to keep real people safe,” he said. “In a world full of PDFs, we want to be shipping code into production.”
Elsewhere in content moderation: Discord introduced a feature to let users ‘ignore’ other users. (Anna Washenko / Engadget)
![](https://www.platformer.news/content/images/2024/05/floating_linebreak_600px-1.png)
The DOGE coup
- DOGE staffers’ moves to shut down the Consumer Financial Protection Bureau come amid Elon Musk’s plans for a digital wallet — and the threat that the CFPB would oversee that effort. The CFPB was created by Congress and cannot legally be dissolved by an executive order. (Jason Leopold and Evan Weinberger / Bloomberg)
- DOGE is reportedly developing an AI chatbot named “GSAi” for the US General Services Administration. (Paresh Dave, Zoë Schiffer and Makena Kelly / Wired)
- Musk said he will rehire Marko Elez, a DOGE staffer who resigned after a deleted X account advocating for racism and eugenics resurfaced, after Donald Trump and JD Vance endorsed the idea. (Dan Mangan and Kevin Breuninger / CNBC)
- A teen DOGE staffer, Edward Coristine, was fired from an internship at a cybersecurity firm after being accused of leaking proprietary information. (Jason Leopold, Margi Murphy, Sophie Alexander, Jake Bleiberg and Anthony Cormier / Bloomberg)
![](https://www.platformer.news/content/images/2024/05/floating_linebreak_600px-1.png)
Governing
- Trump has reportedly put JD Vance and national security adviser Michael Waltz in charge of overseeing a potential TikTok sale. (Ben Brody, John Bresnahan and Jake Sherman / Punchbowl News)
- Musk said he isn't interested in buying TikTok. (Shelly Banjo / Bloomberg)
- Oracle, Amazon and Microsoft have reportedly indicated interest in a deal to buy TikTok, while TikTok is pushing instead for a plan that it says would wall off American users’ data. (Jessica Toonkel, Dana Mattioli, Alex Leary and Josh Dawsey / Wall Street Journal)
- A look at how Sam Altman has managed to curry favor with Trump despite his beef with Musk. (Cecilia Kang and Cade Metz / New York Times)
- Trump resolved his legal fight against X over his ban after the Jan. 6 insurrection, but shared no details. Maybe it had something to do with X's owner spending hundreds of millions of dollars to get him elected president. (Zoe Tillman / Bloomberg)
- Mentions of trans children were removed from The National Center for Missing and Exploited Children’s website. Reports say the center was ordered by the government to remove LGBTQ+ references or risk losing funding. This is egregious given how many LGBTQ youth suffer from online predation; they actually used those resources. (Adi Robertson / The Verge)
- An investigation into how ad systems from Google, Amazon and Microsoft have funneled money to a website hosting images of child sex abuse. (Thomas Germain / BBC)
- Meta torrented at least 35.7 terabytes of data from pirated books on Z-Library and LibGen after torrenting 80.6 terabytes of data from LibGen previously, new unredacted emails show. From time to time we are reminded why so many people root against this company, and this is one of those times! (Ashley Belanger / Ars Technica)
- Meta is reportedly reducing its privacy teams’ ability to slow product releases, citing delayed launches and high-profile breaches. (Kalley Huang / The Information)
- Apple’s lawsuit against a former software engineer for leaking confidential information to the press was dismissed after a settlement agreement. The engineer also apologized in a post on X. (Ryan Christoffel / 9to5Mac)
- The majority of people in the United Kingdom would support stricter regulation of AI, including a law that would require AI companies to prove their systems are safe before release, a new survey showed. (Billy Perrigo / Time)
- UK security officials have reportedly ordered Apple to allow them access to encrypted content that any Apple user worldwide has uploaded to the cloud. This is awful — and if successful, the beginning of the end for end-to-end encryption. (Joseph Menn / Washington Post)
- The order comes under the “Snoopers’ Charter,” a controversial surveillance law that allows law enforcement to get data that is otherwise inaccessible even to Apple. (Tim Bradshaw, Lucy Fisher and John Paul Rathbone / Financial Times)
- X must let researchers use its information to track the spread of misinformation and disinformation ahead of Germany’s national election this month, a German court ruled. (Thomas Escritt / Reuters)
- French prosecutors have opened an investigation into X after allegations that the platform’s biased algorithm distorted the operation of an automated data processing system. (Reuters)
- French president Emmanuel Macron announced that the country will invest €109 billion into AI over the next several years. (Melissa Heikkilä, Leila Abboud and Antoine Gara / Financial Times)
- The French government is also reportedly pledging a gigawatt of nuclear power for a new AI project expected to cost tens of billions of dollars. (Sam Schechner and Asa Fitch / Wall Street Journal)
- The Japanese government has reportedly asked Apple and Google to block five unregistered crypto exchange apps – Bybit, MEXC Global, LBank Exchange, KuCoin and Bitget. (James Hunt / The Block)
- A look at how Google Maps has struggled to accurately map roads in India and how it has been blamed for causing fatal accidents — though experts suggest the issue isn’t specific to Maps. (Ananya Bhattacharya / Rest of World)
![](https://www.platformer.news/content/images/2024/05/floating_linebreak_600px-1.png)
Industry
- Musk is leading a group of investors in a bid to buy the nonprofit controlling OpenAI for $97.4 billion. Altman replied on X: “no thank you but we will buy twitter for $9.74 billion if you want.” Hard to tell if this is just legal maneuvering or the start of something significant. (Jessica Toonkel and Berber Jin / Wall Street Journal)
- ChatGPT’s new Deep Research feature can accurately summarize recent court rulings but ignores key legal background, this test found. (Adi Robertson / The Verge)
- ChatGPT users will now see part of the chatbot’s thought process as it answers questions, in response to the popularity of DeepSeek's chain of thought previews, OpenAI said. (Kyle Wiggers / TechCrunch)
- OpenAI is reportedly finalizing its design for its first in-house chip in the next few months and sending it for fabrication at TSMC. (Anna Tong, Max A. Cherney and Krystal Hu / Reuters)
- A look at OpenAI’s $14 million Super Bowl ad, which deliberately avoided mentioning AGI or superintelligence. It was ... fine? (Kylie Robison / The Verge)
- Sam Altman makes a case for exponentially increasing investment to develop AGI and offers a view of how the tech could change society. (Sam Altman)
- Former OpenAI chief scientist Ilya Sutskever’s startup, Safe Superintelligence, is reportedly in talks for a funding round that would value it at at least $20 billion. Usually you have to release a product to be worth $20 billion. But you know. (Kenrick Cai, Krystal Hu and Anna Tong / Reuters)
- General Catalyst is reportedly in talks to join Menlo Ventures and MGX in Anthropic’s latest funding round, which is set to exceed $2 billion. (Katie Roof, Shirin Ghaffary and Kate Clark / Bloomberg)
- AI is typically used more as a collaborator than an autonomous helper in the workplace, Anthropic’s Economic Index suggests. But will the percentages hold? (Scott Rosenberg / Axios)
- A look at how DeepSeek cites news sources in the absence of licensing agreements with major publishers. (Andrew Deck / NiemanLab)
- DeepSeek is more likely than its American competitors to produce instructions for potentially dangerous things, AI safety experts say. But I thought open source was always safer? (Sam Schechner / Wall Street Journal)
- The rise of DeepSeek is showing how some of China’s top AI talent is choosing to stay home amid visa obstacles and high living expenses. (Viola Zhou / Rest of World)
- DeepSeek’s claim that it spent under $6 million to develop its AI model is “exaggerated and a little bit misleading,” Google DeepMind CEO Demis Hassabis said. (Yazhou Sun and Tom Mackenzie / Bloomberg)
- TikTok encouraged US Android users to sideload its app, as it has yet to reappear on the Google Play Store and Apple App Store. (Mariella Moon / Engadget)
- Meta is partnering with UNESCO for the Language Technology Partner Program, which it says will collect speech recordings and transcriptions to help develop openly available AI. (Kyle Wiggers / TechCrunch)
- Meta has developed a brain-typing system that can interpret thoughts with up to 80 percent accuracy, just not for commercial use. (Antonio Regalado / MIT Technology Review)
- Significant cultural events like Pride Month and Black History Month are no longer highlighted by default on Google Calendar. Google said manually adding events “wasn’t scalable or sustainable.” It just happened to notice this the moment Trump took office. (Jay Peters / The Verge)
- Google DeepMind said its AlphaGeometry2 system can solve 84 percent of all geometry problems in the International Mathematical Olympiad over the last two decades and outperform the average gold medalist. (Kyle Wiggers / TechCrunch)
- A look at the biggest podcasts Spotify is missing in its push towards video. (Ashley Carman / Bloomberg)
- Tinder is planning on rolling out AI-powered features for matching amid a decline in users. Not sure filling the app with slop is going to rekindle the magic. (Sarah Perez / TechCrunch)
- G42-backed AI chip firm Cerebras Systems has partnered with French AI startup Mistral to achieve a speed record in chatbot responses, it said. (Stephen Nellis / Reuters)
- A look at how GoFundMe raised more than $250 million for victims of the Los Angeles fires and how the company is addressing criticism over fairness and concerns about racial disparities. (Ken Bensinger and Jeremy Singer-Vine / New York Times)
- Big Tech companies are set to spend more than $320 billion in AI investments this year to build data centers with specialized chips. (Stephen Morris and Rafe Uddin / Financial Times)
- Auction house Christie’s is holding its first show with only works created by AI. Nobody tell Ted Chiang! (Kyle Wiggers / TechCrunch)
- Unemployment in the IT sector rose to 5.7 percent in January, above the overall rate of 4 percent, in a sign that AI is beginning to replace workers. (Belle Lin / Wall Street Journal)
- A look at the difficult decisions over how much authors should get paid to license their books to AI companies. Microsoft offered $5,000 per title, which isn't bad, but the publisher keeps half. (Alice Robb / Bloomberg)
![](https://www.platformer.news/content/images/2024/05/floating_linebreak_600px-1.png)
Those good posts
For more good posts every day, follow Casey’s Instagram stories.
![](https://www.platformer.news/content/images/2025/02/Screenshot-2025-02-10-at-3.29.57-PM.png)
(Link)
![](https://www.platformer.news/content/images/2025/02/Screenshot-2025-02-10-at-3.32.43-PM.png)
(Link)
![](https://www.platformer.news/content/images/2025/02/Screenshot-2025-02-10-at-3.33.08-PM.png)
(Link)
![](https://www.platformer.news/content/images/2025/02/Screenshot-2025-02-10-at-3.33.33-PM.png)
(Link)
![](https://www.platformer.news/content/images/2025/02/Screenshot-2025-02-10-at-3.31.47-PM.png)
(Link)
![](https://www.platformer.news/content/images/2024/05/floating_linebreak_600px-1.png)
Talk to us
Send us tips, comments, questions, and open-source safety tools: casey@platformer.news. Read our ethics policy here.