
After The US Election, Key People Are Leaving Facebook And Torching The Company In Departure Notes

A departing Facebook employee said the social network's failure to act on hate speech “makes it embarrassing to work here.”

On Wednesday, a Facebook data scientist departed the social networking company after a two-year stint, leaving a farewell note for their colleagues to ponder. As part of a team focused on “Violence and Incitement,” they had dealt with some of the worst content on Facebook, and they were proud of their work at the company.

Despite this, they said Facebook was simply not doing enough.

“With so many internal forces propping up the production of hateful and violent content, the task of stopping hate and violence on Facebook starts to feel even more sisyphean than it already is,” the employee wrote in their “badge post,” a traditional farewell note for any departing Facebook employee. “It also makes it embarrassing to work here.”

The departing employee declined to speak with BuzzFeed News but asked that they not be named for fear of abuse and reprisal.

Using internal Facebook data and projections to support their points, the data scientist said in their post that roughly 1 of every 1,000 pieces of content — or 5 million of the 5 billion pieces of content posted to the social network daily — violates the company’s rules on hate speech. More stunning, they estimated using the company’s own figures that, even with artificial intelligence and third-party moderators, the company was “deleting less than 5% of all of the hate speech posted to Facebook.” (After this article was published, Facebook VP of integrity Guy Rosen disputed the calculation, saying it "incorrectly compares views and content." The employee addressed this in their post and said it did not change the conclusion.)

The sentiments expressed in the badge post are hardly new. Since May, a number of Facebook employees have quit, saying they were ashamed of the impact the company was having on the world or worried that the company’s inaction in moderating hate and misinformation had led to political interference, division, and bloodshed. Another employee was fired for documenting instances of preferential treatment of influential conservative pages that repeatedly spread false information.

"While we don’t comment on individual employee matters, this is a false characterization of events," Facebook spokesperson Joe Osborne said of this incident.

But in just the past few weeks, at least four people involved in critical integrity work related to reducing violence and incitement, crafting policy to reduce hate speech, and tracking content that breaks Facebook’s rules have left the company. In farewell posts obtained by BuzzFeed News, each person expressed concerns about the company’s approach to handling US political content and hate speech, and called out Facebook leadership for its unwillingness to be more proactive about reducing hate, incitement, and false content.

“We have made massive integrity investments over the past four years, including substantially growing the team that works to keep the platform safe, improving our ability to find and take down hate speech, and protecting the US 2020 elections," said Osborne. "We’re proud of this work and will continue to invest and build on learnings moving forward.”

The departures come as Facebook’s “election integrity” effort is undergoing major changes in the wake of the 2020 US election. The Information recently reported that the company’s civic integrity team — which was charged with "helping to protect the democratic process" and reducing "the spread of viral misinformation and fake accounts" — was recently disbanded as a stand-alone unit. It also reported that a proposal from the company’s integrity teams to throttle the distribution of false and misleading election content from prominent political accounts, like President Donald Trump’s, was shot down by company leadership.

"We continue to have teams and people dedicated to elections work, and will expand on the work of our civic team to other focus areas for the company," Osborne said about changes to the civc integrity team. "Integrity touches everything we do, which is why it’s important to grow and better integrate the teams doing this work across the organization."

As Facebook restructures its integrity team, some employees leaving say they do not have confidence in the company to fix or even temper its problems. While Facebook publicly claims its artificial intelligence measures catch a lot of offending content before it’s reported by users, there have been manifold examples where that system failed, leading to disastrous real-life consequences like a Facebook-organized militia event that coincided with the shooting deaths of two protestors in Kenosha, Wisconsin.

“AI will not save us,” wrote Nick Inzucchi, a civic integrity product designer who quit last week. “The implicit vision guiding most of our integrity work today is one where all human discourse is overseen by perfect, fair, omniscient robots owned by [CEO] Mark Zuckerberg. This is clearly a dystopia, but one so deeply ingrained we hardly notice it any more.”

The departing data scientist expressed similar concerns. “Our current approach to automation is not going to solve most of our integrity problems,” they wrote.

Their post also argued, with data, that Facebook’s “very apparent interest in propping up actors who are fanning the flames of the very fire we are trying to put out” makes it impossible for people to do their jobs. “How is an outsider supposed to believe that we care whatsoever about getting rid of hate speech when it is clear to anyone that we are propping it up?” they asked.

Using data from a Facebook tool called the “Hate Bait dashboard,” which can track content from groups and pages that leads to hateful interactions, the data scientist listed the 10 US pages with the “largest concentrated volume of likely violating Hate Speech comments” in the past 14 days. All were pages associated with conservative outlets or personalities, including Breitbart News, Fox News, the Daily Caller, Donald Trump’s campaign and main account, and Ben Shapiro. They also shared a sample of the hateful comments posted on a recent Breitbart News post about Nancy Pelosi’s support for transgender athletes.

The Hate Bait Dashboard

[Graphic: BuzzFeed News' recreation of the data scientist’s post, based on the company’s “Hate Bait dashboard,” showing the top 10 US pages that led to the most hateful interactions over a recent 14-day period.]
“I can’t overemphasize that this is a completely average run-of-the-mill post for Breitbart or any of the other top Hate Bait producers,” they wrote, referring to comments that called for the assault or killing of trans individuals.

“They all create dozens or hundreds of posts like this [a] day, each eliciting endless volumes of hateful vile comments—and we reward them fantastically for it,” they wrote, emphasizing their own words.

The data scientist’s comments about preferential treatment of US conservative pages and voices follow other internal concerns and evidence. In November, a departing member of Facebook’s policy organization wrote that “right-leaning US accounts are far more likely than others to engage in hate speech and violence incitement and far more likely to spread misinformation.” The person declined to speak with BuzzFeed News and asked that their name not be published for fear of retaliation.

“Very conservative and conservative users shared more than seven times as much misinformation on the platform in a given period as did moderate, liberal, or very liberal users,” they wrote, linking to a news article about a recent study to support their claim.

Another data scientist, who had spent more than five years at Facebook as a core data scientist, also quit the company this week. They asked BuzzFeed News not to name them for fear of reprisal, but echoed their colleagues' concerns in their farewell post.

“I think Facebook probably has a net negative effect on the quality of political discourse, at least in Western countries,” they wrote. They described a randomized trial that found that “paying people to stop using Facebook for a month caused a decline in negative partisan feelings by around 0.1 [standard deviation].”

The core data scientist also said they had seen “a dozen proposals to measure the objective quality of content on News Feed diluted or killed because … they have a disproportionate impact across the US political spectrum, typically harming conservative content more.” Beyond that, they added that Facebook’s content policy decisions are “routinely influenced by political considerations” to ensure the company “avoids antagonizing powerful political players.”

The other departing data scientist said their team’s incremental changes can improve things on the margin but lack any major impact given Facebook’s scale. For example, they cited a colleague who was able to detect and prevent an additional 500,000 views of English language hate content in the US a day. But they pointed to Trump’s infamous “When the looting starts, the shooting starts” post as an example of how the team’s wins were eclipsed by Facebook’s unwillingness to take action against prominent purveyors of hate speech or violent incitement.

“Trump’s ‘Looting and Shooting’ post was viewed orders of magnitude of times more than the total number of views that we prevent in a day,” they wrote, citing the president’s May post, which was not taken down by the company despite its suggestion that those protesting the police killing of George Floyd be shot. “It’s not hard to draw a straight line from that post to the actual shooting that took place at protests in the months that followed.”

Despite their pointed criticism, the data scientist commended what they said was an improvement of Facebook’s “real-time monitoring,” which was apparently able to prevent “a number of events that might have ended up being similar to Kenosha.” They said the company enacted dozens of interventions that they believe led to a lower-than-expected amount of violence and incitement reports during and following Election Day. For context, the data scientist shared a graph that showed user reports for violence and incitement peaked at around 50,000 in the days after the election, while the top day during the height of protests in the wake of the police killing of George Floyd saw more than 125,000 reports.

BuzzFeed News previously reported the existence of a “violence and incitement trends” metric that jumped 45% over five days in the period after Election Day as misinformation spread about the totaling of votes in the recent presidential election.

The data scientist also offered parting advice to Facebook: “hire more people.” Noting that the company could afford it, they suggested doubling the number of employees in the integrity organization, which they said would allow the company to expand beyond US-focused and English-language content, a concern another former data scientist had also raised.

The four departures and their badge posts triggered a discussion on Facebook’s internal message boards this week, with employees concerned that they could kick off a wider trend. “To what extent are integrity leadership and FB INC leadership hearing and responding to these criticisms?” one person posted to an internal group called “Let’s Fix Facebook.”

Internal survey data from more than 49,000 Facebook employees first reported by BuzzFeed News last month showed that only 51% of respondents said they believed Facebook was having a positive impact on the world. A question about the company’s leadership saw only 56% of employees offer a favorable response.

On Thursday, Zuckerberg detailed some of the turnover and subsequent hiring at the company. In a company-wide meeting, he said he was proud of employees' ability to adapt to the new normal of working from home, particularly for those people who joined recently and have yet to see the inside of a Facebook office.

"I think we now have 20,000 employees who've never been in our office because they joined this year," he said to the more than 50,000 employees at the company. The Facebook chief also said he expects some people to start coming back to offices next year, though the company would not make a COVID-19 vaccine mandatory for returning workers.

As morale inside Facebook plummets, leaders are growing more concerned about employee discussions on the company’s internal forums. Facebook was once known for its open culture, but the company and Zuckerberg, who are staring down landmark antitrust lawsuits from 48 attorneys general and the Federal Trade Commission, repeatedly advised employees this week to avoid discussing anything related to the ongoing litigation, given the possibility of legal discovery.

Employees have also been required to take online “Competition Training” courses to understand the company’s “Competition Compliance Policy.” The training instructs people who are discussing “complicated issues” to meet in person or over videoconference, while reminding them that others may have access to their communications.

Facebook was quick to shield the departing data scientist’s post from outside eyes. The data scientist had initially published their writing in a Google Document to create “an additional layer of security” and keep it from reaching this news outlet, but an internal moderator asked them to take the post off Google’s tools.

“Your badge post just got flagged for having confidential information in it,” the moderator wrote to the data scientist.
