A charity's report indicates strong public disapproval of Meta's changes aimed at increasing free expression.
A recent report from the Molly Rose Foundation highlights significant public disapproval regarding Meta's recent relaxation of content moderation policies.
The foundation's study, which surveyed over 2,000 adults, found that 86% believe social media platforms should be legally obligated to proactively search for harmful content.
In January 2025, Mark Zuckerberg, Meta's CEO, announced these policy changes as part of a broader strategy to enhance free expression by reducing what he described as excessive censorship.
As a consequence of these changes, Meta intends to limit proactive scanning for harmful content, shifting the responsibility primarily to users to report such content.
The report specifically addresses the shift in content moderation, revealing that 71% of respondents oppose Meta's decision to lessen automatic content removals.
The Molly Rose Foundation was established following the death of 14-year-old Molly Russell in November 2017, after she had encountered suicide and self-harm content online.
In light of Meta's policy adjustments, the charity has raised concerns about the potential risks to young users, warning that the changes could increase the likelihood of encountering harmful material.
The foundation has called upon the government to fortify the Online Safety Act, advocating for measures that prevent social media companies from enacting similar policy changes.
Andy Burrows, CEO of the Molly Rose Foundation, expressed concerns that Zuckerberg's changes could pose significant risks to children and young people, increasing their vulnerability to suicide, self-harm, and depression.
He emphasized the need for decisive regulatory action, stating that any weakness in the regulatory response could have life-threatening consequences.
He urged that decisions regarding the safety of children on online platforms should originate from democratically elected officials rather than private tech executives.
Burrows stressed that the public expects urgent action to ensure online safety measures do not regress.
During the initial announcement of the policy changes, Zuckerberg articulated his viewpoint that the adjustments would enhance free expression and criticized what he described as politically motivated censorship from various governments.
He indicated a collaborative effort with political figures to counteract perceived overreach by foreign governments against American tech firms.
Meta's policy changes have encountered backlash from online safety experts who caution that the new approach could lead to a broader dissemination of harmful content.
The company plans to apply this relaxed moderation to politically contested topics, such as immigration and gender.
In addition to these changes, Zuckerberg indicated a shift away from traditional fact-checking processes, opting instead for community-generated notes to replace what he termed politically biased fact-checkers.
The Department for Science, Innovation and Technology (DSIT) of the UK government has stated that all social media companies operating in the country are mandated to remove illegal content, including any material promoting self-harm or suicide.
Upcoming regulations under the Online Safety Act will require additional protections specifically directed at safeguarding children from exposure to harmful online content.
The government aims to monitor the impact of these laws closely and is prepared to enhance protections as necessary.