Job Cuts Loom for TikTok's Ireland Operations Amid Global Restructuring
Around 300 positions are at risk in TikTok's Dublin office as the company undergoes a global redundancy programme.
TikTok is reportedly preparing to cut around 300 roles at its Irish operations as part of a global redundancy programme.
The company employs nearly 3,000 people at its Dublin offices, where the workforce has expanded rapidly in recent years.
The Irish Government was formally notified of the potential job losses through a collective redundancy notification received by the Department of Enterprise on March 4, 2025.
In a statement, a spokesperson for the Department of Enterprise confirmed receipt of the notification and said any further inquiries should be directed to TikTok.
The company’s Dublin headquarters relocated to The Sorting Office in the Docklands area in December 2023. TikTok has not yet commented publicly on the impending job cuts.
In a separate development, the Information Commissioner’s Office (ICO) in the UK has launched investigations into TikTok, along with Reddit and Imgur, concerning the protection of child users' privacy.
The ICO's investigations focus on how these platforms handle the personal information of users aged 13 to 17, particularly how that data is used to recommend content to them.
The regulator's actions come amid rising concerns about the use of data generated by children's online activity and the possibility of inappropriate or harmful content being served to younger users.
The ICO's children's code for online privacy, which came into force in 2021, requires firms to take concrete steps to safeguard children's personal information.
Information Commissioner John Edwards emphasized the importance of understanding the practices in place to protect children from potential harms associated with social media use.
The ongoing investigations seek to ensure that platforms have robust measures in place to prevent minors from being exposed to harmful content or addictive design features.
Moreover, Ofcom, the UK's online safety regulator, has mandated that social media platforms submit risk assessments by March 31 evaluating the likelihood of users encountering illegal content on their services.
The assessments are a critical component of compliance with the Online Safety Act, which enforces standards aimed at preventing illegal content, including child exploitation, terrorism, and hate speech.
Non-compliance could result in heavy fines or, in extreme cases, a court order blocking access to a site within the UK.
Ofcom's enforcement director has reiterated that the risk assessments are essential to improving user safety and ensuring platforms take a proactive approach to managing potential online risks.