TikTok moderators have accused the social media company of “oppressive and intimidating” union busting after it fired hundreds of workers in the UK, beginning the process just before they were due to vote on forming a union.
The moderators wanted to establish a collective bargaining unit to protect themselves from the personal costs of checking extreme and violent content, and have claimed TikTok is guilty of unfair dismissal and breaching trade union laws.
About 400 moderators in London were fired before Christmas in a process initiated a week before the vote was due to take place.
TikTok, which has about 30m monthly users in Britain, strongly denies a legal claim that has been lodged with an employment tribunal on behalf of three former workers, describing it as “baseless”.
It said the sackings were part of a global restructuring involving roles in the UK, and south and south-east Asia amid the increasing use of AI to automate the removal of posts that violate content rules, with 91% of transgressive content now removed automatically.
But John Chadfield, the national officer for tech workers at the Communication Workers Union, which represented about 250 of the affected moderators, said of the legal action: “This is holding TikTok to account for union busting.”
He added: “Content moderators have the most dangerous job on the internet. They are exposed to child sexual abuse material, executions, war and drug use. Their job is to make sure this content doesn’t reach TikTok’s 30 million monthly users. It is high pressure and low paid. They wanted input into their workflows and more say over how they kept the platform safe. They said they were being asked to do too much with too few resources.”
A TikTok spokesperson said: “These changes were part of a wider global reorganisation, as we evolve our global operating model for trust and safety with the benefit of technological advancements to continue maximising safety for our users.”
The dispute began in August 2025, when the union was poised to ballot several hundred of TikTok’s moderators and quality assurance agents from the trust and safety team, whose job was to vet posts – including traumatic material processed at high speed – for compliance with TikTok’s rules. TikTok then announced a restructuring exercise that put members of the proposed bargaining unit at risk of redundancy, according to the legal claim.
Rosa Curling, a co-executive director of the tech justice non-profit Foxglove, which is backing the action, called TikTok’s treatment of its content moderators “appalling”.
She said that by “laying off essential safety workers they are putting the platform’s users at risk”, adding: “TikTok has made its position clear: union busting and trampling on our labour laws comes first – the safety of its users, including millions of children, and the wellbeing of its essential safety workers, comes last. We hope the employment tribunal will force them to change course.”
According to TikTok, its increasing use of AI has reduced moderators’ exposure to graphic content by 76% in the past year.
Michael Newman, a partner at the law firm Leigh Day, said: “This case is an important example of how individuals who band together can stand up to the might of big tech firms, and especially of how the fig leaf of AI cost savings should not be allowed to obscure vital safety concerns.”