Meta AI layoffs in Ireland: Covalen workers face risk of job cuts

Meta AI – Hundreds of Covalen workers in Ireland—tasked with training Meta’s AI safety systems—were told their jobs may be cut as Meta shifts toward internal AI enforcement.
Meta’s AI safety push is coming with a painful warning in Ireland: hundreds of workers at Covalen have been told their roles may be at risk as the company prepares another round of layoffs.
For readers, the story lands at the intersection of technology and employment—where content moderation used to rely on people and now increasingly depends on models that those same people help refine. Misryoum’s newsroom reporting shows that the people doing the training work are not simply “support staff”; they are part of the loop that teaches AI how to spot dangerous and illegal content.
Covalen’s training work is being reframed as “optional”
Covalen, a Dublin-based vendor for Meta, provides services tied to content moderation and labeling. Among the roles most exposed are data annotators who review material produced or flagged by Meta’s AI systems to ensure it aligns with company rules against dangerous and illegal content.
Employees described their day-to-day work as iterative and adversarial: they don’t just tag content; they probe whether models will follow guardrails. That often means crafting prompts—sometimes elaborate ones—intended to test whether the system will resist requests involving child sexual abuse material or self-harm content. One worker’s account captured the psychological cost: spending a workday simulating prohibited scenarios as part of keeping the AI “safe.”
At the center of the latest risk is a tension that unions and workers have been pointing to for some time: training processes can blur into replacement strategies. One anonymous Covalen employee described the work as “training the AI to take over our jobs,” saying their actions represent the decisions the model is meant to emulate.
A wider Meta layoff plan meets vendor dependence
Meta’s broader strategy is also part of why this situation feels so abrupt. The company recently outlined sweeping job cuts aimed at improving efficiency, while also signaling major increases in spending on AI. Even when memos avoid explicit references to automation, the direction is clear: more responsibility will shift to systems built and run inside Meta.
In the documents reviewed by Misryoum, Covalen workers were told layoffs would follow “reduced demand and operational requirements.” That language is familiar to anyone who has watched tech workforce cuts before, but it is especially unsettling here because the tasks being performed are directly tied to the AI safety pipeline.
Meta’s statement frames the change as a move toward deploying “more advanced AI systems” and reducing reliance on third-party vendors. In practice, that means the outsourcing layer used to scale enforcement operations gets thinner as internal systems grow.
What the timing says about AI enforcement
This is not the first time Covalen has been hit. Misryoum’s review of the reporting indicates layoffs at Covalen have already occurred in the last year, including a period that reportedly culminated in labor action. Taken together, the pattern suggests that vendor contracts may be shrinking as model capabilities and internal tooling advance.
The most destabilizing detail for workers is not only the number of roles potentially affected—more than 700 is cited in the material—but also the structure of the transition. A “cooldown period” described by unions would restrict affected workers from applying to competing Meta vendors for six months. For people building careers in a narrow segment of the AI safety economy, that delay can matter as much as the severance offer.
The human cost: training an AI that may outlast you
There’s a specific kind of cognitive load in moderation work that the public often overlooks. It is rarely passive. Workers are asked to review content that can be disturbing, then to translate judgment into data the model can learn from. When the training workflow becomes more automated—or when a company decides fewer third-party reviewers are needed—workers can feel their experience is being repackaged as fuel for their own redundancy.
That emotional impact is echoed in workers’ descriptions. One employee characterized the process as “undignified” and “rude,” pointing to how little room there was for discussion during a brief video meeting in which they were told their jobs were at risk.
Why unions are pushing for rules around AI labor
Labor organizations representing the affected staff are calling for negotiations over severance terms and for talks with the Irish government on how AI is changing work. Misryoum’s coverage reflects a broader argument from unions: tech companies often treat the labor and data that build AI as disposable—without adequate notice, training pathways tied to employment, or enforceable rights to refuse replacement training.
The central demand is not anti-technology. It’s about guardrails for people who do the work. If AI is going to be trained with labor inputs, unions argue, those workers should have clearer protections: advance warning, documented transition plans, and meaningful options beyond “wait and hope” after a contract ends.
What happens next for Ireland’s AI workforce
For affected workers, the near-term challenge is obvious: moving to stable employment while the market itself is being reshaped in real time by AI-driven cost cutting. Misryoum’s reporting suggests many will be competing not just for general tech roles, but for positions across a relatively specialized ecosystem tied to AI enforcement and labeling.
Longer term, the question is whether companies will treat AI enforcement as a constantly evolving internal function—or continue to rely on a rotating layer of contractors whenever scaling is needed. If the direction stays consistent, vendor dependence may shrink in Ireland, but the need for safety review and risk handling won’t disappear. It will simply move—either into new internal roles, or into different outsourced configurations.
For now, Covalen workers are left with uncertainty, limited mobility due to the cooldown period, and a debate that has become louder across the industry: what does it mean to train an AI to do a job—and then decide the people who taught it no longer belong in the workflow?