‘It started with a tipoff’: how a Guardian investigation exposed child sex trafficking on Facebook and Instagram
www.theguardian.com/global-development/2026/apr…
It started with a tipoff. I was reporting on the trafficking and exploitation of migrant workers in the Gulf when a source I had known for more than a decade reached out. They told me that child sexual abuse trafficking in the US was surging. As the Covid pandemic pushed predators online, some were using Facebook and Instagram to buy and sell children.
It was 2021 and I was about to begin an investigation with Mei-Ling McNamara, a human rights journalist, that would lead to the tech company Meta losing a multimillion-pound court case in March this year. The company had not yet rebranded and was known as Facebook, and there had not been any reporting on how children were being trafficked on its platforms. Experts from anti-trafficking nonprofit organisations and an American law enforcement official talked me through the crimes they were seeing.
I would later learn that much of the trafficking on Facebook and Instagram was happening in non-public areas of the platforms, such as Facebook Messenger and private Instagram accounts. Traffickers were searching for teens to target and groom, and later advertise to sex buyers.
I was able to pull transcripts of sale negotiations for teen girls that traffickers were engaging in on Facebook Messenger, the private messaging function. In exhibit documents, there were pictures of trafficking victims being advertised for sale in Instagram’s Stories function. Money and logistics had been discussed. In the cases we found, none of these crimes had been detected or flagged by Meta.
McNamara and I contacted former contract workers who had been employed to moderate Facebook and Instagram, tasked with reporting and removing harmful content. Many were traumatised by the content they had had to review each day. All said their efforts to flag and escalate possible child trafficking on Meta platforms often went nowhere, and harmful content was rarely taken down by the company. They felt helpless, and believed Meta's criteria for escalating possible crimes to law enforcement were too narrow.
They make money out of trafficking and scams etc… all the interactions sell ADS! And the scammers even buy ads. Why would Facebook remove any of it? It would need to cost them money to make it stop, so fine them and make them accountable for the horrors they allow.
I wonder who made the original tip. Probably one of those traumatized moderators whose reports had gone nowhere.
An article from the guardian patting itself on the back. Pass.
“hey, we did the work, now we expect something to be done on it” is far from “look what we found! Give us a Pulitzer!”
But the motherfuckers at FB had the time and wherewithal to identify people trying to coordinate out-of-state abortions on Facebook and then report them to state and federal authorities, of course. Just not human traffickers.
It’s what Baby Jesus wants ✝️