Network of online predators convinces young victims to harm themselves, animals

The FBI says a network of online predators has been using common social media and gaming sites to find young victims, grooming them and then manipulating them into sharing violent or sexual images of themselves, including videos of self-mutilation and the killing of pets.

"I've been working on extremism for maybe about seven years now. This is some of the worst stuff I've ever seen," said Ali Winston, an investigative journalist with Wired.

Winston and Wired joined an international team of journalists from the Washington Post, Der Spiegel in Germany, and the Recorder in Romania, who collaborated on a joint investigation into the international reach of an online group called "764."

"The stories that we heard from the victims, a number of them we interviewed for our collaborative, they were heart-rending," Winston said.

Winston says the group's name comes from the zip code of the founder, a young man from Texas currently serving a life sentence.

Members of 764 gather on popular social media sites and prey on young people, convincing them to livestream or post videos of violent or sexual acts for the group's pleasure.

Winston said victims would often be told to harm themselves by cutting, or by carving so-called "cut signs."

"It's when you carve that person's handle into your skin with a blade or a razor or another sharp object. This is a modus operandi of 764. They do this all the time, and some of the cut signs that I've seen, they're horrific," Winston said.

The group's members have also ordered victims to commit violent acts on animals.

"That's something else that 764 asks people to do, to hurt their pet," Winston said. "There's a young woman who beheaded her hamster live in what they call a 'red room,' which is basically a live stream room on Discord in video chat."

The FBI issued an alert this past fall about 764 and similar extreme groups.

Robert Tripp, Special Agent in Charge (SAC) at the FBI's San Francisco office, says parents should be aware of these networks and know that these predatory groups aren't just operating on the dark web or in obscure corners of the internet.

"These predators are identifying their victims in very public-facing social media platforms. That's where they find these people. They find them in the social media platforms that all of the kids are using, that our families are using," Tripp said.

That includes many of the social media platforms that are based right in the Bay Area, or used by children here.

When asked about companies such as Instagram, Telegram, Discord, Twitter/X, and Roblox, Tripp replied, "Any public-facing platform is a potential vector for this kind of activity."

Tripp says money is not the main motive of groups such as 764.

"The difference is the element of violence. These people aren't merely lacking empathy with their victims. They actually derive pleasure from making their victims suffer. And that is truly disturbing for us. That's why it's such a high priority for us," Tripp said. "The primary age range we see is 8 to 17...(that) is where we're concerned."

Young people who feel isolated, insecure, or struggle with mental health issues are often prime targets for 764 members.

"764 one hundred percent seeks out and preys on children, young folk, with mental health issues," Winston said. "They have an entire guide that they've crafted and distributed among themselves on how to identify those folks, how to bring them into a relationship."

Tripp says the FBI's San Francisco office is working closely with social media companies to investigate.

"We have a good relationship with many providers here in the Bay Area," Tripp said. "When they come across information, they'll block it. They'll take it down, and they'll notify law enforcement."


Last month, The Tech Coalition, which represents some three dozen major tech companies, released a report on its new effort to crack down on predators such as the 764 group.

The initiative, called Lantern, was launched this past November. It is a partnership of companies that agree to share information about online predators across platforms, including Discord, Dropbox, Google, Mega, Meta, Photobucket, Quora, Reddit, Roblox, Snap Inc., and Twitch.

The Lantern Transparency report says the companies "took action on 30,989 accounts for violations of policies prohibiting child sexual exploitation and abuse."

Protecting young people is an increasing challenge, however, according to a report by the National Center for Missing and Exploited Children (NCMEC), which runs the CyberTipline and acts as a central hub for receiving reports of abuse and sharing information with law enforcement agencies.

NCMEC says the CyberTipline received more than one million reports in 2023, including 110,784 from California.

The NCMEC also says that while electronic service providers, or ESPs, are now required to report bad actors, there is concern that new technology could drastically reduce the number of tips the center receives.

The NCMEC 2023 transparency report stated "widespread adoption of end-to-end encryption by reporting ESPs will begin at some point in CY 2024, and could result in a loss of up to 80% of NCMEC’s CyberTipline reports."

A new study from Stanford University raises other concerns.

The Stanford Internet Observatory Team published a study in April that analyzed NCMEC's system with a range of recommendations for NCMEC, law enforcement, tech companies, and the government.

"The challenges facing the CyberTipline will be massively multiplied by the coming wave of unique, AI-generated CSAM that platforms will be reporting over the next several years," the report states. "These issues would be best addressed by a concerted effort to massively uplift NCMEC’s technical and analytical capabilities, which will require the cooperation of platforms, NCMEC, law enforcement and, importantly, the US Congress."

The FBI says one important step that can make an immediate difference is for family members, friends, teachers, and other trusted adults to watch young people for any signs that something might be wrong.

"If they see their children suddenly become very withdrawn, if they see their children covering up, hiding their arms and legs for example, because cutting is a part of this, those are warning signs," Tripp said. "Mood changes, profound mood changes can be an indicator as well."

Most important, Tripp says, is for young people and trusted adults to have a frank conversation about interactions online.

"There are people on the internet that may pretend to be your friends, but they're not your friends. Having that kind of conversation," Tripp said.

As May marks Mental Health Awareness Month, the FBI says it also is important for young people and parents to know where to turn for help and support.

To contact the FBI, there is a help line at 1-800-CALL-FBI where people can report online predators, child exploitation, and abuse.

To reach the National Center for Missing and Exploited Children call 1-800-THE-LOST. 

The NCMEC has a program called Take It Down that works with social media sites to remove images of victims that predators might have posted online.

Jana Katsuyama is a reporter for KTVU. Email Jana at jana.katsuyama@fox.com or call her at 510-326-5529. Or follow her on Twitter @JanaKTVU. 
 
