Facebook and Instagram accused of allowing predators to share tips with each other about victimizing children

New Mexico AG Raúl Torrez is bringing legal action against Meta, alleging the company rejected its own safety teams’ recommendations to make it harder for adults to communicate with children.

Meta, the parent company of Facebook and Instagram, is accused of facilitating and profiting from the online solicitation, trafficking and sexual abuse of children, according to a complaint filed by the New Mexico Attorney General.

According to an updated complaint reviewed by Fox News Digital, Torrez accuses the company of permitting sponsored content to appear alongside inappropriate content in violation of Meta’s own standards, of allowing child predators who use dark web message boards to share tips with each other about victimizing children, and of rejecting its own safety teams’ recommendations to make it harder for adults to communicate with children on its platforms.

"Parents deserve to know the full truth about the risks children face when they use Meta’s platforms," AG Torrez told Fox News Digital. "For years, Meta employees tried to sound the alarm about how decisions made by Meta executives subjected children to dangerous solicitations and sexual exploitation."

Torrez claimed in the original complaint covered by Fox News Digital in December that minors’ accounts on Meta's platform were recommended to apparent child predators. In addition, the company allegedly failed to implement proper protections to prevent users under the age of 13 from joining and instead targeted the vulnerabilities of young children to increase advertising revenue. 

The state AG's office ran an investigation on Instagram and Facebook in which investigators created decoy accounts of children ages 14 and younger and found that the social media platforms directed underage users to "a stream of egregious, sexually explicit images — even when the child has expressed no interest in this content;" enabled dozens of adults "to find, contact, and press children into providing sexually explicit pictures of themselves or participate in pornographic videos;" recommended that children join unmoderated Facebook groups "devoted to facilitating commercial sex;" and allowed users "to find, share, and sell an enormous volume of child pornography." 

Investigators at the AG's office also claimed "certain child exploitative content" is over ten times more prevalent on Facebook and Instagram than it is on Pornhub and OnlyFans. 

One Facebook account created by investigators under the name Rosalind Cereceres, a fictional 40-year-old "bad mother" to 13-year-old Issa Bee, incorporated signals that she was interested in trafficking her daughter. Most of Issa’s followers were males between the ages of 18 and 40, and comments were sometimes sexually suggestive or threatening, the complaint said. 

"On Facebook Messenger, Issa’s messages and chats are filled with pictures and videos of genitalia, including exposed penises, which she receives at least 3-4 times per week," the complaint stated. "As the messages come in, she has no means of screening or previewing the messages."

Torrez said the newly unredacted aspects of the complaint help form the basis of the state's action against Meta and "make clear" that "Mark Zuckerberg called the shots" in making the decisions that mattered to children and parents. 

"Meta executives, including Mr. Zuckerberg, consistently made decisions that put growth ahead of children’s safety," he said in his statement to Fox News Digital. "While the company continues to downplay the illegal and harmful activity children are exposed to on its platforms, Meta’s internal data and presentations show the problem is severe and pervasive."

The complaint includes communications from some of Meta’s largest advertisers, such as Walmart and The Match Group, indicating the companies found evidence Meta was allowing sponsored content to appear alongside explicit or sexual images and videos. 

Match's advertisements were displayed "adjacent to or in connection with explicit content that violated Meta’s Community Standards," the company said. Match described a string of Reels it believed was "clearly promoting illegal and exploitative businesses" and that appeared near an ad for one of the brand's dating apps. 

The Reels were reportedly of "young girls," including a "[y]oung girl provocatively dressed, straddling and caressing a Harley Davidson-style motorcycle" set to the soundtrack of Judas Priest’s "Turbo Lover" with the text "Turbo loving model," according to the complaint. 

"On top of this, we have also become aware that our ads are showing up on Facebook next to gruesome content in a group titled: ‘Only women are slaughtered’ showing films of women being murdered," according to an email from Match included in the complaint. "We are also aware that this account was reported twice and has not been taken down." 

According to the complaint, Meta told the advertisers it applied "standards to ensure advertisers could control the content around which their ads were run." However, AG Torrez believes such claims are "misleading and deceptive."

Walmart raised similar concerns with Meta regarding the placement of its advertising, and Meta confirmed that some of the company's advertisements were being displayed on unapproved channels, acknowledging "there is some minimal exposure to placements that you’ve not approved." Those placements included a video of a woman exposing her crotch, according to reporting by the Wall Street Journal. 

Meta told Fox News Digital that it does not want this kind of content on its platforms and that brands do not want their ads to appear next to it. 

"We continue to invest aggressively to stop it - and report every quarter on the prevalence of such content, which remains very low," a spokesperson told Fox News Digital. "Our systems are effective at reducing violating content, and we’ve invested billions in safety, security and brand suitability solutions."

The amended complaint also includes online communications from child predators using dark web message boards to share tips with each other about victimizing children.

"Forums on the dark web devoted to child sex abuse –which are equally available to Meta – show open conversations about the role of Instagram’s and Facebook’s algorithms in delivering children to predators," the complaint stated. "Predators discuss leveraging Instagram to search, like, and comment on images of children in order to get the algorithm to funnel similar images, videos, and accounts to their feeds, to identify groups of pedophiles and children, and to connect with potential child victims." 

"Predators also note that they prefer Instagram to identify children not only because its focus on images and videos allows them to assess targets, but because of the ease of interacting with children and the prevalence of public accounts for teenagers," the complaint added. 

One participant explained that on Instagram, if you search for "whatever your preference is you will obviously come across videos of young girls" and "The more you watch of the young girls and like their videos, your reel algorithm changes and I have been getting nothing but young girls dancing, lifting their loose fitting clothes and booty shorts or dresses."

"The more you like, the more it shows the same thing," a participant in a dark web forum noted. "Crazy right? The more LG [little girl] reels and posts you like, the more appear," 

"Instagram has become a primary outlet for me lately, no question," another said. "My feed [of content recommended by the algorithm] has gotten so bad I can’t look at it in public."

Predators also discussed their tactics for engaging with children, according to the complaint. One participant, for example, wrote on a forum asking for advice about how to approach messaging a 10-year-old girl whom the user had followed on Instagram.

"I have followed her back," he said. "She is really nice but lives far away. I’m thinking about sending her a DM to say she is nice and I like her. Do you think I’m getting in trouble?" 

"You need to use Facebook or Twitter or Instagram is better, you know, put 50 likes and there is already a reason to talk," another user said. "Besides, you see how the baby reacts! On Instagram it is very easy to just ask for a couple of intimate photos and then send your own and if you like each other go to a meeting in a cozy place." 

"Instagram isn’t cracking down and let’s face it. It’s all about money," one user said. "These girls are bringing in money for themselves, their families, and Instagram!!"

New evidence in the complaint suggests that Meta rejected its own safety teams’ recommendations to make it harder for adults to communicate with children on its platforms. The complaint adds that "Facebook was aware, as of at least 2018, that there were millions of predators on the platform grooming tens of millions of children."

"Facebook employees presented senior executives with an analysis based on an internal investigation that revealed that predators were being introduced to children through Facebook design feature People You May Know ("PYMK")," the complaint said. "Through PYMK, predators searching for a particular kind of children (e.g., gymnasts or children in their geographic area), Facebook would recommend and then allow them to connect to other similar children."

"Because of the way in which Facebook designed PYMK, and knowingly allowed it to continue to operate, PYMK served as a virtual victim identification service, identifying children that pedophiles could not have found on their own," the complaint added. "Despite evidence of the link, and in order not to impair growth, Facebook rejected the recommendation by its Community Integrity team to adjust the design of PYMK to avoid recommending minors to adults and rejected the recommendation that it restrict adults from direct messaging minors."

The complaint also details a 2018 email response from Guy Rosen, Facebook's Vice President of Integrity, in which he noted Facebook does not scan direct messages for violating conduct, such as grooming and solicitation. Rosen offered to provide Instagram head Adam Mosseri with a "shadowing session" to demonstrate the "kind of stuff" that investigators find, describing it as the "worst of the worst stuff that gives you nightmares, over which children have taken their own lives :(." 

A Meta spokesperson told Fox News Digital that child exploitation "is a horrific crime and online predators are determined criminals."

"We use sophisticated technology, hire child safety experts, report content to the National Center for Missing and Exploited Children, and share information and tools with other companies and law enforcement, including state attorneys general, to help root out predators," the statement added. "In one month alone, we disabled more than half a million accounts for violating our child safety policies."
