
Meta warned it faces ‘heavy sanctions’ in EU if it fails to fix child protection issues on Instagram

The European Union has fired a blunt warning at Meta, saying it must quickly clean up its act on child protection or face the risk of “heavy sanctions”.

The warning follows a report by the Wall Street Journal yesterday. The WSJ worked with researchers at Stanford University and the University of Massachusetts Amherst to uncover and expose a network of Instagram accounts set up to connect pedophiles to sellers of child sexual abuse material (CSAM) on the mainstream social networking platform.

Research by Stanford also found the Meta-owned photo-sharing site to be the most important platform for sellers of self-generated CSAM (SG-CSAM), fingering Instagram’s recommendation algorithms as “a key reason for the platform’s effectiveness in advertising SG-CSAM”.

In a tweet fired at the adtech giant this morning, the EU’s internal market commissioner, Thierry Breton, said the company’s “voluntary code on child protection seems not to work”, adding: “Mark Zuckerberg must now explain and take immediate action.”

Breton said he will be raising child safety at a meeting with Zuckerberg at Meta’s HQ in the U.S. later this month, and confirmed the EU is setting a hard deadline on the issue: it expects Meta to demonstrate effective measures are in place after August 25, when the company will be legally required to be in compliance with the EU’s Digital Services Act (DSA).

Fines for non-compliance with the DSA, which lays out rules for how platforms must tackle illegal content like CSAM, can scale up to 6% of global annual turnover.

Both Instagram and Facebook have been designated very large online platforms (aka VLOPs) under the DSA, a status that brings additional obligations to assess and mitigate systemic risks attached to their platforms, including risks flowing specifically from recommender systems and algorithms. So the level of risk Meta is facing here looks substantial.

“After August 25, under #DSA Meta has to demonstrate measures to us or face heavy sanctions,” Breton warned in the tweet, which flags both the WSJ’s report and Stanford’s research paper. The latter looks at CSAM activity across a number of major social platforms and concludes that “Instagram is currently the most important platform for these networks, with features that help connect buyers and sellers”.

Breton’s threat of “heavy sanctions” could, if Meta fails to act, translate into billions (plural) of dollars in fines for the company in the EU.
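For a rough sense of scale: assuming the DSA’s 6% ceiling were applied to Meta’s reported full-year 2022 revenue of roughly $116.6 billion, the maximum penalty works out to 0.06 × $116.6B ≈ $7 billion, which is where the “billions (plural)” framing comes from.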

We reached out to Meta for a response to Breton’s warning on child protection but at the time of writing it had not responded.

Instagram found recommending CSAM sellers

The Journal’s investigation highlighted the role played by Instagram’s recommendation algorithms in linking pedophiles to sellers of CSAM.

“Instagram connects pedophiles and guides them to content sellers via recommendation systems that excel at linking those who share niche interests, the Journal and the academic researchers found,” it wrote.

“Though out of sight for most on the platform, the sexualized accounts on Instagram are brazen about their interest. The researchers found that Instagram enabled people to search explicit hashtags such as #pedowhore and #preteensex and connected them to accounts that used the terms to advertise child-sex material for sale. Such accounts often claim to be run by the children themselves and use overtly sexual handles incorporating words such as ‘little slut for you’.”

Meta responded to queries put to it by the WSJ ahead of publication by saying it had blocked thousands of hashtags that sexualize children, some of which the Journal’s report specifies had millions of posts.

The tech giant also said it had restricted its systems from recommending search terms to users that are known to be associated with child sex abuse.

The WSJ’s report includes a screengrab of a pop-up Instagram served to researchers involved in the investigation when they searched for a pedophilia-related hashtag, warning that “these results may contain images of child sexual abuse”. The text in the notification also described the legal risks of viewing CSAM and the harm sexual abuse causes to children, and suggested resources to “get confidential help” or report “inappropriate” content, before offering the user two options: “Get resources” or “See results anyway”. In other words, the platform was evidently aware of CSAM issues associated with the hashtags yet had failed to remove the content or even block users from accessing it.

Per the WSJ, Meta only removed the option letting users view suspected CSAM after the Journal had asked about it, and its report says the company declined to explain why it had offered such an option in the first place.

The active role of Instagram’s recommender engine in essentially promoting CSAM seller accounts looks equally troubling, given the platform was evidently able to identify suspected CSAM. That raises the question of why Meta did not leverage the behavioral surveillance it already deploys on users to drive engagement (and boost its ad revenue), matching accounts to content based on similar platform activity, in order to map the pedophile network and shut it down.

On this, Meta told the WSJ it is working on preventing its systems from recommending that potentially pedophilic adults connect with one another or interact with one another’s content.

The Journal’s reporting also chronicles instances where Instagram’s algorithms auto-suggested additional search terms that circumvented a block the platform had applied on links to one encrypted file-transfer service notorious for transmitting child-sex content.

The WSJ report also details how viewing just one underage seller account on Instagram led the platform to recommend that users view new CSAM-selling accounts.

“Following the company’s initial sweep of accounts brought to its attention by Stanford and the Journal, UMass’s [Brian Levine, director of the Rescue Lab] checked in on some of the remaining underage seller accounts on Instagram. As before, viewing even one of them led Instagram to recommend new ones. Instagram’s suggestions were helping to rebuild the network that the platform’s own safety staff was in the middle of trying to dismantle,” it added.

Meta warned it faces ‘heavy sanctions’ in EU if it fails to fix child protection issues on Instagram by Natasha Lomas originally published on TechCrunch
