Online child sexual abuse has become a pressing concern for regulators and tech companies alike. In response, major players in the tech industry, including Facebook owner Meta and Google, recently announced a new collaborative program called Lantern. With public scrutiny of their efforts to protect children and teenagers online growing, these companies are eager to demonstrate their commitment to combating online child sexual abuse and exploitation.
The central goal of the Lantern program is to enable a comprehensive, coordinated response to child exploitation across online platforms. By sharing signals of activity that violates their child-exploitation policies, companies can more effectively identify problematic content, swiftly detect potentially harmful situations, and report instances of abuse. These signals may include email addresses, specific hashtags, or keywords frequently associated with grooming or with the buying and selling of child sexual abuse material.
One crucial aspect of Lantern is its focus on addressing predatory actors who seek to evade detection across multiple services and platforms. The Tech Coalition, an organization that brings together tech companies to tackle this issue, emphasizes the importance of filling the existing gap in collaborative procedures. According to Sean Litton, the executive director of the Tech Coalition, Lantern’s purpose is to shed light on cross-platform attempts at online child sexual exploitation and abuse, contributing to the creation of a safer online environment for children.
The initial pilot of the Lantern program yielded promising results. Meta, working with the New Zealand-based privacy-focused platform Mega, removed more than 10,000 profiles, pages, and Instagram accounts that violated its policies on child exploitation. Data shared by Mega enabled Meta to identify and take action against these accounts. Meta also reported the findings to the US-based National Center for Missing & Exploited Children and shared the information with other platforms to aid their own investigations.
Antigone Davis, Meta's Global Head of Safety, notes that predators do not limit their attempts to harm children to any single platform. In light of this reality, Davis stresses that the technology industry as a whole must join forces in the fight against child exploitation: cooperation between companies, apps, and websites is essential to effectively safeguard children and prevent further harm.
Despite the announcement of the Lantern program, the tech industry continues to face criticism over its handling of child safety. On the same day as Lantern's launch, Arturo Bejar, a former senior engineer at Meta, testified at a Senate hearing in Washington. Bejar alleged that top executives, including Mark Zuckerberg, had disregarded his warnings about the safety risks teenagers face on the company's platforms. He revealed that an internal Meta survey of Instagram users showed that 13 percent of 13- to 15-year-olds had received unwanted sexual advances on the platform within the past week. Bejar's testimony raises serious doubts about the effectiveness of Meta's measures and underscores the urgent need for improved child protection practices.
Joint efforts by big tech companies through initiatives like Lantern demonstrate a proactive approach to combating online child abuse. By sharing information and collaborating on the detection and removal of problematic content, these companies aim to provide a safer online environment for children and teenagers. Ongoing criticism and challenges, however, are a reminder that much more work is needed to address the issue effectively and protect vulnerable young users from exploitation and harm.