A new program called Lantern aims to fight online child sexual exploitation and abuse (OCSEA) by sharing cross-platform signals between online companies like Meta and Discord. The Tech Coalition, a group of technology companies working cooperatively to fight online child sexual exploitation, wrote in today’s announcement that the program is an attempt to keep predators from avoiding detection by moving potential victims to other platforms.
Lantern serves as a central database where companies can contribute data and check it against their own platforms. When companies spot signals, such as email addresses or usernames known to violate OCSEA policies, child sexual abuse material (CSAM) hashes, or CSAM keywords, they can flag them in their own systems. The announcement notes that while the signals do not definitively prove abuse, they help companies investigate and potentially take action, such as closing an account or reporting the activity to authorities.
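The announcement does not describe Lantern’s technical interface, but the basic idea of matching shared signals against a platform’s own data can be sketched roughly as below. This is an illustrative Python example only; the signal types, field names, and matching logic are assumptions for the sketch, not the program’s actual design.

```python
import hashlib
from dataclasses import dataclass


# Hypothetical signal record, loosely modeled on the signal types named in
# the announcement: policy-violating emails or usernames, CSAM hashes, keywords.
@dataclass(frozen=True)
class Signal:
    kind: str   # e.g. "email", "username", "media_hash"
    value: str  # the shared value (media hashes are hex digests here)


def media_fingerprint(data: bytes) -> str:
    # Placeholder fingerprint; real systems use perceptual hashing
    # (e.g. PhotoDNA-style) rather than a plain cryptographic hash.
    return hashlib.sha256(data).hexdigest()


def match_signals(shared: set[Signal], account: dict, media: list[bytes]) -> list[Signal]:
    """Return shared signals that also appear in this platform's own data,
    queuing them for human review rather than automatic enforcement."""
    media_hashes = {media_fingerprint(m) for m in media}
    hits = []
    for sig in shared:
        if sig.kind == "email" and sig.value == account.get("email"):
            hits.append(sig)
        elif sig.kind == "username" and sig.value == account.get("username"):
            hits.append(sig)
        elif sig.kind == "media_hash" and sig.value in media_hashes:
            hits.append(sig)
    return hits


# Example: one shared email signal matches a local account and gets flagged for review.
shared_signals = {Signal("email", "flagged@example.com")}
account = {"username": "someuser", "email": "flagged@example.com"}
print(match_signals(shared_signals, account, media=[]))
```

Consistent with the announcement’s caveat, a match in a sketch like this would only trigger further investigation, not an automatic conclusion of abuse.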
Meta wrote in a blog post announcing its participation in the program that, during Lantern’s pilot phase, it used information shared by one of the program’s partners, Mega, to remove “over 10,000 offending Facebook profiles, pages, and Instagram accounts” and report them to the National Center for Missing and Exploited Children.
The coalition’s announcement also quotes John Redgrave, Discord’s head of trust and safety, who says: “Discord has also acted on data points shared with us through the program, which has assisted in many internal investigations.”
Companies participating in Lantern so far include Discord, Google, Mega, Meta, Quora, Roblox, Snap, and Twitch. Coalition members have been developing Lantern for the past two years, and the group says that in addition to creating technical solutions, it had to put the program through an “eligibility check” and make sure it meets legal and regulatory requirements and is “ethically compatible.”
One of the big challenges with programs like this is making sure they are effective without introducing new problems. In a 2021 incident, police investigated a father after Google flagged photos of his son’s groin infection as CSAM. Several groups warned that similar problems could arise with Apple’s now-discontinued automated iCloud Photo Library CSAM scanning feature.
The coalition will oversee Lantern and says it is responsible for establishing clear guidelines and rules for data sharing. As part of the program, companies must complete mandatory training and routine checks, and the group will review its policies and practices periodically.