Meta Adopts Lantern Program to Protect Children Online

Facebook and the other social platforms run by Meta will use PhotoDNA and PDQ technology to stop child abuse images and videos as part of the Lantern program.

Meta uses PhotoDNA as part of its Lantern adoption
Meta wants young people to have safe experiences online. Image: Collected

Children's use of social media cannot be stopped entirely, as they often pick up their parents' devices for entertainment, but this exposes them to child safety risks. To address this, Meta is adopting child safety technology to stop the spread of unsafe visual content.

Meta said, "Protecting children online is one of the most important challenges facing the technology industry today. At Meta, we want young people to have safe, positive experiences online."

Meta has spent a decade developing tools and policies designed to protect young people. As a result, the company finds and reports more child sexual abuse material to the National Center for Missing & Exploited Children (NCMEC) than any other service today.

Meta urged the digital industry to work together to protect children and stop predators. The company uses Microsoft's PhotoDNA and its own PDQ technology to stop the spread of child sexual abuse material (CSAM) on the internet.
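PhotoDNA and PDQ are perceptual hashing technologies: they reduce an image to a compact fingerprint that stays stable under small edits, so a platform can compare new uploads against hashes of known abuse material without storing that material itself. As a rough illustration only (the real libraries have their own APIs; the hashes and the 31-bit threshold below are made-up examples, though PDQ does produce 256-bit hashes compared by Hamming distance), a match check can be sketched like this:

```python
# Minimal sketch of perceptual-hash matching, the general idea behind
# PhotoDNA and Meta's open-source PDQ. The hashes and threshold are
# illustrative only, not real values or a real library API.

def hamming_distance(hash_a: str, hash_b: str) -> int:
    """Count differing bits between two equal-length hex-encoded hashes."""
    if len(hash_a) != len(hash_b):
        raise ValueError("hashes must be the same length")
    xor = int(hash_a, 16) ^ int(hash_b, 16)
    return bin(xor).count("1")

def is_match(candidate: str, known_bad_hashes: set, threshold: int = 31) -> bool:
    """Flag a candidate if it is within `threshold` bits of any known hash."""
    return any(hamming_distance(candidate, h) <= threshold
               for h in known_bad_hashes)

# Example with made-up 256-bit hashes (64 hex characters each).
known = {"f" * 64}
candidate = "f" * 63 + "e"   # differs from the known hash by one bit
print(is_match(candidate, known))  # True
```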

But this is not enough. The social media giant believes the world needs additional solutions to stop predators from using different apps and websites to target young people.


The company noted that predators do not limit their attempts to harm children to any single platform, Meta's own or otherwise. They use multiple apps and websites and adapt their tactics across them all to avoid detection.

Meta also said, "We’re announcing our participation in the Lantern program, which enables digital services to share signals about accounts and behaviors that violate their child safety rules."

Meta was a founding member of Lantern. It provided the Tech Coalition with the technical infrastructure that sits behind the program and continues to maintain it.

Notably, these shared signals allow each participating service to carry out its own investigations.

In accordance with its legal obligations, Meta reported the violating profiles, pages, and accounts to NCMEC. It also shared details of the investigation back to Lantern, enabling participating companies to use the signals to conduct their own investigations.
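Lantern's internal data formats are not public, but conceptually a shared signal is a small, privacy-conscious record: a content or account hash, the type of violation observed, and enough context for another platform to start its own review. Purely as a hypothetical sketch (none of these field names come from the Lantern program itself):

```python
# Hypothetical illustration of a cross-platform safety signal. The field
# names and structure are assumptions for explanation only; they are not
# the Lantern program's actual schema.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)
class SafetySignal:
    content_hash: str        # e.g. a PDQ/PhotoDNA hash of the offending media
    violation_type: str      # e.g. "csam", "grooming"
    reporting_platform: str  # which participating service raised the signal
    observed_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

# A receiving platform would match such signals against its own data and
# run its own investigation before taking action or reporting to NCMEC.
signal = SafetySignal(
    content_hash="f" * 64,
    violation_type="csam",
    reporting_platform="example-service",
)
print(signal.violation_type)
```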

Meta added, "We're glad to partner with the Tech Coalition and our peers on the Lantern program, and we hope others in the industry will join us to expand this important work."
