Our mission is to build partnerships with individuals and organizations to achieve results in the interest of children and young people. We are excited to share with the general public an update on child safety from Facebook, aimed at ensuring that young people of the appropriate age are well served on social media and other online platforms, from lessons they can use to build their digital citizenship to security.
Here are the details:
As you know, we do not tolerate any behavior or content that exploits children online and we develop safety programs and educational resources with more than 400 organizations around the world to help make the internet a safer place for children. For years our work has included using photo-matching technology to stop people from sharing known child exploitation images, reporting violations to the National Center for Missing and Exploited Children (NCMEC), requiring children to be at least 13 to use our services, and limiting the people that teens can interact with after they sign up.
On Wednesday we will share some of the work we’ve been doing over the past year to develop new technology in the fight against child exploitation. In addition to photo-matching technology, we’re now using artificial intelligence and machine learning to proactively detect child nudity and previously unknown child exploitative content when it’s uploaded. We’re using this and other technology to identify this content more quickly and report it to NCMEC, and to find accounts that engage in potentially inappropriate interactions with children on Facebook so that we can remove them and prevent additional harm.
We will also share for the first time the number of pieces of content on Facebook that were removed for violating our child nudity or sexual exploitation of children policies. Our Community Standards ban child exploitation and to avoid even the potential for abuse, we take action on nonsexual content as well, like seemingly benign photos of children in the bath. With this comprehensive approach, in the last quarter alone, we removed 8.7 million pieces of content on Facebook that violated our child nudity or sexual exploitation of children policies, 99 percent of which was removed before anyone reported it. We also remove accounts that promote this type of content. We have specially trained teams with backgrounds in law enforcement, online safety, analytics, and forensic investigations, which review content and report findings to NCMEC. In turn, NCMEC investigates and works with law enforcement agencies around the world to help victims, and we’re helping the organization develop new software to help prioritize the reports it shares with law enforcement in order to address the most serious cases first.
“The National Center for Missing & Exploited Children has been working with Facebook since 2006 in an effort to reduce child sexual abuse material online and ensure incidents are reported to our CyberTipline. Over this time, Facebook has consistently demonstrated their leadership and willingness to be a pro-active leader in the fight to keep the internet safer for everyone. While the amount of content taken down may be surprising, it is a reminder that the sexual exploitation of children is a global problem that demands a multi-faceted global solution,” said NCMEC’s Chief Operating Officer Michelle C. DeLaune.
We also collaborate with safety experts, NGOs and other companies to disrupt and prevent the sexual exploitation of children across online technologies. For example, we work with the Tech Coalition to eradicate online child exploitation, the Internet Watch Foundation, and the multi-stakeholder WePROTECT Global Alliance to End Child Exploitation Online. And next month, Facebook will join Microsoft and other industry partners to begin building tools for smaller companies to prevent the grooming of children online. You can learn more about all of our efforts at facebook.com/safety.
For more information regarding this update, reach Awo on 0272 001 005 or at info@jighana.org.