Why Facebook Wants Your Nude and Intimate Photos

Meta, the company previously known as Facebook, has taken another step in its battle to stop intimate or nude photos of people from spreading on social media without their consent. Last week, the company teamed up with the nonprofit U.K. Revenge Porn Helpline to launch StopNCII.org, a site built to help stop the non-consensual sharing of intimate images (NCII). Anyone concerned that their images could be shared without permission can use StopNCII.org to begin the process of protecting those images from being shared.

When a person visits StopNCII.org, they can create a case and select an image showing them in the nude, nearly nude, or in an intimate position that they believe could be shared by someone else without their consent. StopNCII.org then converts the image into a one-of-a-kind digital footprint known as a “hash.” The hash is then sent to participating companies, like Facebook and Instagram, which use that data to make sure the photos cannot be shared, NBC News explains.
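
To make that workflow concrete, here is a minimal sketch of the hash-and-match idea using the open-source Python imagehash library. The article does not specify which algorithm StopNCII.org actually uses, so the choice of pHash, the filenames, and the match threshold below are all illustrative assumptions, not the real system.

```python
from PIL import Image
import imagehash

def fingerprint(path: str) -> imagehash.ImageHash:
    """Hash the image on the user's own device; only this
    fingerprint -- never the photo itself -- gets shared."""
    return imagehash.phash(Image.open(path))

# On the user's device: hash the private photo and submit only the hash.
submitted_hash = fingerprint("my_private_photo.jpg")  # hypothetical file

# On a participating platform: hash each new upload and compare it
# against the bank of submitted hashes.
def matches(upload_path: str, known_hash: imagehash.ImageHash,
            threshold: int = 8) -> bool:
    # Subtracting two ImageHash values yields the Hamming distance;
    # a small distance means the two images look alike.
    return (fingerprint(upload_path) - known_hash) <= threshold

if matches("incoming_upload.jpg", submitted_hash):  # hypothetical file
    print("Block the upload and flag the case for review.")
```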

This is different from a previous program Facebook tried in Australia in 2017, Mashable points out. That program was highly controversial because it asked users to send nude images directly to Facebook, where human moderators would review them individually. That approach raised plenty of privacy concerns, though Facebook insisted the images would only be kept for a short time. The pilot program didn’t last long, and in 2019 Facebook launched an AI tool to detect and remove nude images from the site.

The new StopNCII.org tool is intended to give survivors more power in the process. The images never actually leave a person’s device; only the hash is stored by participating platforms. The site was developed with 50 global partners that focus on image-based abuse, women’s rights, and online safety.

“It’s important that Facebook and industry recognize that they can’t be at the front of this,” Sophie Mortimer, U.K. Revenge Porn Helpline’s manager, told NBC News, adding that the entire tech industry has a questionable reputation with privacy. “A hash bank needs to be held independently in a neutral space, and we have a lot of public trust and long-standing track record of helping people affected by the sharing of intimate images without consent,” she said, praising Meta for helping fund and develop the tool and then “step away.”

There will still be some human involvement in the process. Mortimer told NBC News that human content moderators will review cases to make sure non-violating images aren’t taken down and that people can’t misuse the system. Once a case begins, the person who submitted material can also withdraw their participation.

Karuna Nain, Meta’s director of global safety policy, also noted a flaw in the system: if someone tweaks an image, the altered copy can produce a whole new digital footprint and evade detection. “There’s a lot we will have to watch,” Nain told NBC News. “How can we make the system even more victim-centric? Our work doesn’t end here.”
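
The same library can illustrate the weakness Nain describes. In this hedged sketch, a simple crop-and-mirror stands in for a “tweak” (the filename is again hypothetical); even a modest edit like this can push the hash far enough away that the altered copy no longer matches.

```python
from PIL import Image, ImageOps
import imagehash

original = Image.open("my_private_photo.jpg")  # hypothetical file

# A trivial "tweak": crop a sliver off one edge, then mirror the result.
tweaked = ImageOps.mirror(
    original.crop((10, 0, original.width, original.height)))

distance = imagehash.phash(original) - imagehash.phash(tweaked)

# Mirroring in particular scrambles a perceptual hash; the Hamming
# distance can easily exceed a typical match threshold, so the edited
# copy would slip past an exact- or near-match check.
print(f"Hamming distance between original and tweaked: {distance}")
```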