One thing you won’t be able to use Facebook’s new WhatsApp cryptocurrency for is buying or selling revenge porn on the social media platform. Facebook, the company revealed today, just made an AI that’s an expert on porn.
It’s like U.S. Supreme Court Justice Potter Stewart, who famously said his threshold test for pornographic obscenity in Jacobellis v. Ohio (1964) was “I know it when I see it.”
Amazingly, Facebook’s artificially intelligent algorithm is wired up to respond to the same visual cues in photographs that sexually arouse male (and female) viewers.
Facebook announced the new AI Friday as a countermeasure to sexually abusive content, as the company pivots to position itself as a privacy-focused platform.
With the ability to detect nude and “near-nude” intimate photos, the robo-content cop can help stop malicious users from posting or selling people’s nude and intimate selfies.
Such images can be shared without the knowledge or permission of the person in the photo, whether because the poster doesn’t care or out of deliberate intent to embarrass the victim.
But Facebook’s new revenge porn and graphic content dragnet is sophisticated enough to block not only nude photographs but also any content that a person would recognize as serving a prurient interest, Facebook claims. They really want to make this point clear.
Facebook’s Global Head of Safety Antigone Davis said in a blog post:
“Finding these images goes beyond detecting nudity on our platforms. By using machine learning and artificial intelligence, we can now proactively detect near nude images or videos that are shared without permission on Facebook and Instagram.”
“A lingerie shot, perhaps,” Sara Salinas suggests at CNBC.
The strangest part of Facebook’s revenge porn blocker is that the Silicon Valley giant is seriously asking users to preemptively upload image files of their nude selfies so Facebook can auto-block them. What could possibly go wrong? It’s like an Onion headline, but it’s CNBC:
“Facebook tests fighting revenge porn by asking users to file nude photos first.”
“Users worried an inappropriate image might appear on Facebook’s platforms are asked to send an intimate image via Messenger, a preventive measure designed to flag the images before they’re shared.”
As part of the program, Facebook users upload pictures to a “secure, one-time upload link,” which will then be reviewed by a “handful of specially trained members of our Community Operations Safety Team,” according to Facebook.
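Facebook hasn’t published how the matching itself works, but image-blocking systems of this kind typically reduce each photo to a perceptual hash (a short fingerprint) and compare hashes rather than the pictures themselves, so near-identical re-uploads can be flagged without keeping the original file around. A minimal sketch, assuming a simple “average hash” scheme; the function names and the toy 4×4 grayscale “image” below are illustrative only, not Facebook’s actual algorithm:

```python
# Illustrative average-hash sketch (an assumption, not Facebook's
# published method): an image hashes to a bit string, and two images
# whose hashes are within a small Hamming distance are treated as
# near-duplicates and blocked.

def average_hash(pixels):
    """pixels: 2D list of grayscale values (0-255). Returns a bit string:
    one bit per pixel, set when the pixel is at or above the mean."""
    flat = [v for row in pixels for v in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if v >= mean else '0' for v in flat)

def hamming(a, b):
    """Number of bit positions where two equal-length hashes differ."""
    return sum(x != y for x, y in zip(a, b))

# A tiny 4x4 "image" and a slightly brightened copy of it.
original = [[10, 200, 30, 220],
            [15, 210, 25, 215],
            [12, 205, 28, 225],
            [11, 208, 26, 218]]
variant = [[v + 5 for v in row] for row in original]

# Small global brightness changes don't move pixels across the mean,
# so the re-upload still matches and would be flagged.
print(hamming(average_hash(original), average_hash(variant)))  # prints 0
```

The practical upshot of hash-based matching is that, in principle, only the fingerprint needs to be stored long-term, which is exactly why the question of whether humans review the uploaded photos first matters so much.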
Who are the people working in Facebook’s nude selfie reviewer department, and did they already have to have experience working at the NSA to get an interview?
In all seriousness, Facebook needs to make it clear whether or not human eyes are reviewing these nude selfies that get preemptively uploaded, because it sounds like they are.