No one wants to admit they have them. But there are a lot of private, intimate photos sitting on our phones. And as celebrities like Jennifer Lawrence learned not too long ago, privacy and the internet just don’t go together.
Perhaps artificial intelligence can help.
To keep those photos from being seen by the wrong people, Jessica Chiu and Y.C. Chen—two 21-year-olds from the University of California, Berkeley—and an Armenian developer built Nude, an app that searches a user’s iPhone for anything scandalous and moves those photos into a secure vault on the phone itself.
The encrypted vault is protected by a PIN, and the original photos are deleted from both the camera roll and iCloud, where photos are often backed up automatically.
A key aspect of the Nude app is that it uses Core ML, a framework released with the latest version of iOS that lets apps run machine-learning models directly on the device. If Nude had to send photos to the cloud for analysis, the idea that it helps privacy rather than hurts it would quickly fall apart.
The app scans for exposed skin using a recognition engine the team built themselves, after finding that existing open-source datasets didn’t work well for minorities. Chiu and Chen told The Verge they scraped sites like PornHub for 30 million images to train their algorithm and make it more accurate for people of all races. But there’s still work to be done. “If you have man boobs, those will be imported,” Chen told the site.