December 13, 2018

AI Algorithms Designed To Spot Nudes Mistake Desert Sand For Skin

Software powered by artificial intelligence and used by the London Metropolitan Police to spot illegal or incriminating images on the computers and cell phones of criminal suspects has proven unreliable and prone to embarrassing errors, among them a tendency to mistake images of desert sand for exposed human skin, according to a report in Britain's Telegraph newspaper.

Metropolitan Police digital forensics investigators use the algorithms to identify and grade images, determining whether they are illegal, such as images of children, or otherwise "indecent." But the AI still has a lot of learning to do, Mark Stokes, head of the digital team, told The Telegraph.

"Sometimes it comes up with a desert and it thinks it's an indecent image or pornography," Stokes said. "For some reason, lots of people have screen-savers of deserts and it picks it up thinking it is skin color."

The Metropolitan Police hope AI algorithms will eventually do the dirty work of examining illegal images of children, sparing the human officers now charged with that unpleasant task the psychological trauma that comes with repeated exposure to such images. But the software is still "two to three years" away from being ready to handle that task reliably, according to The Telegraph.

An investigation by the tech news site Gizmodo tested an app called "Nude," which is intended to scan a user's phone for nude or sexualized images and group them together for easy deletion. In the test, the app flagged pictures of dogs, a cute (fully clothed) baby photo, a photo of a doughnut and an image of Grace Kelly in the Alfred Hitchcock thriller To Catch a Thief.

The social media platform Facebook also employs AI technology to scan uploaded images for inappropriate content, but the software once flagged and deleted an iconic, Pulitzer Prize-winning Vietnam War photograph showing a naked young girl fleeing in terror from a napalm attack. It identified the image, one of the most famous and powerful war photographs ever taken, as "child pornography." Facebook eventually allowed the image to be posted after an online protest in which thousands of Facebook users uploaded the Vietnam image to their own feeds.

Photo by Wonker/Wikimedia Commons
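Stokes' explanation points at a well-known weakness of naive, color-based nudity filters. The short Python sketch below is a purely hypothetical illustration, not the Met's actual software: it applies a classic published RGB skin-color rule (Peer et al.) and flags an image when the share of matching pixels crosses a threshold, with the 50-percent cutoff chosen arbitrarily for the example. Tan desert sand falls squarely inside the same color range the rule treats as skin, which is exactly the failure mode Stokes describes.

# Illustrative sketch only; not the Metropolitan Police's software.
# Requires Pillow (pip install Pillow).
from PIL import Image

def looks_like_skin(r: int, g: int, b: int) -> bool:
    # Classic RGB skin-color heuristic; tan sand also satisfies it.
    return (
        r > 95 and g > 40 and b > 20
        and max(r, g, b) - min(r, g, b) > 15
        and abs(r - g) > 15
        and r > g and r > b
    )

def flag_image(path: str, threshold: float = 0.5) -> bool:
    # Flag the image when more than `threshold` of its pixels
    # match the skin-color rule (threshold is an arbitrary choice).
    img = Image.open(path).convert("RGB")
    pixels = list(img.getdata())
    skin = sum(1 for p in pixels if looks_like_skin(*p))
    return skin / len(pixels) > threshold

# A desert screensaver is mostly sand-colored pixels, so a purely
# color-based classifier like this flags it as readily as real skin.

Modern systems avoid this trap by training classifiers on labeled example images rather than relying on raw pixel color alone, which is presumably part of the "learning" Stokes says the software still has to do.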

 