March 12, 2018

Expert: FOSTA/SESTA Could Make Deepfake Porn Problem Worse

CYBERSPACE—As the United States Senate prepares to take up its own version of a bill supposedly aimed at curbing online sex trafficking, a technology expert from a prominent Washington think tank warns that the bill, known as FOSTA/SESTA, could make the problem of online, artificial intelligence-created fake porn videos known as “deepfakes” much more difficult to stop.

The House version of the bill, the “Fight Online Sex Trafficking Act” or FOSTA, passed by an overwhelming 388-25 margin last month. The Senate is now considering its own “Stop Enabling Sex Traffickers Act” or SESTA. Civil liberties groups, sex worker rights organizations, and even sex trafficking victims are leading efforts to stop the bill, which they argue will impose dangerous restrictions on free speech online—and even harm the very victims it purports to protect, while driving actual sex traffickers deeper into the darkness, where law enforcement will have a much more difficult time finding and catching them.

But an article published Monday on the technology news site Motherboard—as well as on the website of the libertarian think tank R Street—raises a new problem with the bill. According to Charles Duan, R Street senior fellow and associate director of tech & innovation, FOSTA/SESTA will make the deepfakes problem worse.

“Deepfake” porn clips are videos generated by an artificial intelligence program in which the face of a person not in the video—usually a celebrity—is realistically superimposed onto an existing porn video. The AI program causes the superimposed face to plausibly change position and facial expression, making it appear that the celebrity has appeared in a porn film or made an unusually explicit home sex tape.

Deepfakes raise troubling issues both for the people whose faces are used non-consensually in the videos and for the performers in the underlying videos, whose work is used without permission. But the legality of deepfake porn remains murky.
No laws now in place specifically target the use of artificial intelligence to create sophisticated fake porn videos. As a result, the responsibility for regulating deepfake videos has fallen to administrators of the sites on which the videos appear, such as the popular forum Reddit, which recently deleted numerous deepfake porn videos.

But FOSTA/SESTA, albeit unintentionally, offers sites an incentive to stop monitoring content altogether, to avoid having the “knowledge” of illegal activity that the proposed law would require in order to hold a site responsible for illegal content such as sex trafficking activity.

“But even if websites don’t stop monitoring their content entirely, laws like FOSTA pressure them to focus on Congress’s issue of the day—in this case sex trafficking—at the expense of other problems online,” Duan wrote in his Monday article. “That’s where things like deepfake porn come into play.”

In other words, FOSTA/SESTA could have two harmful results: either sites stop policing their own content entirely, or they focus their monitoring so heavily on “sex trafficking” activity that they miss other problems, such as AI-generated fake porn.

“Public pressure on internet companies is necessary to push those companies to do everything they can, but the sluggishness of the federal legislative process will be a drag on solving tomorrow’s online problems,” Duan wrote. “A law requiring websites to take down deepfake porn, for example, might take years to pass (and could be subject to court challenges), at which point some new and unpredictable abusive practice will likely have arisen.”

Above, a SFW image from a Scarlett Johansson deepfake video.
