August 10, 2021

Apple Wants To Snoop On Your Private Photos

CUPERTINO, Calif. — Security researchers and privacy experts are concerned that a new initiative from tech giant Apple to detect CSAM (“child sexual abuse material”) and other illegal content on user devices could prove a catastrophic failure of corporate policy. The London-based Financial Times reports that Apple intends to adopt a new “client-side” scanning algorithm to detect potentially exploitative content, including CSAM.

“Apple is introducing new child safety features in three areas, developed in collaboration with child safety experts,” notes Apple in a statement about the program. “New communication tools will enable parents to play a more informed role in helping their children navigate communication online. The Messages app will use on-device machine learning to warn about sensitive content while keeping private communications unreadable by Apple.”

While Apple points out that “private communications” will remain “unreadable,” the issue at hand is the detection of content that would merit a report to the company’s servers, and potentially to the authorities if justification exists. Matthew D. Green, an associate professor at the Johns Hopkins University Information Security Institute, tweeted that Apple’s move to implement CSAM scanning in such a fashion is extremely problematic.

“This is a really bad idea,” Green argues. “These tools will allow Apple to scan your iPhone photos for photos that match a specific perceptual hash, and report them to Apple servers if too many appear. Initially, I understand this will be used to perform client-side scanning for cloud-stored photos. Eventually, it could be a key ingredient in adding surveillance to encrypted messaging systems.”
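To make Green’s description more concrete, the sketch below illustrates, in rough terms, how threshold-based perceptual hash matching works: each photo is reduced to a compact hash, compared against a list of known hashes, and a report is triggered only once enough near-matches accumulate. This is an illustration only, not Apple’s actual NeuralHash or reporting pipeline; the imagehash library, the hash list, the distance cutoff, the match threshold, and the report_to_server() function are all assumptions made for the example.

# Illustrative sketch of threshold-based perceptual hash matching.
# NOT Apple's implementation: the hash database, cutoff, threshold,
# and report_to_server() are hypothetical stand-ins.
from PIL import Image          # pip install Pillow
import imagehash               # pip install ImageHash

# Hypothetical database of known perceptual hashes (64-bit, hex-encoded).
KNOWN_HASHES = {imagehash.hex_to_hash("f0e4c2d1a5b39786")}
HAMMING_CUTOFF = 5    # assumed: max bit distance that counts as a match
MATCH_THRESHOLD = 3   # assumed: report only after several matches

def report_to_server(count):
    # A real client would notify a server here; this sketch just prints.
    print(f"{count} matching photos found; flagging for review.")

def scan_photos(paths):
    """Count photos whose perceptual hash is near a known hash."""
    matches = 0
    for path in paths:
        photo_hash = imagehash.phash(Image.open(path))
        if any(photo_hash - known <= HAMMING_CUTOFF for known in KNOWN_HASHES):
            matches += 1
    if matches >= MATCH_THRESHOLD:
        report_to_server(matches)

Because perceptual hashes tolerate small changes such as resizing or re-compression, the key design question critics raise is who controls the hash list and what else it could be made to match.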

Edward Snowden, the world-famous whistleblower who exposed widespread illegal snooping on U.S. citizens by the National Security Agency, added to the criticism of Apple’s new plan to scan for CSAM.

“No matter how well-intentioned, Apple is rolling out mass surveillance to the entire world with this,” Snowden said, according to a report by Mediaite.com. “Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow. They turned a trillion dollars of devices into iNarcs.”

In the wake of the announcement, several figures in the adult industry greeted the news with understandable fear and skepticism. For example, adult entertainment industry attorney Michael Fattorosi tweeted that Apple’s intention to further monitor private photographs is “very scary.”

A Twitter user going by “SexyAngie” replied to Fattorosi after he shared a report on Apple’s announcement from another news outlet: “I look young for my age and there are dirty pictures of me in my phone, lots of them. What happens if those photos get flagged? Will the FBI show up at my house? At my job? All it takes is an accusation and your life is ruined.”

Fattorosi replied: “I don’t think we will know the answers to your questions until Apple starts this dumpster fire scanning.”

iPhone photo by Tyler Lastovich of Pexels



 