  1. #1

    Apple to scan iPhones for child sex abuse images

    Apple has announced details of a system to find child sexual abuse material (CSAM) on US customers' devices. Before an image is stored in iCloud Photos, the technology will search for matches against already known CSAM. Apple said that if a match is found, a human reviewer will assess it and report the user to law enforcement.

    However, there are privacy concerns that the technology could be expanded to scan phones for prohibited content or even political speech. Experts worry that the technology could be used by authoritarian governments to spy on their citizens.

    Apple said that new versions of iOS and iPadOS - due to be released later this year - will have "new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy".

    The system works by comparing pictures to a database of known child sexual abuse images compiled by the US National Center for Missing and Exploited Children (NCMEC) and other child safety organisations. Those images are translated into "hashes", numerical codes that can be "matched" to an image on an Apple device. Apple says the technology will also catch edited but similar versions of original images.
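
    To make the hash-matching description above concrete, here is a minimal sketch in Python of a generic perceptual hash (a simple "average hash") compared with a small Hamming-distance tolerance, which is the general idea behind matching an image even after light edits. This is an illustration only: Apple's actual system uses its own NeuralHash algorithm and cryptographic matching on device, and the file names and the 5-bit tolerance below are made-up assumptions for the example.

    # Illustrative sketch only: a generic average hash, not Apple's NeuralHash.
    # Requires Pillow (pip install Pillow); the file names are hypothetical.
    from PIL import Image

    def average_hash(path: str, hash_size: int = 8) -> int:
        """Shrink to hash_size x hash_size grayscale, set one bit per pixel above the mean."""
        img = Image.open(path).convert("L").resize((hash_size, hash_size))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        bits = 0
        for p in pixels:
            bits = (bits << 1) | (1 if p > mean else 0)
        return bits

    def hamming_distance(a: int, b: int) -> int:
        """Number of bits on which two hashes differ."""
        return bin(a ^ b).count("1")

    # Stand-in for a database of hashes derived from known images.
    known_hashes = {average_hash("known_image.jpg")}

    # A resized or recompressed copy usually lands within a few bits of the original,
    # so matching allows a small tolerance instead of requiring exact equality.
    candidate = average_hash("user_photo.jpg")
    if any(hamming_distance(candidate, h) <= 5 for h in known_hashes):
        print("Candidate matches a known hash (within tolerance)")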

    'High level of accuracy'
    "Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes," Apple said.The company claimed the system had an "extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account".Apple says that it will manually review each report to confirm there is a match. It can then take steps to disable a user's account and report to law enforcement.The company says that the new technology offers "significant" privacy benefits over existing techniques - as Apple only learns about users' photos if they have a collection of known CSAM in their iCloud Photos account.However some privacy experts have voiced concerns."Regardless of what Apple's long term plans are, they've sent a very clear signal. In their (very influential) opinion, it is safe to build systems that scan users' phones for prohibited content," Matthew Green, a security researcher at Johns Hopkins University, said."Whether they turn out to be right or wrong on that point hardly matters. This will break the dam — governments will demand it from everyone."

  2. #2
    Apple says it will refuse gov’t demands to expand photo-scanning beyond CSAM

    Apple has faced days of criticism from security experts, privacy advocates, and privacy-minded users over the plan it announced Thursday, under which iPhones and other Apple devices will scan photos before they are uploaded to iCloud. Many critics pointed out that once the technology is on consumer devices, it won't be difficult for Apple to expand it beyond the detection of CSAM in response to government demands for broader surveillance. Governments have been pressuring Apple to install backdoors into its end-to-end encryption systems for years, and Apple has acknowledged that governments are likely to make exactly the demands that security experts and privacy advocates have been warning about. In a FAQ released today, titled "Expanded Protections for Children," the company poses the question, "Could governments force Apple to add non-CSAM images to the hash list?"

    Apple answers the question as follows:
    Apple will refuse any such demands. Apple's CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC (National Center for Missing and Exploited Children) and other child safety groups. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future. Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government's request to expand it. Furthermore, Apple conducts human review before making a report to NCMEC. In a case where the system flags photos that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC.

    None of this means that Apple lacks the ability to expand the technology's uses, of course. Answering the question of whether its photo-scanning system can be used to detect things other than CSAM, Apple said that it "is designed to prevent that from happening."

    "CSAM detection for iCloud Photos is built so that the system only works with CSAM image hashes provided by NCMEC and other child safety organizations," Apple said. "There is no automated reporting to law enforcement, and Apple conducts human review before making a report to NCMEC. As a result, the system is only designed to report photos that are known CSAM in iCloud Photos."

    Apple says it won’t inject other photos into database
    But the system's current design doesn't prevent it from being redesigned and used for other purposes in the future. The new photo-scanning technology itself is a major change for a company that has used privacy as a selling point for years and calls privacy a "fundamental human right."

    Apple said the new system will be rolled out later this year in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey, and will initially be deployed in the US only. The current plan is for Apple devices to scan user photos and report those that match a database of known CSAM image hashes. The Apple FAQ implicitly acknowledges that hashes of other types of images could be added to the list, but the document says Apple won't do that.

    "Can non-CSAM images be 'injected' into the system to flag accounts for things other than CSAM? Our process is designed to prevent that from happening," Apple wrote. "The set of image hashes used for matching are from known, existing images of CSAM that have been acquired and validated by child safety organizations. Apple does not add to the set of known CSAM image hashes."

    Apple also said the new "feature only impacts users who have chosen to use iCloud Photos to store their photos. It does not impact users who have not chosen to use iCloud Photos." Apple's FAQ didn't say how many people use iCloud Photos, but it is a widely used feature. There are over 1 billion iPhones actively used worldwide, and a 2018 estimate by Barclays analysts found that iCloud (including all services, not just iCloud Photos) had 850 million users.

    Apple memo called privacy advocates “screeching voices”
    Apple does not seem to have anticipated the level of criticism its decision to scan user photos would receive. On Thursday night, Apple distributed an internal memo that acknowledged criticism but dismissed it as "screeching voices of the minority."

    That portion of the memo was written by NCMEC Executive Director of Strategic Partnerships Marita Rodriguez. "I know it's been a long day and that many of you probably haven't slept in 24 hours. We know that the days to come will be filled with the screeching voices of the minority. Our voices will be louder. Our commitment to lift up kids who have lived through the most unimaginable abuse and victimizations will be stronger," Rodriguez wrote.

    The memo was obtained and published by 9to5Mac. The Apple-written portion of the memo said, "We've seen many positive responses today. We know some people have misunderstandings, and more than a few are worried about the implications, but we will continue to explain and detail the features so people understand what we've built."

    Open letter warns of expanding surveillance uses
    Over 6,000 people signed an open letter urging Apple to reverse course, saying, "Apple's current path threatens to undermine decades of work by technologists, academics and policy advocates towards strong privacy-preserving measures being the norm across a majority of consumer electronic devices and use cases."
    The letter quoted several security experts, including researcher Nadim Kobeissi, who wrote, "Reminder: Apple sells iPhones without FaceTime in Saudi Arabia, because local regulation prohibits encrypted phone calls. That's just one example of many where Apple's bent to local pressure. What happens when local regulation mandates that messages be scanned for homosexuality?"

    The letter also quoted Johns Hopkins University cryptography professor Matthew Green, who said, "The pressure is going to come from the UK, from the US, from India, from China. I'm terrified about what that's going to look like. Why would Apple want to tell the world, 'Hey, we've got this tool'?"
