Apple assures it won't allow govts to spy using its child abuse detection tool

San Francisco: Amid drawing backlash over its new Child Safety features, Apple on Monday stressed that it will not allow any government to conduct surveillance via the tool aimed at detecting and curbing child sexual abuse material (CSAM) in iCloud photos.

The statement comes a week after the company rolled out the new child safety feature, which immediately drew criticism from experts around the world, who said it could be misused by governments to surveil citizens.

Last week, Apple confirmed plans to deploy new technology within iOS, macOS, watchOS, and iMessage that will detect potential child abuse imagery.

Apple said it will not accede to any government's request to expand the technology.

"Apple will refuse any such demands. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future," the company said in a new document.

Apple said the tool does not impact users who have not chosen to use iCloud Photos.

"There is no impact to any other on-device data. This feature does not apply to Messages," the company noted.

Epic Games CEO Tim Sweeney had attacked Apple over its iCloud Photos and messages child safety initiatives.

"This is government spyware installed by Apple based on a presumption of guilt. Though Apple wrote the code, its function is to scan personal data and report it to the government," Sweeney posted on Twitter.

WhatsApp Head Will Cathcart had also slammed Apple over its plans to launch photo identification measures, saying the software can scan all the private photos on a user's phone, which he called a clear privacy violation.

Stressing that WhatsApp will not allow such Apple tools to run on its platform, Cathcart said that Apple has long needed to do more to fight child sexual abuse material, "but the approach they are taking introduces something very concerning into the world".

Apple said the 'CSAM detection in iCloud Photos' tool is designed to keep CSAM off iCloud Photos without providing information to Apple about any photos other than those that match known CSAM images.

"This technology is limited to detecting CSAM stored in iCloud and we will not accede to any government's request to expand it".

The company further said that the feature does not scan the private photo library on the device itself.
