Apple plans to scan American iPhones for images of child sexual abuse, drawing applause from child protection groups but raising fears among some security researchers that the system could be misused by governments seeking to monitor their citizens.
Apple’s messaging app will use on-device machine learning to warn users about sensitive content without letting the company read private communications. Apple’s “neuralMatch” tool detects known images of child sexual abuse without decrypting users’ messages. If it finds a match, the image will be reviewed by a human, who can notify law enforcement if necessary.
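As a rough sketch of that flag-and-review flow (the names and hashes below are hypothetical, and this illustrates the general pattern rather than Apple’s actual implementation), a matched image is queued for a human reviewer instead of being reported automatically:

```python
from dataclasses import dataclass, field

@dataclass
class ReviewQueue:
    """Holds matched images until a person examines them."""
    pending: list = field(default_factory=list)

    def flag(self, image_id: str) -> None:
        # A match alone never triggers a report; it only queues the
        # image so a human reviewer can confirm or dismiss it.
        self.pending.append(image_id)

def scan_image(image_id: str, image_hash: str,
               known_hashes: set, queue: ReviewQueue) -> bool:
    """Compare a precomputed fingerprint against the known list.

    Matching fingerprints means the scanner never has to read or
    decrypt the content of a user's messages.
    """
    if image_hash in known_hashes:
        queue.flag(image_id)
        return True
    return False

# Example: one hypothetical match lands in the review queue.
queue = ReviewQueue()
scan_image("IMG_0001", "abc123", {"abc123", "def456"}, queue)
print(queue.pending)  # ['IMG_0001']
```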
But researchers say the tool could be put to other purposes, such as government surveillance of dissidents or protesters.
Matthew Green, a top cryptography researcher at Johns Hopkins University, warned that the system could be used to frame innocent people by sending them seemingly harmless images designed to trigger matches for child sexual abuse material, fooling Apple’s algorithm and alerting law enforcement. “Researchers have been able to do this pretty easily,” he said.
Technology companies including Microsoft, Google, and Facebook have for years been sharing digital fingerprints, or “hashes,” of known child sexual abuse images. Apple has used those fingerprints to scan user files stored in its iCloud service, which is not as securely encrypted as its messages, for such images.
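For context, here is a minimal sketch of how hash-list matching works in general (not Apple’s system; the list entry below is a synthetic placeholder, and production tools use perceptual hashes that survive resizing and re-encoding, unlike the exact SHA-256 digests shown here):

```python
import hashlib
from pathlib import Path

# Synthetic stand-in for a shared industry hash list. Real systems
# use perceptual hashes so that altered copies of a known image
# still match; SHA-256 matches only byte-identical files.
KNOWN_HASHES = {
    hashlib.sha256(b"placeholder-known-image-bytes").hexdigest(),
}

def fingerprint(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's raw bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def is_known(path: Path) -> bool:
    """True if the file's fingerprint appears on the shared list."""
    return fingerprint(path) in KNOWN_HASHES
```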
Green also worries that the technology could expose the company to political pressure in authoritarian states such as China. “What happens when the Chinese government says, ‘Here is a list of files we want you to scan for’?” he asked. “Does Apple say no? I hope they say no, but their technology won’t say no.”
Apple has been under pressure from governments and law enforcement for years to allow surveillance of encrypted data. Coming up with the new security measures required Apple to strike a delicate balance between cracking down on child exploitation and keeping its high-profile commitment to protecting the privacy of its users.
Apple believes it struck that balance with technology it developed in consultation with several prominent cryptographers, including Dan Boneh, a Stanford University professor whose work in the field has won a Turing Award, often called technology’s version of the Nobel Prize.
The computer scientist who more than a decade ago invented PhotoDNA, the technology used by law enforcement to identify child pornography online, acknowledged the potential for abuse of Apple’s system but said it was far outweighed by the imperative of fighting child sexual abuse.
“Is it possible? Of course. But is it something that concerns me? No,” said Hany Farid, a researcher at the University of California, Berkeley. For example, WhatsApp provides users with end-to-end encryption to protect their privacy, but it also employs a system for detecting malware and warning users not to click on harmful links.
Apple was one of the first major companies to embrace “end-to-end” encryption, in which messages are scrambled so that only their senders and recipients can read them. Law enforcement officials, however, have long pressed for access to that information in order to investigate crimes such as terrorism or child sexual abuse.
“Apple’s expanded protection for children is a game changer,” said John Clark, president and CEO of the National Center for Missing and Exploited Children. With so many people using Apple products, he said, these new safety measures have lifesaving potential for children who are being exploited online.
Julia Cordua, the CEO of Thorn, said Apple’s technology “balances the need for privacy with digital safety for kids.” Thorn, a nonprofit founded by Demi Moore and Ashton Kutcher, uses technology to help protect children from sexual abuse by identifying victims and working with technology platforms.
___
AP Technology Writer Mike Liedtke contributed to this article.