Exclusive: Apple’s child protection features spark concern within its own ranks – sources

The Apple Inc. logo hangs at the entrance to the Apple Store on 5th Avenue in Manhattan, New York, U.S. REUTERS/Mike Segar/File Photo

SAN FRANCISCO, Aug. 12 (Reuters) – Apple’s plan to scan U.S. customers’ phones and computers for child sex abuse images has set off a debate inside the company, with employees speaking out in a firm known for its secretive culture, and has drawn intensified protests from leading technology policy groups.

Apple employees flooded an internal Apple Slack channel with more than 800 messages on the plan announced last week, workers who asked not to be identified told Reuters. Many expressed concern that the feature could be exploited by repressive governments seeking material for censorship or arrests, according to workers who saw the days-long thread.

Past security changes at Apple have also prompted concern among employees, but the volume and duration of the new debate has been surprising, the workers said. Some posters worried that Apple is damaging its leading reputation for protecting privacy.

Though the pushback comes mainly from employees outside of lead security and privacy roles, it marks a shift for a company whose strict secrecy around new products colors other aspects of its corporate culture.

Slack was widely adopted at Apple after the pandemic began, two employees said. As workers shared light-hearted content and used the app to maintain social ties while working from home, more serious conversations also took root there.

In the Slack thread devoted to the photo-scanning feature, some employees pushed back against the criticism, while others said Slack was not the right forum for such discussions.

Core security employees did not appear to be major complainants in the posts, and some of them said they thought Apple’s solution was a reasonable response to pressure to crack down on illegal material.

Other employees said they hoped the scanning is a step toward fully encrypting iCloud for customers who want it, which would reverse Apple’s direction on the issue a second time.


Last week’s announcement drew heavier criticism from past outside supporters, who say Apple is rejecting a history of well-marketed privacy protections.

They say that while the U.S. government cannot legally scan wide swaths of household equipment for contraband, or compel others to do so, Apple has been willing to do it voluntarily.

Two groups, the Electronic Frontier Foundation (EFF) and the Center for Democracy and Technology (CDT), both released new detailed objections to Apple’s plan in the past 24 hours.

In an interview last week, CDT project director Emma Llansó said the plan seems out of step with everything Apple had previously said and done.

Apple declined to comment for this story. It has said it will refuse requests from governments to use the system to check phones for anything other than child sexual abuse material.

Outsiders and employees pointed to Apple’s 2016 standoff with the FBI, in which the company successfully fought a court order to build a new tool to break into a terrorism suspect’s iPhone. The company said at the time that such a tool would inevitably be used to break into other devices.

But Apple was surprised that its stance then was not more popular, and the global tide since has moved toward more monitoring of private communications.

With less fanfare, Apple has made other technical decisions that help the authorities, including dropping a plan to encrypt widely used iCloud backups and agreeing to store Chinese user data in that country.

A core problem with Apple’s new plan to scan for child abuse images, critics said, is that the company is making cautious policy decisions that it can be forced to change, now that the capability exists, in exactly the way it warned would happen if it broke into the terrorism suspect’s phone.

Apple says it will scan only in the United States and other countries to be added one by one, only when images are set to be uploaded to iCloud, and only for images identified by the National Center for Missing and Exploited Children and a small number of other groups.

But any country’s legislature or courts could demand that any one of those elements be expanded, and critics said some of those nations, such as China, represent enormous and hard-to-refuse markets.

Police and other agencies will cite recent laws requiring “technical assistance” in investigating crimes, including in the United Kingdom and Australia, to press Apple to expand this new capability, the EFF said.

EFF General Counsel Kurt Opsahl wrote that the infrastructure needed to roll out Apple’s proposed changes makes it harder to say that additional surveillance is not technically feasible.

Lawmakers may build on it as well, said Neil Brown, a British technology lawyer at decoded.legal: if Apple shows it can carry out such filtering in one market, regulators elsewhere could consider demanding its use in their own markets, and for a wider range of content.

Reporting by Joseph Menn, Julia Love and Stephen Nellis in San Francisco; Editing by Kenneth Li and Grant McCool

Our Standards: The Thomson Reuters Trust Principles.
