Apple drops controversial plans for child sexual abuse imagery scanning

Apple has ended the development of technology intended to detect possible child sexual abuse material (CSAM) while it’s stored on user devices, according to The Wall Street Journal.

That plan was unveiled last fall with an intended rollout for iOS 15, but backlash quickly followed as encryption and consumer privacy experts warned about the danger of creating surveillance systems that work directly from your phone, laptop, or tablet.

As recently as last December, Apple said its plans on that front hadn’t changed, but now Apple software VP Craig Federighi says, “Child sexual abuse can be headed off before it occurs... That’s where we’re putting our energy going forward.” Asked directly about the impact of expanded encryption on the work of law enforcement agents investigating crimes, he said, “ultimately, keeping customers’ data safe has big implications on our safety more broadly.”

Now the company is expanding end-to-end encryption to include phone backups and adding other new features aimed at preserving the privacy and security of iMessage conversations and data stored in iCloud.

Apple did roll out part of the technology it announced last fall, dubbed “communication safety in Messages,” in the US as part of the iOS 15.2 update and in other countries this year, albeit with some tweaks from the original plan. It’s an opt-in feature for the Messages app, connected to the Family Sharing setup, that scans incoming and outgoing pictures on children’s accounts for “sexually explicit” material.

Image showing screenshots of the warnings in Apple Messages when it detects naked photos or videos.
Apple says that “Messages can warn children when receiving or sending photos that contain nudity.”

Image: Apple

If it detects something it thinks crosses that bar, the image is blurred and a pop-up message appears with guidance on getting help or blocking the sender. The original plan suggested parents would be notified automatically of any detection, but as implemented, notification is an option left up to the user.
