Why Apple's crackdown on child abuse images is no easy decision

Apple will inspect photos uploaded to the cloud by iPhones in the US to detect images of child sexual abuse – a move praised by child welfare charities but condemned by privacy campaigners

Technology 10 August 2021

By Matthew Sparkes


An Apple iPhone 12 and an Apple MacBook Pro

Stanislav Kogiku/SOPA Images/Shutterstock

Apple will inspect every photograph uploaded to the cloud by US users of iPhones and iPads to detect images of child sexual abuse, and will report any found to a nonprofit that investigates cases of child exploitation. The new measure has been praised by child welfare charities but condemned by privacy campaigners, who believe it opens the door to other types of surveillance by authoritarian governments.

Rather than examining the photographs themselves, Apple's neuralMatch software will use an algorithm that creates …
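The general idea described above, matching image fingerprints against a database of known material rather than inspecting the pictures themselves, can be sketched roughly as follows. This is an illustrative sketch only: Apple's actual neuralMatch algorithm is not detailed in the article, and a cryptographic hash is used here purely as a stand-in for whatever fingerprinting scheme such a system would employ. The names `KNOWN_HASHES`, `fingerprint` and `matches_known` are hypothetical.

```python
import hashlib

# Hypothetical database of fingerprints of known images.
# In a real deployment this would be supplied by the investigating nonprofit.
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-bytes").hexdigest(),
}

def fingerprint(image_bytes: bytes) -> str:
    """Derive a fixed-size fingerprint from raw image bytes.

    SHA-256 is a placeholder: it only matches byte-identical files,
    whereas a production system would need a fingerprint robust to
    resizing and re-encoding.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known(image_bytes: bytes) -> bool:
    """True if the image's fingerprint appears in the known-hash database."""
    return fingerprint(image_bytes) in KNOWN_HASHES

print(matches_known(b"known-image-bytes"))    # True
print(matches_known(b"holiday-photo-bytes"))  # False
```

The key property the article highlights is that only fingerprints are compared, so the scanning service never needs to look at the photo content directly.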

