Apple explained at CES this week that photos uploaded to iCloud are automatically scanned for illegal content as part of the company’s fight against child abuse imagery.
Technologies like PhotoDNA have long been used by tech giants to check the content that users upload to the cloud, and Apple says it is using such a system as well to ensure that child abuse material is blocked.
Jane Horvath, Apple’s chief privacy officer, explained at CES that the automatic checks are performed using “matching technology,” which suggests that a PhotoDNA-like system is indeed employed.
PhotoDNA, developed by Microsoft, uses hashing to check whether newly uploaded photos match content that has previously been flagged as illegal.
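The matching workflow described above can be sketched in a few lines. Note the simplification: PhotoDNA computes a perceptual hash that survives resizing and re-encoding, whereas this illustration uses a plain SHA-256 digest, and the database entries are hypothetical placeholders. The sketch only shows the general idea of comparing an upload’s fingerprint against a set of known-bad fingerprints.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest for an image's bytes.

    PhotoDNA uses a perceptual hash robust to minor edits;
    SHA-256 stands in here purely to illustrate the workflow.
    """
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of fingerprints of previously flagged content.
flagged_db = {fingerprint(b"known-bad-image-bytes")}

def is_flagged(image_bytes: bytes, db: set) -> bool:
    """Check an uploaded image against the known-bad fingerprint set."""
    return fingerprint(image_bytes) in db

print(is_flagged(b"known-bad-image-bytes", flagged_db))  # True: exact match
print(is_flagged(b"ordinary-photo-bytes", flagged_db))   # False: no match
```

A real deployment would compute the perceptual hash server-side at upload time and forward matches for human review rather than acting on the hash alone.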
“Apple is dedicated to protecting children throughout our ecosystem wherever our products are used, and we continue to support innovation in this space. As part of this commitment, Apple uses image matching technology to help find…