Apple Expands Its On-Device Nudity Detection to Combat CSAM

Instead of scanning iCloud for illegal content, Apple’s tech will locally flag inappropriate images for kids. And adults are getting an opt-in nudes filter, too.
