Apple is taking steps to combat child sexual abuse material (CSAM), including technology that will detect known CSAM uploaded to iCloud; an iMessage feature that will alert parents if their child sends or receives an image containing nudity; and a block on Siri and Search queries for CSAM-related terms. The changes, which Apple says will roll out in the US later this year, were first leaked in a tweet thread by a Johns Hopkins University cryptography professor who had heard about them from a colleague. Apple has since confirmed the reports.
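To make the "detect known CSAM" mechanism concrete, here is a minimal sketch of the general known-image-matching approach: hash each upload and compare it against a database of hashes of previously identified images. The `KNOWN_HASHES` set and function names are illustrative assumptions, not Apple's API. Apple's actual system is reported to use a perceptual hash (NeuralHash) combined with cryptographic private set intersection, which this sketch does not reproduce.

```python
import hashlib
from pathlib import Path

# Hypothetical database of hashes of known CSAM images, as would be
# supplied by a child-safety clearinghouse. Illustrative only.
KNOWN_HASHES: set[str] = set()

def file_hash(path: Path) -> str:
    """Return the SHA-256 digest of a file's raw bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def is_known_match(path: Path) -> bool:
    """Check an upload against the known-image hash list."""
    return file_hash(path) in KNOWN_HASHES
```

Note the design gap this sketch exposes: a byte-level hash like SHA-256 misses an image that has been resized or recompressed, which is why Apple's approach relies on a perceptual hash that tolerates such transformations, and on a match threshold before an account is flagged.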
Read the original article: Unintended Risks Of Apple Child Protection Features | Avast