The tech giant announced a new policy last week that uses technology to detect potential child abuse imagery in iCloud and Messages. After people voiced concerns over the new measures, Apple published a FAQ page in recent days explaining how the technology works and what it means for user privacy, The Verge reports.
Apple said the technology is strictly limited to detecting child sexual abuse material (CSAM) and cannot be repurposed as a surveillance tool.