In August 2021, Apple announced that it had developed a system to detect child sexual abuse material (CSAM) on the devices of its users in the USA. The system aimed to identify known CSAM content in iCloud Photos by matching digital signatures. However, the technology was shelved due to concerns that it could be misused for government surveillance.
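For context, the approach described above rests on comparing image signatures against a database of already-known material rather than analyzing photos for new content. The sketch below only illustrates that general lookup pattern; the file path, the signature set, and the use of an ordinary cryptographic hash are illustrative assumptions, since Apple's proposal relied on a proprietary perceptual hash (NeuralHash) and a matching protocol that is not reproduced here.

```python
# Minimal sketch of signature matching against a database of known content.
# Illustrative only: a real perceptual hash would match visually similar images,
# whereas the cryptographic hash used here only matches byte-identical files.
import hashlib
from pathlib import Path


def file_signature(path: Path) -> str:
    """Return a hex digest serving as the file's signature (stand-in for a perceptual hash)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def matches_known_database(path: Path, known_signatures: set[str]) -> bool:
    """Check whether the file's signature appears in the set of known signatures."""
    return file_signature(path) in known_signatures


if __name__ == "__main__":
    # Hypothetical usage: in practice the signature set would be supplied by a
    # clearinghouse database, not hard-coded.
    known_signatures = {"e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"}
    sample = Path("photo.jpg")  # placeholder path
    if sample.exists():
        print(matches_known_database(sample, known_signatures))
```

The design choice at issue in the article is where this comparison happens: performing it on the user's device or in iCloud is what critics argued could be repurposed for broader surveillance.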
According to the New York Times, Apple's failure to implement the system over the past three years, citing privacy concerns, has drawn outrage from victims. The lawsuit was filed by a 27-year-old woman who said she was abused by a relative when she was an infant and that images of the abuse were shared online. She stated that she still receives notifications from law enforcement almost every day about new criminal proceedings against people who possess these images.
The plaintiff's attorney, James Marsh, said that potentially 2,680 victims could seek compensation in this case. In its statement, Apple said it continues to fight such crimes without compromising the privacy and security of its users.
This isn’t the first lawsuit filed against Apple. In August, a 9-year-old girl and her guardian filed a lawsuit against the company on similar grounds.
This issue is critical because it represents the delicate balance between preventing child abuse and protecting user privacy. Victims argue that by not implementing the system, Apple failed to prevent the spread of abusive content. Privacy advocates, however, worry that such a system could be misused for government surveillance.
Source link: https://webrazzi.com/2024/12/09/applea-cocuk-istismarini-onleme-sistemini-uygulamadigi-gerekcesiyle-dava-acildi/