BBC:

Apple has announced details of a system to find child sexual abuse material (CSAM) on customers' devices.

Before an image is stored onto iCloud Photos, the technology will search for matches of already known CSAM.

Apple said that if a match is found, a human reviewer will then assess and report the user to law enforcement.

However, there are privacy concerns that the technology could be expanded to scan phones for prohibited content or even political speech.

"There are privacy concerns that the technology could be expanded"?? Without expanding any technology, this system as it stands declares that "your" phone now vets each image you "own" and notifies state authorities if you attempt to store a photo that is on the list of corporate-forbidden images.

BBC:

A Turkish social media influencer says she's being prosecuted in her country for posting "joke" photos inside the world-famous Sex Museum in Amsterdam.

Merve Taskin, 23, shared pictures of sex toys she bought at the museum during a birthday trip to the Netherlands in January last year.

A few months later she says she was arrested in Turkey, where sharing obscene content is considered a crime.
