this post was submitted on 20 Oct 2023

Privacy


"A company which enables its clients to search a database of billions of images scraped from the internet for matches to a particular face has won an appeal against the UK's privacy watchdog.

Last year, Clearview AI was fined more than £7.5m by the Information Commissioner's Office (ICO) for unlawfully storing facial images."

Privacy International (which, I believe, helped bring the original case) responded to this on Mastodon:

"The first 33 pages of the judgment explain with great detail and clarity why Clearview falls squarely within the bounds of GDPR. Clearview's activities are entirely "related to the monitoring of behaviour" of UK data subjects.

In essence, what Clearview does is large-scale processing of a highly intrusive nature. On that, the Tribunal agreed.

BUT in the last 2 pages the Tribunal tells us that because Clearview only sells to foreign governments, it doesn't fall under UK GDPR jurisdiction.

So Clearview would have been subject to GDPR if it sold its services to UK police or government authorities or commercial entities, but because it doesn't, it can do whatever the hell it wants with UK people's data - this is at best puzzling, at worst nonsensical."

[–] [email protected] -1 points 1 year ago

Indeed: spend enough time and effort and anybody can be deanonymized and fully documented. The point is that privacy-conscious individuals should make it as difficult to automate as possible.

Clearview - and to a large extent all the other corporate surveillance players - go primarily for the low-hanging fruit: people who post selfies with their names attached or don't strip the EXIF data, tagged group photos and the like. Bots can easily scrape those. If you go out of your way either not to provide that data in the first place, or to pollute the well with fake photos and/or fake names, you make it much harder for big data to exploit you.
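To illustrate the EXIF point: in a JPEG file, metadata such as camera model and GPS coordinates lives in an APP1 marker segment tagged "Exif", which can be dropped without touching the image data. Below is a minimal, illustrative sketch (my own, not anything from the thread) that walks the JPEG marker structure and removes those segments; for real use you'd want a maintained tool like exiftool or a library such as Pillow.

```python
def strip_exif(jpeg: bytes) -> bytes:
    """Remove EXIF (APP1) segments from a JPEG byte stream.

    Illustrative sketch based on JPEG's marker layout:
    the file starts with SOI (FF D8), followed by segments of the
    form FF <marker> <2-byte big-endian length> <payload>, until
    SOS (FF DA) introduces the entropy-coded image data.
    """
    if jpeg[:2] != b"\xff\xd8":
        raise ValueError("not a JPEG (missing SOI marker)")
    out = bytearray(b"\xff\xd8")
    i = 2
    while i + 4 <= len(jpeg):
        if jpeg[i] != 0xFF:
            break  # malformed stream; copy the remainder untouched
        marker = jpeg[i + 1]
        if marker == 0xDA:  # SOS: image data follows, copy the rest
            out += jpeg[i:]
            return bytes(out)
        length = int.from_bytes(jpeg[i + 2:i + 4], "big")
        segment = jpeg[i:i + 2 + length]
        # APP1 (0xE1) with an "Exif" payload is the metadata block;
        # keep every other segment (JFIF header, quant tables, ...)
        if not (marker == 0xE1 and segment[4:8] == b"Exif"):
            out += segment
        i += 2 + length
    out += jpeg[i:]
    return bytes(out)
```

Note that metadata can also hide elsewhere (XMP, IPTC, thumbnails), which is exactly why dedicated tools are preferable to a hand-rolled filter like this one.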

It's still possible, just less likely unless you're a high value target - and realistically, most people aren't.