this post was submitted on 25 Sep 2023
222 points (95.5% liked)


A viral TikTok account is doxing ordinary and otherwise anonymous people on the internet using off-the-shelf facial recognition technology, creating content and growing a following by taking advantage of a fundamental new truth: privacy is now essentially dead in public spaces.

The 90,000-follower account typically picks targets who appeared in other viral videos, or people suggested to the account in the comments. Many of the account’s videos show the process: screenshotting the video of the target, cropping images of the face, running those photos through facial recognition software, and then revealing the person’s full name, social media profile, and sometimes employer to the millions of people who have liked the videos.

There’s an entire branch of content on TikTok in which creators show off their OSINT doxing skills—OSINT being open source intelligence, or information that is openly available online. But the vast majority of them do it with the explicit consent of the target. This account does the same thing, only without the consent of the people it chooses to dox.

As a bizarre aside, the account appears to be run by a Taylor Swift fan: many of the doxing videos feature Swift’s music, and several show people at the Eras Tour.
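The recognition step shown in those videos (crop a face, then search for it) ultimately reduces to comparing numeric face embeddings produced by a neural network. A minimal sketch of just that comparison, using made-up random vectors in place of real model outputs; the function names and threshold are illustrative assumptions, not any specific tool's API:

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def is_same_person(emb_a: np.ndarray, emb_b: np.ndarray, threshold: float = 0.8) -> bool:
    """Declare a match when similarity exceeds a tuned threshold."""
    return cosine_similarity(emb_a, emb_b) >= threshold

# Hypothetical embeddings: in a real pipeline these come from a
# face-embedding model applied to the cropped face images.
rng = np.random.default_rng(0)
person = rng.normal(size=128)
same = person + rng.normal(scale=0.05, size=128)  # slightly perturbed view of the same face
other = rng.normal(size=128)                      # an unrelated face

print(is_same_person(person, same))   # near-duplicate embeddings match
print(is_same_person(person, other))  # unrelated embeddings do not
```

Real systems differ mainly in how the embeddings are produced and how many of them are indexed; the comparison itself is this simple, which is part of why the technique is so accessible.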

404 Media is not naming the account because TikTok has decided to not remove it from the platform. TikTok told me the account does not violate its policies; one social media policy expert I spoke to said TikTok should reevaluate that position.

The TikTok account, conversations with victims, and TikTok’s own lack of action on the account show that access to facial recognition technology, combined with a cultural belief that anything public is fair game to exploit for clout, now means that all it takes is one random person on the internet to target you and lead a crowd in your direction.

One target told me he felt violated after the TikTok account using facial recognition tech targeted him. Another said they initially felt flattered before “that promptly gave way to worry.” All of the victims I spoke to echoed one general point—this behavior showed them just how exposed we all potentially are simply by existing in public.

...

all 47 comments
[–] [email protected] 61 points 1 year ago (1 children)

This has, disappointingly, always been the inevitable truth. The same technology that propels us also further binds us. Today, it’s facial recognition. Well before that, our phone and web data, license plate scanners, employee badges, credit card usage could reasonably track and expose us.

The tech itself isn’t harmful, it’s how it’s used that causes harm. People need and absolutely deserve protection - perhaps through legislation. The unfortunate thing is that responding to hysteria only handicaps technology we don’t yet fully understand.

[–] [email protected] 17 points 1 year ago* (last edited 1 year ago) (2 children)

technology that propels us also further

How does that apply to facial recognition tech? Its sole purpose is surveillance. Yeah, and the negligible added convenience of unlocking your phone without touching it. I guess that justifies it for many.

[–] [email protected] 16 points 1 year ago (1 children)

Well, it is nice to automatically label who's who in my photo collection...

[–] [email protected] 6 points 1 year ago

Or for legitimate surveillance reasons, like telling me if someone is on my property who is unregistered or who I've told it to alert me about.

[–] [email protected] -4 points 1 year ago (1 children)

Yeah, and the negligible added convenience of unlocking your phone without touching it. I guess that justifies it for many.

I've only ever had Android phones, mostly the Google-branded ones because they were cheap, no-frills Android. It has always struck me as bizarre that people would use their face to unlock their device. I know it's gotten better, but it seems far easier to spoof than even fingerprints. I've seen several TV episodes where exactly that happens, so it's not even an obscure fear.

[–] [email protected] 17 points 1 year ago

Ignoring any spoofing concerns: in the US, biometrics aren't universally covered under the Fifth Amendment, so police can force you to unlock devices secured with them, unlike devices secured with a password.

https://www.concordlawschool.edu/blog/constitutional-law/fifth-amendment-biometrics/

If you have evidence of illegal things on your devices this isn't something you'll want. If you don't have any evidence on your devices this is something you still shouldn't want.

[–] [email protected] 28 points 1 year ago (4 children)

There are open source facial recognition tools?!

[–] [email protected] 36 points 1 year ago (1 children)

And remember: today is the worst that AI will ever be.

[–] [email protected] 7 points 1 year ago (2 children)

Least-capable, maybe. I don't know about worst.

[–] [email protected] 3 points 1 year ago (1 children)

Compared to the future? Worst = least capable

[–] [email protected] 4 points 1 year ago

I guess they were hinting at an AI dystopia which would be "the worst".

[–] [email protected] 1 points 1 year ago

Assuming they're being sarcastic

[–] [email protected] 6 points 1 year ago

If such tools are going to exist, I suppose I'd rather it be in everyone's hands not just the state and the powerful.

[–] [email protected] 4 points 1 year ago

Amassing a huge dataset to search through, with all the metadata (usernames, names, etc.), is the part an individual would probably have trouble with, not the actual "is this a photo of the person in this other photo" part.
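To illustrate the point: once labelled embeddings exist, the lookup is just a nearest-neighbour search over them; the hard part this comment describes is assembling the labelled data at scale. A toy sketch, where the database entries, names, URLs, and distance threshold are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical database: face embedding alongside metadata scraped with it.
database = [
    {"name": "user_a", "profile": "example.com/a", "emb": rng.normal(size=128)},
    {"name": "user_b", "profile": "example.com/b", "emb": rng.normal(size=128)},
    {"name": "user_c", "profile": "example.com/c", "emb": rng.normal(size=128)},
]

def lookup(query: np.ndarray, db: list, max_dist: float = 0.7):
    """Return the metadata of the closest stored embedding, if close enough."""
    best = min(db, key=lambda row: np.linalg.norm(row["emb"] - query))
    if np.linalg.norm(best["emb"] - query) <= max_dist:
        return best
    return None  # no sufficiently close face in the database

# A "new photo" of user_b: their stored embedding plus a little noise.
query = database[1]["emb"] + rng.normal(scale=0.01, size=128)
hit = lookup(query, database)
print(hit["name"] if hit else "no match")
```

Brute-force search like this is fine for toy sizes; indexing millions of faces would use an approximate nearest-neighbour structure, but the principle is the same.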

[–] [email protected] 2 points 1 year ago

You can run your own neural nets and the like now, especially with multiple GPUs and parallel programming.

Scary potential really!

[–] [email protected] 23 points 1 year ago* (last edited 1 year ago)

Welcome to the future [of shit]!

[–] [email protected] 15 points 1 year ago (2 children)

Just delete your social media accounts and use some form of adblocker/tracker blocker, and voilà, privacy restored to some degree

[–] [email protected] 28 points 1 year ago (4 children)

This only sorta works for today, and only if your friends never share images or videos online. With the ever-increasing number of people taking pictures, filming, and posting online, the day is quickly approaching when you could be identified and tracked through other people's content, security and surveillance cameras, etc.

If stores start adopting the kind of tracking used at Walmart, or Amazon's biometric systems, social media will be the least of your worries.

[–] [email protected] 18 points 1 year ago (1 children)

It doesn't even have to be your friends. It could just be you walking by in the background of a photo someone else took.

[–] [email protected] 7 points 1 year ago (1 children)

If you don't have a social media profile for them to cross-reference that background pic with, they can't dox you. Maybe they could link it to your driver's licence, but I doubt they have open access to that sort of database.

[–] [email protected] 1 points 1 year ago

The problem is people you know may still have social media and they can tag you or include your info in the descriptions, even if you don't have a profile. Companies collecting this info from social media can totally build a shadow profile if they want, especially if they've got like 5 photos that have a matching face and the description has the same tag or name in it.

Regarding the DMV thing, it looks like they don't sell the photos but do sell other data that may be useful in cross-referencing things: https://www.vice.com/en/article/43kxzq/dmvs-selling-data-private-investigators-making-millions-of-dollars

[–] [email protected] -3 points 1 year ago

It’s a matter of difficulty. By deleting all this you make it MUCH harder to get accurate info.

[–] [email protected] 14 points 1 year ago

Use your real name and photo online, they said. What could go wrong, they said.

[–] [email protected] 9 points 1 year ago* (last edited 1 year ago) (2 children)

“I don’t immediately see anything illegal since folks are captured in public”

doxing

._.

[–] [email protected] 3 points 1 year ago (2 children)

I don't think doxxing is generally illegal unless A) it is used for harassment or stalking, or B) it is done with illegally obtained material.

Not a lawyer though, and I might not be up on the latest laws.

And of course it could vary by jurisdiction.

[–] [email protected] 8 points 1 year ago

Where I am, doxxing is illegal. What this person in the article has done can get them arrested.

[–] [email protected] -1 points 1 year ago

Avatar checks out

[–] [email protected] 2 points 1 year ago (1 children)

I mean, you buy a ticket from Ticketmaster, which probably sells your data to third parties, then you show up at the show with a ticket probably bought with a credit or debit card. All the companies know exactly who you are and where you are at all times; this has been true for at least the last decade. Your cell carrier knows exactly where you are just from triangulation between network sectors. If you're in public, you're being tracked heavily. This is worrisome, but it's nothing new. It's just very visual, so it gets the point across better.

[–] [email protected] 6 points 1 year ago

I agree, but corporations have one motive: profit. Individuals on social media, meanwhile, have unknown motives that can be dangerous in many other ways.