[–] [email protected] 2 points 10 months ago (1 children)

Where does it say they have access to PII?

So technically they haven't sold any PII if all they do is provide IP addresses. Legally, an IP address is not PII. Google already knows all our IP addresses if we have an account with them or interact with them in certain ways. Sure, some people aren't trackable, but I'm just going to call it out: for all intents and purposes, basically everyone is tracked by Google.

Only the most security-paranoid individuals would be anonymous.

[–] [email protected] 4 points 10 months ago (1 children)

Depends where and how it's applied.
Under GDPR, IP addresses are essential to the operation and security of websites, so logging/processing them can be justified without requiring consent (just disclosure).
Under CCPA, it seems like an IP address isn't PII if it can't be linked to a person/household.

However, an IP address isn't needed as part of AI training data, and alongside comment/post data it could potentially identify a person/household. So it seems risky under both GDPR and CCPA.

I think Reddit would be risking huge legal exposure if they included IP addresses in the data set.
And I don't think Google would accept a data set that includes information like that, due to the same legal exposure.

[–] [email protected] 2 points 10 months ago (1 children)

ML can be applied in a great number of ways. One such way could be content moderation, especially detecting people who use alternate accounts to reply to their own content or manipulate votes, etc.

By including IP addresses with the comments, they could correlate who said what where, and better learn how to detect similar posting styles despite deliberate attempts to appear to be someone else.
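
Rough sketch of what I mean by correlation (made-up field names and records, just to illustrate the idea, not anything Reddit has described): group comments by thread + IP and flag any pair that shows up under more than one account.

```python
from collections import defaultdict

# Made-up records purely for illustration; not Reddit's real schema.
comments = [
    {"thread": "t1", "user": "alice", "ip": "203.0.113.7"},
    {"thread": "t1", "user": "totally_not_alice", "ip": "203.0.113.7"},
    {"thread": "t1", "user": "bob", "ip": "198.51.100.4"},
]

# Group account names by (thread, IP) pair.
accounts_by_thread_ip = defaultdict(set)
for c in comments:
    accounts_by_thread_ip[(c["thread"], c["ip"])].add(c["user"])

# Any (thread, IP) seen under more than one account is a candidate for review.
suspicious = {key: users for key, users in accounts_by_thread_ip.items() if len(users) > 1}
print(suspicious)  # {('t1', '203.0.113.7'): {'alice', 'totally_not_alice'}}
```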

It's a legitimate use case. Not sure about the legality... but I doubt Google or Reddit would ever acknowledge what data is included unless they believed liability was minimal. So far they haven't acknowledged anything beyond the deal existing, afaik.

[–] [email protected] 1 points 10 months ago

Yeah, but it's such a grey area.
If the result was for security only, it could potentially pass as "essential" processing.
But considering the scope of content posted on Reddit (under-18s, medical details, even criminal content), it becomes significantly harder to justify processing that data alongside PII (or equivalent).
Especially since it's a change to the terms of service agreement (passing data to 3rd-party processors).

If security moderation is what they want in exchange for the data (and money), it's more likely that Reddit would include one-way anonymised PII (i.e. IP addresses that are hashed), so only Reddit can recover/confirm IP addresses against the model.
Because if they aren't... then they (and Google) are gonna get FUCKED in EU courts.
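
Something like this, roughly (made-up key and field names, just to illustrate keyed one-way hashing): only whoever holds the secret key, Reddit in this scenario, can re-compute a hash to confirm whether a given IP matches what's in the data set.

```python
import hmac
import hashlib

# Assumed secret that never leaves Reddit's side; the name and value are made up.
SECRET_KEY = b"server-side-secret"

def hash_ip(ip: str) -> str:
    """Keyed one-way hash: without SECRET_KEY the hash can't be reversed or verified."""
    return hmac.new(SECRET_KEY, ip.encode(), hashlib.sha256).hexdigest()

stored_hash = hash_ip("203.0.113.7")           # what would go into the shared data set
print(hash_ip("203.0.113.7") == stored_hash)   # True - the key holder can confirm a match
print(hash_ip("198.51.100.4") == stored_hash)  # False
```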