this post was submitted on 16 Oct 2023
95 points (91.3% liked)

Technology


Deepfake Porn Is Out of Control

New research shows the number of deepfake videos is skyrocketing—and the world's biggest search engines are funneling clicks to dozens of sites dedicated to the nonconsensual fakes.

[–] [email protected] 109 points 1 year ago* (last edited 1 year ago) (1 children)

From my perspective, deepfakes will lead to a short but massive spike in harassment until everyone is aware of the technology and its capabilities. Once the technology reaches the mainstream and everyone can generate such content with ease, people will simply stop caring. If these videos are everywhere, it's easy to dismiss any of them as fake. That might even help victims of actual revenge porn. Virtual nudity will become less of a big deal, probably even in real life.

The bigger issue with deepfakes, in my view, is news. We already have a huge problem with lies on social media, and even on TV and in newspapers, and once we can no longer trust what we see, it will be incredibly hard to build trust in any source.

Fake videos of politicians will be spread to harm their credibility; fake videos of war crimes, to justify an attack. Or vice versa: if there's an authentic video of a crime, the offenders will simply deny its authenticity. But in contrast to Trump's "fake news" claims today, it will be more or less impossible for ordinary people to fact-check anything.

[–] [email protected] 20 points 1 year ago (1 children)

Although not related to porn, a lot of scam operations based in India already use this as a defense. It's extremely hard to get someone in that line of work in trouble, because you need evidence for a raid, and it can't be video or audio: they claim the medium in question is a deepfake.