[–] [email protected] 9 points 8 months ago (2 children)

This is what people fundamentally don't understand about intelligence, artificial or otherwise. People feel like their intelligence is 100% "theirs". While I would certainly argue that a person owns their intelligence, it didn't spawn from nothing.

You're standing on the shoulders of everyone who came before you. Take a prehistoric man, or an alien who hasn't had any of the same experiences you've had: they won't be able to function in this world. It's not because they're any dumber than you; it's because you absorbed the hive mind of the society you live in. Meanwhile, everyone's racing to slap their brand on things and copyright them to get ahead and carve out their space.

"No you can't tell that story, It's mine." "That art is so derivative."

But copyright was only ever meant to protect something for a short period so it could be monetized; it's a way of adapting the value of knowledge to a capital market. Our world can't grow if all knowledge is owned forever and can't even be drawn on when THINKING about new ideas.

ANY VERSION OF INTELLIGENCE YOU WOULD WANT TO INTERACT WITH MUST CONSUME OUR KNOWLEDGE AND PRODUCE TRANSFORMATIONS OF IT.

That's all you do.

Imagine how useless someone would be who'd never interacted with anything copyrighted, patented, or trademarked.

[–] [email protected] 1 points 8 months ago (1 children)

Yes, so how come all these arguments were not popular before the current hype about text generators?

Have some integrity.

[–] [email protected] 1 points 8 months ago (1 children)

They absolutely were, the entire time. You just didn't have any interest in hearing about them and weren't engaged with the topic.

Learn what integrity means if you want to use it as a snarky one-liner.

Have some common sense.

[–] [email protected] 0 points 8 months ago

They absolutely were, the entire time. You just didn't have any interest in hearing about them and weren't engaged with the topic.

Why express your opinion on subjects where it's not worth anything?

You are saying these mutated cryptobros cared about copyright and patent laws being obsolete and harmful before "AI"?

Learn what integrity means if you want to use it as a snarky one-liner.

I know what every word I use means.

[–] [email protected] 0 points 8 months ago* (last edited 8 months ago) (1 children)

That's not a very agreeable take. Just get rid of patents and copyrights altogether and your point dissolves into nothing. The core difference is that derivative works by humans can respect the right to privacy of the original creators.

Deep learning bullshit software, however, will just regurgitate creators' content, sometimes unrecognizably, but sometimes outright stealing their likeness or individual style to create content that may be associated with the original creators.

What you are in effect doing is likening learning from the ideas of others to a deep learning "AI" using people's images to create revenge porn, to give a drastic example.

[–] [email protected] 2 points 8 months ago* (last edited 8 months ago)

Yes. Your last sentence is my point exactly. LLMs haven't replicated everything about the human brain, but the hype is here because they crack one of our brain's key features: how it learns. Your brain isn't magic. It just records training data until it has enough to mash together into different things.
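To make that "record and mash together" idea concrete, here's a toy sketch in plain Python (standard library only, my own hypothetical example, and nothing like a real LLM's architecture or scale): it just records which word follows which in some training text, then recombines those observations into a "new" sentence.

```python
import random
from collections import defaultdict

# Toy illustration only: "learning" here is just recording which word tends
# to follow which in the training data. Real LLMs learn statistical patterns
# at vastly larger scale, but the spirit is similar: absorb examples, then
# produce remixes of what was absorbed.

training_text = "the cat sat on the mat the dog sat on the rug"

# Record the training data as word -> possible next words.
follows = defaultdict(list)
words = training_text.split()
for current_word, next_word in zip(words, words[1:]):
    follows[current_word].append(next_word)

# "Mash it together": generate a new sequence by sampling from what was seen.
word = "the"
output = [word]
for _ in range(8):
    if word not in follows:
        break
    word = random.choice(follows[word])
    output.append(word)

print(" ".join(output))  # e.g. "the dog sat on the mat the cat sat"
```

Nothing in the output was invented from scratch; every transition came from the training text, yet the sentence as a whole may never have appeared in it.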

A child doesn't respect copyright; they'll happily draw a picture of Mario. You probably would too if I asked you to. Respecting copyright is something we learn to do in specific situations: we learn to remix what we've absorbed until it no longer looks copied, and we call that "coming up with an original idea". But that's bullshit. There are no original ideas.

If you come up with a product that's a cold brew cup that refrigerates its contents, I'd say that's a very original idea. But you didn't come up with refrigeration, or cups, or cold brew, or the idea of putting technology in a cup, or the concept of a product you sell to people. Name one thing about this idea that you didn't learn somewhere else. You can't, because that's not how people work. A very real part of business, which you'll learn as you bring your new cup to market, is skirting around copyrights and patents. Somebody out there with a heated cup might come after you, for example.

The precise line here is a difficult thing to learn, mostly because it can't be reduced to a concrete rule. AI still has to be used, tested, and developed to learn the nuances, and it will. What baffles me is that my example above describes how every process of invention has worked since the beginning of humanity, yet if an LLM does it, people say, "That's not a real idea. It just took a bunch of stuff it learned and mashed it together." All I hear is, "My brain is 🪄magic✨, I'm special."