this post was submitted on 30 Sep 2023
1093 points (98.8% liked)

Open Source

[–] [email protected] 16 points 1 year ago (2 children)

Using copyrighted material for research is fair use. Any model produced by such research is not itself a derivative work of the training material. If people use it to create infringing works (infringing on the training material or anything else), they can be prosecuted in exactly the same way as if they had created an infringing work with Photoshop or any other program. The same goes for other illegal uses, such as creating harmful depictions of real people.

Accepting any expansion of IP rights, for whatever reason, would in fact be against the ethics of free software.

[–] [email protected] 1 points 1 year ago (1 children)

Using copyrighted material for research is fair use. Any model produced by such research is not itself a derivative work of the training material.

You're conflating AI research and the AI business. Training an AI is not "research" in a general sense, especially in the context of an AI that can be used to create assets for commercial applications.

[–] [email protected] 2 points 1 year ago

It's not possible to research AI without training them.

It's probably also not possible to train a model whose creations cannot be used for commercial applications.

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago) (1 children)

Yet people are suing because it can summarize their works.

[–] [email protected] 4 points 1 year ago (1 children)

That's ridiculous, as even summaries themselves are protected. You can find book summaries all across the web (e.g. Wikipedia).

[–] [email protected] 2 points 1 year ago

I agree, but that doesn't stop people.