this post was submitted on 22 Sep 2023
25 points (66.2% liked)

[–] [email protected] 1 points 1 year ago (2 children)

The amount of training data needed for a model is so huge that you'd have to use only artwork that was preemptively licensed for that purpose. Individually asking artists for permission to use their work would be far too expensive even if they all agreed to let you use their work for free.

[–] [email protected] 4 points 1 year ago

And why is that a problem?

Artists should have control over their work. It makes no difference to me whether it's a big company or a small one stealing my work; I don't want either of them to.

I no longer post anything I create online, because I'd rather nobody see it than have it stolen for AI training.

[–] [email protected] 2 points 1 year ago (1 children)

That is correct, though there could be campaigns to collect art through other means. There are plenty of artists in the open source world who could contribute, and asking individuals to signal boost those calls to action would build momentum. What's more, big corps will always have more monetary resources no matter what; the power of open source is volunteer manpower and passion. Even if that weren't the case, the moral argument against using a person's work to replace them without permission still stands.

Regardless, what this will do, if artists aren't protected, is cause stagnation in the art field. Nobody is going to share their art, their methods, or their ideas freely in a world where doing so lets a massive corp take them without permission and replace them. That kills the open distribution of art and art knowledge. It will become hidden, and new ideas and new art will no longer be available to view.

Allowing people to take work without permission will only ever hurt small artists. Disney will always be able to just "take" whatever art they make.

Also, you're not entirely correct on that. Models made for specific purposes don't actually need the absurd amounts of data that generalist models need. However, in the context of current expectations, yes, you're right about quantity.

[–] [email protected] 1 points 1 year ago (2 children)

Massive corps don't need to use the output of "little artists"; they have their own massive repositories of works they own or license that they can train AIs on.

The small artists won't be able to use those AIs, though. Those AIs will belong to Disney or Getty Images, and if they deign to allow others to use them it'll be through paywalls and filters and onerous licensing terms. The small artists would only be able to use open models freely.

This insistence that AIs be prohibited from learning from otherwise public images is going to be a Pyrrhic victory if it ever comes to pass.

[–] [email protected] 3 points 1 year ago (1 children)

Why do they do it now, then? They do need this: they need absurd amounts of tagged images of varying quality and style. No, their own repositories are nowhere near enough for general models; they require the small artists. And many artists, small or large, will simply refuse to license to Disney anyway.

Allowing them to take from smaller artists does not help the situation either. They simply end up with more data, which they can run through their better-equipped systems faster than anyone else can. That helps the big corps while doing little for us small devs.

On the matter of these "otherwise public images" being what they are trained on: can you not see this destroying that large public repository of information? No new work by people with unique ideas will be made public. Why would they publish it? If they do, Disney and Getty Images can now outcompete them with it. The currently massive resource of images, information, and art in general would become hidden, no longer public. That stagnates art where it is now: only work that people are OK with AI taking will be shared, because it will be taken. We get the same outcome either way, apart from what has already been shared; the only difference is that nobody gets to enjoy the art whose creators don't want it training AI.

[–] [email protected] 2 points 1 year ago (1 children)

It's convenient to be able to use whatever publicly available images you want for training, but it's not necessary. Adobe proved this with their Firefly AI.

[–] [email protected] 2 points 1 year ago* (last edited 1 year ago)

Their text-to-image model is nowhere near the abilities of other tools, and the rest are specialized tools.

It's convenient, yes, but without it these models are much more limited.

Even if it were, my other points, which you've ignored, still stand.

[–] [email protected] 1 points 1 year ago (1 children)

It’ll be a massive victory for artists and a failure for all the sham AI prompt generators.

There’s not a single downside to requiring all material used in training to be licensed.

[–] [email protected] 1 points 1 year ago (1 children)

> There’s not a single downside to requiring all material used in training to be licensed.

It destroys the open source/hobbyist sector. The only AIs that would be available for artists to use would be corporate-controlled, paywalled, and filtered. That's a pretty huge downside.

[–] [email protected] 1 points 1 year ago (1 children)

That’s not my problem

Art is not generated by machines. Nothing of value is lost.

[–] [email protected] 1 points 1 year ago (1 children)

Ah, so you meant "there's not a single downside to me."

[–] [email protected] 0 points 1 year ago (1 children)

Nothing of value is lost. Generative AI does not create anything new.

Requiring licensing is exclusively a benefit to artists.

[–] [email protected] 1 points 1 year ago (1 children)

Nothing of value to you is lost. We already know you don't care about other people, no need to keep repeating that.

[–] [email protected] 1 points 1 year ago (1 children)

AI does not generate anything of value

I care about artists and the protection of their work. Not the AI models or their creators.

[–] [email protected] 2 points 1 year ago (1 children)

There are artists who use AI tools as part of their workflow. You don't care about them.

[–] [email protected] 0 points 1 year ago

And they can allow their art to be used to train AI. It just shouldn’t come at the expense of everyone else who wants to do things the traditional way.

Why should my work be open for anyone to use to train AI? I don't care if it's a hobbyist, an open model, or Google; I don't want any of them using my work to train their models. Artists currently have rights over the commercial use of their work, and I expect the AI debate to go the same way: if work is to be used, it must be with the creator's permission and a written licensing contract. Art could still be shared license-free or under AI-permissive licenses, but it wouldn't be required to be.