Wow, this is going to be interesting on multiple fronts, especially for me.
First, I'm a huge Swiftie - and Taylor is probably not going to take this lightly. Who she's going to target will be the more interesting question. (Shameless plug for [email protected] if you want to join our small community)
Second, as a nerd who has dabbled with generated art - thank you, trolls, for ruining it for all of us. This is just going to beg for regulations that are going to ruin the generative AI world - as if we didn't have enough regulations barreling towards the area over copyright issues already.
Third, as someone who hates Musk - I hope everything focuses on him and the platform formerly known as Twitter.
Awesome.
Is that hatred, or fear, that I hear in this comment?
That's "suppressing theft masquerading as art is awesome" you hear in that comment.
Ah, it was the third option, ignorance.
Oh, I'm not at all ignorant of how horrible generative "art" is, but I appreciate you checking on me.
If it's horrible and it's also "masquerading" as human art, what does that say about human art?
Are you mad at people who can draw or something?
No, I'm just pointing out the common contradiction I see in threads like this, where people argue that AI is both a big threat to "traditional" artists and also that AI is terrible compared to "traditional" artists. It can't really be both.
The use of "horrible" in their comment isn't necessarily about the quality of the art. Judging from context it's probably more about the ethical considerations. So not really a contradiction.
He put quotes around the word "art", which gives me the opposite impression.
I just notice a lot of cheerleaders for this "art" form come from a place of vindictiveness against people with artistic talent, and their positions are rooted more in a desire to see people they view as gatekeepers receive comeuppance than in an honest defense of an ostensible tool.
It totally can. Take the example of fast food. Simultaneously a threat to traditional cooking and terrible.
And yet there's still plenty of traditional restaurants.
Fast food provides a new option. It hasn't destroyed the old. And "terrible" is, once again, in the eye of the beholder - some people like it just fine.
Fast food damages the health of society and impoverishes communities.
Unhealthy things should be forbidden? Even if they were, this is drifting off of the subject of AI art.
Things that are bad for society should be suppressed and things which are good for society should be promoted. That would seem to be the point of a society.
Further, I notice a pattern in your replies of bringing up a metaphor, then rejecting that very metaphor as off topic or irrelevant when it is engaged to its logical conclusion.
Not accusing you of engaging in bad faith or anything, but it smells (sorry, metaphor again) less than fresh.
Should we also have a single wise man to decide which is which? That has been tried before, multiple times.
Well we certainly shouldn't have violence for violence, as is the Rule of Beasts.
Is AI art literally violent, or is this another analogy?
Great, now we just need to establish whether AI art is "bad for society", and if it is then whether the effects of attempting to ban it would be worse for society.
What metaphors did I bring up? You're the one who brought fast food into this. I don't see any other metaphors in play.
That seems fairly evident
You were fine engaging with the fast food metaphor until I pointed out that it, like AI "art", was terrible. Only then did you deride the metaphor as off topic.
Hardly. There wouldn't be much debate about it if it was, would there?
Alright, in future I will try to remember to immediately reject any metaphors you bring into play rather than attempt to engage with them.
Sure there can be. People debate crypto being good and that's roundly recognized as ecocide. People "debate" who counts as people all the time. People can be wrong and loud.
Not saying you have to do that, but if you don't it's rather untoward to bring it up later as though it's a problem.
Ethereum switched to proof-of-stake a year and a half ago, it no longer has a significant environmental impact.
Oh wait, this is an analogy, isn't it?
So you're into the other tech scam too, are ya?
Fancy that.
No, just pointing out who's in the "loud but wrong" camp on that one. If ecological concerns are why you think crypto is bad, well, that's not clear cut any more.
You want to keep going with this analogy you brought up, then?
I'm not the one who claimed they were off topic. I'm the one who was right about generative "art" being a god-damn scam. Easy mistake to make, I suppose.
All analogies eventually fail when you dig into them far enough, by nature of what an analogy is. That is, an analogy is not exactly identical to the thing being analogized. If you want to be able to use analogies but refuse to acknowledge that they eventually lose relevance when you stretch them too far then you're simply not amenable to reason.
And then you go and explicitly beg the very question under debate with an "of course I'm right." No, AI art isn't a "scam," whatever you mean by that.
Oh buddy, come on, you can't actually be misunderstanding how they used "horrible." They're not saying it's bad quality; they're saying it's bad morally.
You realize how a word like that can have ambiguous meanings, yes?
Emphasis mine. The context clues make the intended meaning pretty obvious
Misunderstanding doesn’t make the comment into the type of gotcha you think it is
I just wish my printer could actually print a car. A 200 mm bed is a little small.
Break it down into chunks and assemble it like Lego.
Now you're stealing from LEGO™! 🙀
I have an honest question and would like to hear your (and others, of course) opinion:
I get the anger at the models that exist today. DALL-E, Midjourney, and others were trained on millions of images scraped without consent. That in itself is legally ambiguous, and it will be interesting to see how courts rule on it (who am I kidding, they'll go with the corporations). More importantly, though, some of it (and increasingly more, as the controversy reached the mainstream) was explicitly disallowed by the author from being used as training data. While I don't think stealing is the right term here, it is without question unethical and should not be tolerated. While I don't feel as strongly about this as many others do, maybe because I'm not reliant on earning money from my art, I fully agree that this is scummy and should be outlawed.
What I don't understand is how many people condemn all of generative AI. For me the issue seems to be one of consent and compensation, and ultimately of capitalism.
Would you be okay with generative AI whose training data was vetted to have been acquired consensually?
I don’t have a problem with training on copyrighted content provided 1) a person could access that content and use it as the basis of their own art and 2) the derived work would also not infringe on copyright. In other words, if the training data is available for a person to learn from and if a person could make the same content an AI would and it be allowed, then AI should be allowed to do the same. AI should not (as an example) be allowed to simply reproduce a bit-for-bit copy of its training data (provided it wasn’t something trivial that would not be protected under copyright anyway). The same is true for a person. Now, this leaves some protections in place such as: if a person made content and released it to a private audience which are not permitted to redistribute it, then an AI would only be allowed to train off it if it obtained that content with permission in the first place, just like a person. Obtaining it through a third party would not be allowed as that third party did not have permission to redistribute. This means that an AI should not be allowed to use work unless it at minimum had licence to view the work. I don’t think you should be able to restrict your work from being used as training data beyond disallowing viewing entirely though.
I’m open to arguments against this though. My general concern is copyright already allows for substantial restrictions on how you use a work that seem unfair, such as Microsoft disallowing the use of Windows Home and Pro on headless machines/as servers.
With all this said, I think we need to be ready to support those who lose their jobs from this. Losing your job should never be a game-over scenario (loss of housing, medical care, home loans, potentially car loans, provided you didn't buy something like a mansion or luxury car).
Not if it was used to undercut human artists' livelihoods.
Hypothetical future where everybody gets UBI and/or AI becomes sentient and able to unionize, maybe we look back at this again.
I don't think AI has a soul, but there's no reason it couldn't be given one.
Undercutting artists' livelihoods is definitely a problem that needs to be addressed. I honestly don't think UBI goes far enough, as it's just a bandaid on the festering tumor of capitalism (but that's a discussion for another day). But can't the same be said about numerous other fields? AI can perform many tasks throughout all fields of work. At the moment it is still worse than an expert in most of these, but it's a matter of when, not if, it surpasses that. Engineers, programmers, journalists, accountants - I can't think of any job that is not en route to being streamlined or automated by AI, reducing the need for humans and putting people out of work.
Artists have it worse in the sense that they are often self employed, which makes them more vulnerable to exploitation and poverty. But isn't the problem much larger than that?
This whole debate somewhat reminds me of the Swing Riots. They were often portrayed as anti-technology or backwards, when in actuality the reason for the revolts wasn't that the machines existed, but that they were used to undercut and exploit workers.
I'm not trying to argue that any of what's happening now is good, just to clarify again. The current "AI revolution" is rotten through and through. But AI is (for now; the consciousness question is super interesting, but not all that relevant at the moment) just a tool. It irks me that so much righteous anger is directed at AI instead of at the people using it to exploit others and maximize their profits, and at the system that gives them the power to do so. Capitalists don't care if it's an AI, sweatshop workers overseas, or exploited workers competing for jobs domestically. They'll go with whatever earns them more money. We should be angry at the cause, not the symptoms.
I'm curious what you mean by soul here, if you're using it in a metaphorical sense or the religious sense
There's a difference?
My initial position was that AI art would be exciting when more carefully curated training data is used. ... But after some talking with friends, I think we're living in a world that has minimal respect for copyright already, except when a corporation has a problem with it and wants to bring down the hammer of the law.
It does hurt, and it's easy to be emotional about artists' livelihoods being threatened by AI. They aren't the only laborers threatened by job loss to automation, but this one hurts the most.
So now it's just up to AI and artists to make interesting art with it. And for artists to adapt to this environment that has automated art tools.
omg Franzia haii :3
With how easy it is to run these models by now, the technology is certainly here to stay, and people will need to adapt, for sure. It only really makes sense to discuss AI in the broader context of capitalism, imo.
It doesn't matter. Sophisticated models are open-source and have already been forked and archived beyond all conceivable hope of regulation. There's no going back.
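For a sense of how low the bar already is, here's a minimal sketch of generating an image with a locally downloaded open model via the Hugging Face diffusers library (the checkpoint ID, prompt, and settings are just illustrative examples, not anything specific from this thread):

```python
# Minimal sketch: one open-weights checkpoint, a consumer GPU, about ten lines.
# The model ID and prompt are illustrative; once downloaded, it runs fully offline.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # example open checkpoint
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")  # a single consumer GPU is enough for inference

image = pipe("a watercolor painting of a lighthouse at dusk").images[0]
image.save("output.png")
```

The weights are just files on disk, and they've been mirrored countless times, which is the point about regulation being unable to reach them.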
We’ll just see about that.
Are you going to somehow reach into my personal computer and remove the software and models from it?
Could be. My tines are ever dangling.
Neuromorphic hardware is coming to some future gen phones to allow training custom sophisticated models.
Indeed we'll see... the rest of the iceberg.
One can only hope! Fingers crossed!!!