this post was submitted on 02 Jan 2025
16 points (71.1% liked)

Asklemmy


First and foremost, this is not about AI/ML research, only about its use in generating content that you would potentially consume.

I personally won't mind automated content if/when it reaches the quality of current human-generated content. Some of that is probably achievable in the not-too-distant future, such as audiobook narration (though it is nowhere near human quality right now), or partially automating music/graphics with gen AI, which we have more or less accepted by now. We don't complain about a low-effort minimal or AI-generated thumbnail or stock photo, and we usually don't care about the artistic value of those either. But I'm highly skeptical that anything of a creative or insightful nature could be produced anytime soon, and we have already developed a good filter for slop in our brains just by dwelling on the 'net.

So what do you guys think?

Edit: Originally I asked this question thinking only about the quality aspect, but many responses consider the ethical side as well. Cool :).

We had the derivative-work model for many-to-one intellectual works (such as a DJ playing a collection of tracks by other artists), which had a practical credit and compensation mechanism. With gen AI trained on unethically (and often illegally) sourced data, we don't know what produced what, and there's no practical way to credit or compensate the original authors.

So maybe let me reframe the question: if it were used non-commercially or via some fair-use mechanism, would you still reject content, regardless of quality, because it is AI generated? Or where is the boundary for that?

top 50 comments
[–] [email protected] 28 points 1 week ago (4 children)

Well, I'm a mod of [email protected], so...

The rise of what recently/popularly has been referred to as "AI" is a massive scam/bubble.

[–] [email protected] 22 points 1 week ago (2 children)

There are two core issues I have with AI generated content:

  1. Ownership - All the big players are using proprietary software, weights, models, training methods, and datasets to generate these models. On top of the lack of visibility, they have farmed millions of people's data and content without their knowledge or consent. If it were up to me, all AI research and software would be 100% open source, public access, non-copyright. That includes all theoretical work in scientific publications, all code, all the datasets, the weights, the infrastructure and training methods, absolutely everything.

  2. Lowest common denominator - AI has unleashed the ability for individuals and organizations to produce extremely low effort content at volumes that haven't been seen before. I hate how search results are becoming totally poisoned by AI slop. You just get pages and pages of sites that abuse SEO to become the top search result and are nothing more than click-farms to generate ad revenue. This is a systemic issue that stems from several things, primarily Capitalism, but also the way we cater to powerful corpos that push this sludge onto us.

I have no issue with AI tools that are actually helpful in their context. For instance, animation software that uses AI to help generate intermediate frames from your initial drawings. Screen reader software that uses AI to help sight-impaired folks with more accurate text-to-speech. AI tools that help with code completion, or debugging.

These are all legitimate uses of the technology, but sadly, all of that is being overshadowed by mountains of sludge being shoved on us at every level, because those implementations aren't going to make rich people even richer and aren't going to tempt investors to dump billions more into AI startups and corpo tech. Helping blind people and indie animation studios is boring and low-profit, so in a Capitalist system it gets shoved to the bottom of the stack while the high-margin slop gets pumped down our throats.

[–] [email protected] 5 points 1 week ago

Very well said. I think at the end of the day, the human element is too easy to overlook, and that's a problem. We have one bot, a search engine, keeping an eye open for content. SEO wants to stand out to that bot, so it demands content be created (and in a certain way) so the search engine picks it up... but that takes effort, so we have another bot creating content to get the attention of the first. The thing a person actually wants becomes an afterthought, and dead Internet theory is that much more real.

[–] [email protected] 4 points 1 week ago

About ownership, you didn't mention the risk of mass manipulation by perfectly filtering out any critique of social injustices that the training set had. Gen AI is a better brainwashing tool than corporate mass media.

(The day after the mass murderer CEO got shot, OkCupid (Match Group) let me know that they had deleted the year-old chapter in my profile containing "Fuck the healthcare system - make a better one", without sending me a copy to edit. The assholes have deleted so much of my content. 85% of my multiple-choice question answers deleted without a warning. Back up your online content, people!)

[–] [email protected] 19 points 1 week ago* (last edited 1 week ago) (3 children)

Peel back the veneer of AI and you find the foundation of stolen training data it's built on. They are stealing from the very content creators they aim to replace.

Torrent a movie? You can potentially go to jail. Scrape the entire internet for content and sell it as a shitty LLM or art generator? That's just an innovative AI startup which is doing soooooo much good for humanity.

[–] [email protected] 7 points 1 week ago (2 children)

Exactly, an equitable solution could be to pay royalties to artists that had their work stolen to train these algorithms. That, however, would require any of the generative algorithms to be operating at a profit, which they absolutely are not.

[–] [email protected] 7 points 1 week ago (1 children)

And it would require the LLM owners to admit to stealing that content.

[–] [email protected] 5 points 1 week ago* (last edited 1 week ago) (1 children)

One more thing: if you want to use public data, your AI needs to be open source (not just the software around it; the actual models that do the AI stuff need to be available for anyone to run on their own system), and all the works generated with it should be public domain. The public owns your AI at that point. Personally, if you don't want to pay me, then let me have a stake in the AI my data helped create.

[–] [email protected] 3 points 1 week ago

That's a good point.

[–] [email protected] 2 points 1 week ago (1 children)

In this case it absolutely is.

[–] [email protected] 3 points 1 week ago

Torrent a movie? You can potentially go to jail. ...

Because artists are not billion-dollar Hollywood studios with political lobbies and stubborn, well-paid lawyers, duh.

[–] [email protected] 1 points 1 week ago

Even if they were able to train them without stealing, the threat they pose to our society would be equally problematic.

[–] [email protected] 15 points 1 week ago

I’d rather gouge out my eyes with a rusty spork.

[–] [email protected] 11 points 1 week ago (1 children)

It's just deeply inauthentic. I'd feel tricked if I listened to a song that I enjoyed and found out it was actually a meaningless machine printout.

[–] [email protected] 1 points 1 week ago (2 children)

Is there such an example? So far I haven't come across any remotely good or interesting AI-generated music of any meaningful length. Short clips are kind of good, but it takes a creative composer to arrange them into music.

[–] [email protected] 11 points 1 week ago

As someone who has had her art stolen for usage in an AI output, AI generated images are considered a form of art theft for good reason.

[–] [email protected] 9 points 1 week ago (1 children)

I avoid AI content because it's sort of an intellectual goo. It looks like there were some thoughts behind it, smells like it, and then you notice the distorted letters or certain writing-style patterns. The AI we have currently is not sentient, so if there are no humans in the loop doing quality control, then you end up with an AI telling people to eat rocks while citing The Onion. I lose trust in anything when I spot that a part of it was AI generated - without being explicitly marked as such - for this reason.

Then there's AI's heavy association with corporations/VCs/tech bros, giant waste of electricity, bias in the training data, legality and ethical implications of training AI on data from the entire internet, people losing jobs, companies running sweatshops of people in 3rd world countries to manually classify said data, the list goes on and on

[–] [email protected] 2 points 1 week ago

Yup, whatever is currently called AI is not intelligent. The models do not actually understand the prompts and data points that get fed into them; they merely know the most statistically relevant answer to the question. We may still be able to keep improving the current LLMs, but we will very soon hit a wall that a mathematical model trained only on existing data cannot pass through.
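The "most statistically relevant answer" point can be sketched with a toy bigram model (the training text and names here are invented for illustration; real LLMs use neural networks over huge token corpora, but the core move of predicting the likeliest continuation is similar):

```python
from collections import Counter, defaultdict

# Toy bigram "language model": it only knows which word most often
# follows another in its training text -- no understanding involved.
training_text = (
    "the cat sat on the mat the cat ate the fish "
    "the dog sat on the rug"
).split()

# Count how often each word follows each other word.
follows = defaultdict(Counter)
for prev, nxt in zip(training_text, training_text[1:]):
    follows[prev][nxt] += 1

def most_likely_next(word):
    """Return the statistically most frequent successor of `word`."""
    return follows[word].most_common(1)[0][0]

print(most_likely_next("the"))  # the word that most often follows "the"
```

Scaled up by many orders of magnitude, this is still pattern completion, not comprehension, which is the commenter's point.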

[–] [email protected] 8 points 1 week ago

I think it’s a bad idea in general, currently being produced in unethical ways by people with unethical aims, consistently failing to deliver on a tenth of what was promised and already ruining a lot of stuff despite its frailty.

[–] [email protected] 8 points 1 week ago

I think stable diffusion is cool. 🤷‍♀️

[–] [email protected] 8 points 1 week ago (1 children)

I have no problem with it. I’ve been using it to make images for my website that I would otherwise not be able to afford to pay a graphic designer to make.

I also use it to help me figure out wording to get the right tone to my message. I’ll read a few iterations and then work off of the one that I like best. The AI one is not always better, but it’s great to get quick alternatives for comparison.

[–] [email protected] 2 points 1 week ago (1 children)

What would you say if your work was used in ai and no one would pay you for your work?

[–] [email protected] 1 points 1 week ago (1 children)

Is it really that different from me hiring a graphic designer and asking them to create art for me in a specific style? Even more so if I hire someone from a country with low wages?

[–] [email protected] 1 points 1 week ago (1 children)

If you hire a graphic designer to create something for you, presumably you pay them.

With ai, someone took their creations and trained the ai to create images and didn't pay them.

So yeah there's a difference.

[–] [email protected] 2 points 1 week ago (2 children)

Either way, someone is getting paid to create something.

[–] [email protected] 7 points 1 week ago (1 children)

Define the terms please. AI has existed for decades. What are you focusing on now?

[–] [email protected] 2 points 1 week ago (1 children)

I'm not talking about AI in general here. I know some form of AI has been around for ages, and ML definitely has some field-specific use cases. Here the objective is to discuss how we feel about gen-AI-produced content in contrast to human-made content, potentially pondering the hypothetical scenario where the gen AI infrastructure is used ethically. I hope the notion of generative AI is reasonably clear, but it includes LLMs, image generators (not computer vision), audio generators, and any multimodal combination of these.

[–] [email protected] 4 points 1 week ago

That's a good start, but where do you draw the line? If I use a template, is that AI? What if I am writing a letter based on that template and use a grammar checker to fix the grammar. Is that AI? And then I use the thesaurus to automatically beef up the vocabulary. Is that AI?

In other words, you can't say LLM and think it's a clear proposition. LLMs have been around and used for various things for quite a while, and some of those things don't feel unnatural.

So I'm afraid we still have a definitional problem. And I don't think it is easy to solve. There are so many interesting edge cases.

Let's consider an old one. Weather forecasting. Of course the forecasts are in a sense AI models. Or models, if you don't want to say AI. Doesn't matter. And then that information can be displayed in a table, automatically, on a website. That's a script, not really AI, but hey, you could argue the whole system now counts as AI. So then let's use an LLM to put it in paragraph form, the table is boring. I think Weather.com just did this recently and labeled it "AI forecast", in fact. But is this really an LLM being used in a new way? Is this actually harmful when it's essentially the same general process that we've had for decades? Of course it's benign. But it is LLM, technically...
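The table-to-paragraph step described above can be done with a plain template and no model at all (the field names and sample data below are invented for illustration), which is exactly why the line between "script" and "AI forecast" is blurry:

```python
# Turn a small forecast table into prose with a plain template --
# no model involved, yet the output reads like an "AI forecast" blurb.
# Field names and sample values are invented for illustration.
forecast = [
    {"day": "Monday", "high_c": 4, "low_c": -1, "sky": "cloudy"},
    {"day": "Tuesday", "high_c": 7, "low_c": 2, "sky": "sunny"},
]

def to_paragraph(rows):
    """Join one templated sentence per table row into a paragraph."""
    return " ".join(
        f"{r['day']} will be {r['sky']} with a high of {r['high_c']}°C "
        f"and a low of {r['low_c']}°C."
        for r in rows
    )

print(to_paragraph(forecast))
```

Swap the template for an LLM call and the output barely changes, yet one version is "a script" and the other gets marketed as AI.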

[–] [email protected] 6 points 1 week ago

It's useful in some circumstances, but businesses are pushing for it in way too many areas. Luckily most have seen the light and realized that it is nowhere close to replacing humans. AI can't write a movie that will captivate audiences. (Hell, I get bored with character chats after a few messages.) AI can't animate a movie. It can't make a video game, or build useful programs.

What it can do it does well. Give you a jumping off point, give you different perspectives, allow you to get started - and I think we'll see it used in that area. For text based AI, it's great at something like "Give me 20 prompts" that can help a writer get started - but we all can tell AI generated content pretty quickly, and it gets dull.

So that's what makes me say it'll be useful in the second area, which is AI slop. Meta and them have discovered that there are a ton of gullible people out there who will happily consume AI slop left and right, roll right up to the trough and eat it down. It can't make a full feature length movie, but it can make a blog post on some half baked subject. We'll see a lot more of that.

I'll fight tooth and nail against it replacing jobs, or having full works from just AI out there. If you want to use it personally, go right ahead. I guess what I'm saying is that my moral compass around it is:

  • Generate whatever you like for personal use, who cares
  • For public consumption, AI should be used only to generate the "outline" of the content. If you call it done after that phase, it's slop, and it's immoral to publish it. If you want to take the outline and put your spin on it, and use it to build something new, then absolutely go for it.
[–] [email protected] 4 points 1 week ago

It's a perfect commodity, which means it's going to be worth the least out of anything out there.

[–] [email protected] 4 points 1 week ago* (last edited 1 week ago)

AI content is low-quality slop. That said, sometimes low-quality slop is the best option for what you want, and in that case, it can make sense to use. That slop can also make a useful ingredient for other, better works, so long as it's just a small piece used appropriately.

[–] [email protected] 3 points 1 week ago

It filters and reduces a lot of creativity.

[–] [email protected] 2 points 1 week ago (1 children)

If it is for personal usage, I don't mind and I don't care. If it is just for putting on something like an AI fan site, where somebody created an image of a dragon sitting on top of a castle with knights running around, I don't care; I have no problem with it.

But if it's used in movies and it is taking jobs away from people, then I care. If it's used in music and it is taking jobs away from people, I care. If it's used in art or anything else, and it is taking jobs away from people, then I care.

I don’t want to see computer created stuff. I wanna see what humans come up with. It’s also why in movies I prefer practical effects over special effects.

Companies will always go for the cheapest way to do something, but at some point, we’re not gonna have enough jobs. The company won’t care they’re still making money off of somebody.

When we went from the horse and buggy to the car, the people who made the horse and buggy could take their skills to go build cars, because some of the ideas transferred over.

If we keep giving the jobs to AI, where are people going to go for jobs?

I want to see what people created with their own hands. Not have a person just type some keywords into a computer and have the computer just generate something.

[–] [email protected] 1 points 1 week ago (2 children)

Wouldn't art created for personal use be taking away commissions from artists? I don't see how it's functionally any different; only the scale is changed. If I wanted a very specific picture, I could either generate it myself or get it commissioned. What makes that any different for Hollywood? Either you're paying for the software and someone to generate the content, or you're paying for the artists. What about CGI vs. practical effects? It's all the same argument.

[–] [email protected] 4 points 1 week ago* (last edited 1 week ago)

This runs into the same argument as piracy, though: most of the time people don't actually commission others for personal-use stuff; people tend to only commission things that are less personal and would be shared around. AI just happens to be a convenient option for that one use case.

[–] [email protected] 1 points 1 week ago

You have a good point

[–] [email protected] 2 points 1 week ago

Obvious trash

[–] [email protected] 1 points 1 week ago

Thoughts on AI-Generated Content

AI-generated content is a fascinating and rapidly evolving area that raises important questions about quality, creativity, and the role of technology in our lives. Here are some key points to consider regarding AI-generated content, particularly in the context of consumption:

Quality and Acceptance

  1. Current Capabilities: As you noted, AI has made significant strides in generating content that can sometimes match human quality, especially in areas like audiobooks, music, and graphics. While the technology is improving, there are still limitations, particularly in producing nuanced or deeply creative works.

  2. Consumer Acceptance: People often accept AI-generated content in contexts where the artistic value is less critical—like stock photos or simple graphics. This acceptance suggests that as long as the output meets a certain standard of utility or aesthetic appeal, consumers are willing to overlook the lack of human touch.

Creativity and Insight

  1. Limitations of AI: While AI can generate text, music, and visuals based on patterns learned from existing data, it struggles with true creativity and insight. Genuine creativity often involves emotional depth, personal experience, and cultural context—elements that AI currently cannot replicate.

  2. The Filter of Quality: As you mentioned, the internet has conditioned us to filter through a lot of low-quality content. This experience has heightened our ability to discern quality, making us more critical of automated outputs. The challenge for AI-generated content is to rise above this noise and provide something genuinely valuable.

Future Potential

  1. Collaborative Creation: One promising avenue for AI-generated content is its potential as a tool for human creators rather than a replacement. For instance, writers might use AI to brainstorm ideas or overcome writer's block, while musicians could use it to generate backing tracks or explore new styles.

  2. Evolving Standards: As technology progresses, our standards for AI-generated content may evolve as well. What seems inadequate today might be seen as acceptable or even impressive in the future as both creators and consumers adapt to new capabilities.

Conclusion

In summary, while there are valid concerns about the limitations of AI-generated content—especially regarding creativity and insight—there's also potential for it to enhance human creativity and fill specific niches effectively. As technology continues to advance, it will be interesting to see how our perceptions shift and how we integrate these tools into our creative processes. The key will be maintaining a balance between leveraging AI's capabilities while valuing the unique contributions that human creators bring to the table.

[–] [email protected] 1 points 1 week ago

I think it's pretty cool. A lot of the things people are doing with open weights models are incredible and free for everyone to use.

[–] [email protected] 1 points 1 week ago

I like it as an idea flow starter. I've used it to generate stuff like site profile logos (like my little ghost in baseball cap here) and screen savers. I've used it for minor tasks like coding Excel macros and such.

But would I say it's a major life impactor? I'd have to say that even though it saves a little time here or there... no.
