this post was submitted on 23 Nov 2023
123 points (100.0% liked)

Technology

[–] [email protected] 68 points 11 months ago (4 children)

It is very important for us NOT to support any films, shows, games, or other forms of art and entertainment that try to use AI to replace human creators.

Greed is going to tear apart every part of society unless we stop it.

[–] [email protected] 18 points 11 months ago (1 children)

What if I make my own videos where the actors and/or voices are entirely imaginary? I don't have the resources to hire a videographer, let alone an actor, but I can write a script and use AI (and program/script things).

If I make something cool it'd be sad if no one watched it just because it didn't use real human actors and voices.

[–] [email protected] 17 points 11 months ago (1 children)

As long as you don't clone someone's voice without permission, you haven't replaced any actors, imo. There's a big difference between an indie production using this tech to reach higher than it otherwise could and Disney just not wanting to pay wages.

[–] [email protected] 3 points 11 months ago* (last edited 11 months ago) (1 children)

I would argue that you could still use the voice/face of real people, as long as you add a disclaimer that it isn't really them. Obviously big producers will use AI to cut costs. But the problem doesn't inherently lie in real actors vs. AI. It lies in our capitalistic system not caring about people.

[–] [email protected] 1 points 11 months ago

South Park has some crazy lawsuits coming, if that person has their way.

I legitimately couldn’t tell they didn’t use the real Meghan Markle!

[–] [email protected] 9 points 11 months ago (2 children)

If it's something for profit by some massive multi-billion-dollar company, definitely. That's just pure greed. But some tiny indie game dev or YouTuber using AI voiceovers is quite a bit more acceptable imo.

[–] [email protected] 2 points 11 months ago

Agreed. Butterbee's explanation in this thread puts it much better than I did.

[–] [email protected] 2 points 11 months ago

Is there a specific dollar amount where a creator isn’t allowed to create the way they want anymore?

[–] [email protected] 6 points 11 months ago

Let's let Bruce Willis's family or Val Kilmer decide for themselves whether to work out a deal that lets us see their acting or likeness again in exchange for proper compensation. I'm not going to boycott a production with the proper paperwork in place just because it uses AI to make Fremen eyes blue in a programmatic fashion.

[–] [email protected] 49 points 11 months ago (1 children)

The article seems to miss the point. It’s not that some celebs, like Attenborough, are unhappy while others are fine with it and are participating in its development; it’s that some celebs are in control of how they are being used and represented and some are not. We are entering a period where we all need stronger legal protections to ensure that we remain in control of what makes each of us unique, whether that’s our DNA or a copy of our voice.

[–] [email protected] 4 points 11 months ago (1 children)

DNA is highly likely to be unique, but not guaranteed to be.

This is a terrible idea. No one owns DNA or genes, and we already have problems with shitty companies trying to patent or copyright genes we all already have. It’s bullshit that only benefits those at the top and prevents others from getting there by restricting their rights.

Voices are the same. You can’t complain about an impressionist imitating you just because you don’t like it; that’s childish nonsense. Everything we do is in some way a copy and recreation of what other people have done. AI just automated that process, and people are upset that it’s harder to rent-seek and gatekeep things that never belonged to them in the first place.

Seriously, the future you’re imagining has twins suing each other for the rights to their unique “identity”. It’s dumb as hell.

[–] [email protected] 10 points 11 months ago (2 children)

Sorry, man, but control over your own image, voice, etc. is not dumb. Why should giant corporations be allowed to replicate and use them to make money without your consent? The fact that it can be so easily automated now just makes it worse. I’m not talking about copyright, though, just control and privacy. With regard to DNA I’m saying the exact opposite, and there we probably agree: companies should absolutely not be able to patent or copyright genes. They shouldn’t be able to use them at all without explicit consent.

[–] [email protected] 2 points 11 months ago* (last edited 11 months ago) (1 children)

I agree with you 100% but it’s hard to convert others.

Same with art: AI mimicking your art after looking at all art is no different from a human looking at all art and developing a style that mimics their favorites.

Same with imitation of face and voice.

[–] [email protected] 4 points 11 months ago* (last edited 11 months ago) (2 children)

Except that it is, categorically, different. AI doesn't "learn"; it builds associations between the data it samples. Incorporating data from the source itself is how these algorithms work, and then they reproduce those pieces with permutations applied.

LLMs are easier to explain, so I'll use one as an example. The idea is that if particular words appear in a particular order, that exact ordering is given higher weight in the model through an association between words that follow each other in sequence. When you ask an LLM to "write like Author X", it can do so (partially) by pulling on the weights it generated from that author's works.

This is fundamentally different from how our brains learn and function. We can't hold databases of billions of pieces of information in our heads and compare them all in real time. It's not really comparable at all except as an inaccurate metaphor.

Edit: Too many replies to respond to them all, but our brains don't do linear algebra on matrices with billions of elements. Our brains work in fundamentally different ways. Conflating the two is a gross oversimplification and is incorrect. That was my entire point.
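
(To illustrate what I mean by "weights on word orderings", here is a minimal toy sketch in Python: a simple bigram count model. The tiny corpus and the function names are made up for the example, and real LLMs learn their weights with neural networks rather than raw counts, but the point that orderings seen in the source get reproduced is the same.)

```python
# Toy illustration only: a bigram "model" that gives higher weight to word
# orderings it has seen before. Real LLMs learn weights with neural networks,
# not raw counts; the corpus and names here are made up for the example.
import random
from collections import Counter, defaultdict

def train_bigrams(corpus):
    """Count how often each word follows another across the corpus."""
    weights = defaultdict(Counter)
    for text in corpus:
        words = text.lower().split()
        for prev, nxt in zip(words, words[1:]):
            weights[prev][nxt] += 1  # orderings seen in the source get more weight
    return weights

def generate(weights, start, length=8):
    """Reproduce word orderings by sampling in proportion to those counts."""
    out = [start]
    for _ in range(length):
        followers = weights.get(out[-1])
        if not followers:
            break
        words, counts = zip(*followers.items())
        out.append(random.choices(words, weights=counts, k=1)[0])
    return " ".join(out)

# "Write like Author X": the output can only recombine orderings from the sample.
author_x_sample = [
    "the forest breathes in the morning light",
    "the forest remembers every footstep in the dark",
]
model = train_bigrams(author_x_sample)
print(generate(model, "the"))
```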

[–] [email protected] 3 points 11 months ago (1 children)

You’re complaining about scale and pretending it’s a fundamental difference. It’s not.

You have a severe misunderstanding of how your own brain works, and of why we call them neural nets in the first place, if you think otherwise.

[–] [email protected] 2 points 11 months ago* (last edited 11 months ago) (1 children)

That is also exactly how I see it. Do you think the negative view is due to some primal jealousy? I don’t know how else to describe not liking something/someone because it’s better than you.

Perhaps it comes from a sports mindset, where performance-enhancing drugs are frowned upon.

We don’t get mad at calculators anymore, but we did at one point. There was quite a large movement to ban them in schools. Isn’t this a similar thing on the “creative” side?

[–] [email protected] 3 points 11 months ago

I do think it’s the jealousy, the fear of being replaced, but also the pride of thinking of ourselves as somehow special and important.

We’re not.

We’re dumb fucking monkeys who learned to sometimes not be so dumb, and then a bunch of us forgot we were pretending.

The real lesson in ai is not that they’re getting super complex or sophisticated, but more us realizing the limitations of our own cognition, and hopefully finding ways to extend it.

You’re spot on about calculators. It’s really just that our schools and schoolteachers are unable to evolve, just like with the insistence that cursive is still a needed skill. Hopefully it won’t take a generation or more to update educators’ mindsets toward taking advantage of the tools available instead of shunning them.

[–] [email protected] 1 points 11 months ago

You’re right. It behaves exactly like we do. And yes, it is at a much grander scale.

Is something ethically, legally, or morally wrong with a computer that does what we do, but does it better?

[–] [email protected] 2 points 11 months ago* (last edited 11 months ago) (1 children)

why should giant corporations be allowed to replicate and use it to make money without your consent

Because it wasn’t yours to begin with

Biometrics belong more to humanity than any individual person

Your argument would make it so even facial recognition would be illegal, because they scan and use your facial info without consent

Same with drivers license databases

You can’t say “this particular use of this existing practice bothers me, everyone else needs to change now so I feel better”

Rules on these things need to be consistent, and if they shouldn’t be allowed to use unique information that you consider yours without your consent, you’ve just eliminated advertising, security checkpoints, driver’s license pictures, filming cops, and a million other things, both good and bad, that all rely on using your likeness without your consent.

[–] [email protected] 4 points 11 months ago (1 children)

On that we completely disagree; I would argue those are some of the only things that are intrinsically yours.

[–] [email protected] 2 points 11 months ago* (last edited 11 months ago)

So police can’t keep images of criminals’ faces (recreated and distributed to every cop’s computer in the nation through computer automation, and often AI) without their consent?

And a private company can’t set that up and sell it to cops for profit?

Because I have some terribly bad news for you…

[–] [email protected] 26 points 11 months ago (1 children)

Maybe a cynical outlook, but 'AI' becoming such a big deal is only going to serve as a means of taking out the human element. Why have a person narrate or write your nature show when you can have 'AI' mimic a known quantity?

[–] [email protected] 7 points 11 months ago (3 children)

I don't see how it's a bad thing. It's basically multiplying the supply of better narrators, in that example.

Why have a shitty narrating voice when you can have an award-winning one?

The only open questions are compensation for the originator and the labelling of what's real and what's generated.

But that's a minor issue IMO.

[–] [email protected] 33 points 11 months ago (1 children)

Whether or not you want someone to use your likeness isn't necessarily just a matter of money. You can't just wave dollar bills at any objection and assume everything's going to be okay. Some things are more important than a few bucks.

[–] [email protected] 5 points 11 months ago (1 children)

Sure. I fully agree with you.

But nonetheless, that's how technology works: make something accessible to everyone (at least in digital technology).

Let's compare it to how da Vinci might have thought about the possibility of photocopying the Mona Lisa and bringing the art into every household.

That would make him more famous than he could ever be by simply having one original picture in the Louvre.

I think this example works for any arbitrary skill and digital modelling.

Let's think ahead: a tennis player and his movements are used to train a robot that acts as a tennis teacher for amateurs. That would also benefit the sport in general.

[–] [email protected] 2 points 11 months ago* (last edited 11 months ago) (1 children)

I mean, when it comes to the legal use of a likeness, outside of maybe some fair-use cases, the technology doesn't by itself make its own use legitimate in every case. Someone independently training a model for private use may be harder or impossible to do anything about, but there's definitely precedent for going after someone who profits from your likeness without your consent.

There may be some grounds where the sort of fair use that parody enjoys could apply to AI or the use of AI-derived likenesses, but I wouldn't expect people's rights to their own likeness to evaporate overnight unless copyright goes with them in some broader sense.

The current controversy within SAG over whether to sign even a deal on a per-project basis for scanning actors seems like a pretty good indicator that the standards on this are far from ironed out.

When it comes to training models, I do think it's unrealistic to limit the use of materials that are readily and legitimately available on the internet for free. But straight up using AI to copy a likeness for profit is very different.

[–] [email protected] 1 points 11 months ago

Hmm. I think the discussion around copyright and fair use is the bedrock of this.

You are right. Since we can't even find a solution that works for today's breaches of copyright, it will probably still be problematic in future cases as well.

But I also see the chance to get rid of something along the way.

Since we know what doesn't work, like copyright enforcement and the uphill battle of imposing it, we can truly think outside the box.

I don't want to take sides on particular technologies for now, as I've never truly looked into such a special case, but I could imagine some kind of ownership verification mechanism, probably backed by cryptocurrency or even NFTs.

I don't expect people to pay full compensation for skills (capitalism shows us on YouTube how such an ecosystem forms), but I am confident that the market will nonetheless find some solution. We will get more of everything, which means more trash and more copy-thefted content as well, but on balance the content will be better, and skill will find a way to sustain itself and be unique on its own.

Anyway, I'm drifting off. I see many similarities to piracy discussions here in certain communities. If it can be done, it will be done, and I see no choice but to make the best of it along the way.

[–] [email protected] 24 points 11 months ago (1 children)

It's a bad thing because Attenborough's voice isn't just his voice. He's not lauded for his vocal prowess; it's his knowledge of the subject, and the fact that if he says something, even read from a script, his professional reputation means he would have questioned material that doesn't pass his sniff test.

Whatever people say, it is this reputation that people are exploiting, not his vowel sounds.

[–] [email protected] 5 points 11 months ago

Reputation is an interesting point.

But as I said in another post, that is not an issue with AI itself.

We'll need a way to verify the validity of anything in the near future anyway.

So it's already late to be thinking about that. In the modern world, nothing is valid until proven to be.

[–] [email protected] 2 points 11 months ago (1 children)

I saw a video the other day about how movie culture has shifted to extensively using greenscreens instead of real-world locations, then just editing everything in afterwards, doing all the cuts in the studio, etc. This has obviously altered how big movie productions are made, and I imagine shifting to AI instead of real actors would exacerbate this trend by a lot. To me, big movie productions already feel lifeless and boring (most of the heavy lifting done solely by the reputation of the cast or director). I guess this will get worse. But then, I'm also curious what crazy ideas indie producers come up with.

[–] [email protected] 1 points 11 months ago (1 children)

Indeed. And I am more curious about what's possible than nostalgic for the good old days.

Nobody can argue that a 1950s movie is better than a multi-million-dollar Marvel production.

Even watching movies from the late 90s to 2000s is time travel, and I assume most of the appeal of the "good old movies" is nostalgia.

However, there's nothing in them that can't be done by modern movies.

lifeless and boring

Do you have an example of that?

To me, modern productions put so much effort into side-story/side-character building that things get complex.

Also, nowadays a good movie lasts three hours instead of the good old 90 minutes.

[–] [email protected] 1 points 11 months ago

I strongly disagree with most of what you said. Especially Marvel productions to me seem to be terrible movies, or as I said, lifeless and boring. That was the whole point of my comment, i.e. that more and better technology won't automatically make movies better, but that you need a certain creative element in there (that so far, only humans can come up with). Big productions already lost most of this human, creative touch. Most characters/stories are pretty one-dimensional, acting is boring because actors are mainly chosen by reputation, etc. My last sentence referred to indie productions then combining this new technology with a more creative approach.

[–] [email protected] 8 points 11 months ago

I'd imagine the situation is more dire for those filmmakers and journalists who do narration as a job.

Why would you need them if you can just use the voice of an AI impersonating famous speakers like this?

[–] [email protected] 8 points 11 months ago (1 children)

I see this as a good thing. Nature docs wouldn't be as good without his voice, and I'm sure an AI could be trained to give a very similar set of opinions.

Yeah, it's a bit morbid, and he should definitely get paid for it.

[–] [email protected] 3 points 11 months ago (1 children)

he should definitely get paid for it.

Playing devil's advocate a little bit here: are you saying a person's voice is, or should be, copyrightable? Because it wouldn't be his voice; it's an imitation of his voice, an impression.

I'm just not sure this is an area that copyright law needs to be extended into. I can see a requirement to disclose that it's AI-generated being a good idea, but the idea that the likeness of somebody's voice is proprietary, I think, opens up a much worse can of worms.

[–] [email protected] 3 points 11 months ago (3 children)

The AI is trained on recordings of his voice that they have not secured the rights to, though. You can't simply take any data you find on the street and use it professionally in any field.

An impression is a very different context. You're vastly overestimating the independence of an AI model if you equate it to a human performance or an impersonator.

[–] [email protected] 5 points 11 months ago

You can’t simply use any data you find on the street and use it professionally in any field.

I kind of think you should be able to, though. Copyright laws are already much too strong and outdated for current technology; instead of strengthening them further, I think we need to go back to first principles and consider why we need permission to record and relay what we see and hear.

[–] [email protected] 3 points 11 months ago* (last edited 11 months ago)

The ai is trained on recordings of his voice which they have not secured the rights to though.

What rights are they securing? Copyright prevents distributing copies. It doesn't prevent listening to recordings.