this post was submitted on 29 Jan 2025
44 points (92.3% liked)
Asklemmy
Yes it is, simply due to the nature of the "training"/"learning" process, which is learning in name alone. If you know how this mathematical process works, you know the machine's definition of success is how well its output matches the data it was trained with. The machine is effectively trying to encode its training data into its nodes. I would recommend informing yourself on how the "training" process actually works, down to the mathematical level.
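To make that concrete, here is a toy sketch (a hypothetical one-weight "model" on made-up data, not any real system) of the objective described above: training is just adjusting parameters so the output matches the training data as closely as possible.

```python
import random

# Toy training data: inputs x with targets y = 2x.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]

w = random.uniform(-1.0, 1.0)  # the "model": y_hat = w * x
lr = 0.05                      # learning rate

for _ in range(500):
    # Gradient of the mean squared error between output and training data.
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad

# "Success" here is defined purely as reproducing the training data:
# w converges toward 2.0, the value that best matches it.
print(w)
```

Real models have billions of weights instead of one, but the definition of success is the same: minimize the mismatch between output and training data.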
AI is often pushed by the same people who pushed NFTs and whatnot, so this is somewhat understandable. And yes, AI consumes a lot of energy and water. Maybe not as much as crypto, but still not something we can afford to spend on mindless entertainment during our current climate catastrophe.
Yup. AI "art" works by finding pixel patterns that co-occur with a given token. Due to its nature, it can only repeat patterns it identified in its training data. Now, we have all heard the saying "an image is worth a thousand words". That saying is quite the understatement. To describe an image down to the last detail, in such detail that someone who never saw it could replicate it perfectly, one would need far more than a thousand words, as evidenced by computer image files, which are basically exactly that kind of description. The captions in the training data never describe the whole image in that much detail, and therefore the model is incapable of doing anything too specific.
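Some rough numbers on the "thousand words" point (the figures are assumptions for illustration: ~6 bytes per English word, an uncompressed 512x512 RGB image):

```python
# A "thousand words" of description, at roughly 6 bytes per word
# (about 5 letters plus a space).
words = 1000
bytes_per_word = 6
text_bytes = words * bytes_per_word  # ~6 KB of text

# One uncompressed 512x512 RGB image: 3 bytes per pixel.
width, height, channels = 512, 512, 3
image_bytes = width * height * channels

print(text_bytes)                 # 6000
print(image_bytes)                # 786432
print(image_bytes // text_bytes)  # 131
```

So even a modest image holds over a hundred times more raw data than a thousand words of text, which is why a short caption can never pin down every detail.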
Art is very personal: the more of yourself you put into a piece, the more unique and "soulful" it will be. The more of the work you delegate to the machine, the less of yourself you can put into the piece, and if 100% of the image generation was done by the machine, which is in turn simply calculating an average image that matches the prompt, then nothing of you is in the piece. It is nothing more than the maths that created it.
Simple text descriptions do not give a human meaningful control over the final piece, which is why pretty much any artist worth their title is not using it.
Also, the irony that we are automating the arts, something people enjoy doing, instead of the soul-degrading jobs nobody wants to do, should not be lost on us.
It is true that AI is being used in horrible ways that society will take some time to adapt to; it is simply that the negative uses of AI have more visibility than the positive ones. As a matter of fact, this node-network technology was already in use in many fields before the ChatGPT-induced AI hype train.
Correct. It is well known that those who stand to benefit financially from the success of AI are more than willing to lie about its true capabilities.