this post was submitted on 17 Mar 2024
88 points (91.5% liked)

Technology


There's an extraordinary amount of hype around "AI" right now, perhaps even greater than in past cycles, where we've seen an AI bubble about once per decade. This time, the focus is on generative systems, particularly LLMs and other tools designed to generate plausible outputs that either make people feel like the response is correct, or where the response is sufficient to fill in for domains where correctness doesn't matter.

But we can tell the traditional tech industry (the handful of giant tech companies, along with startups backed by the handful of most powerful venture capital firms) is in the midst of building another "Web3"-style froth bubble because they've again abandoned one of the core values of actual technology-based advancement: reason.

[–] [email protected] 13 points 8 months ago* (last edited 8 months ago) (3 children)

Today's AI is way worse than when ChatGPT was first released... it is way too censored.

But either way, I never considered LLMs to be A.I., even if they have the potential to be great.

[–] [email protected] 25 points 8 months ago* (last edited 8 months ago) (2 children)

Today's AI is way worse than when ChatGPT was first released... it is way too censored.

You might need to reconsider that position. There are plenty of uncensored models available that you can run on your local machine, models that match or beat GPT-3 and beat the everliving shit out of GPT-2 and other older models. Just running them locally would have been unthinkable when GPT-3 was released, let alone on a CPU at reasonable speed. The fact that open-source models do so well on such meager resources is pretty astounding.
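To give a sense of why CPU inference is even feasible: token generation is roughly memory-bandwidth-bound, since each generated token requires streaming the model's weights through RAM once. A back-of-the-envelope sketch (the bandwidth figure is an illustrative assumption, not a benchmark):

```python
def tokens_per_sec_upper_bound(params_billions: float, bits_per_weight: int,
                               mem_bandwidth_gb_s: float) -> float:
    """Naive ceiling: every generated token reads all weights once from RAM."""
    model_gb = params_billions * bits_per_weight / 8  # 1e9 params * bytes/param = GB
    return mem_bandwidth_gb_s / model_gb

# A 7B model at 4-bit quantization on a laptop with ~40 GB/s of memory bandwidth:
print(round(tokens_per_sec_upper_bound(7, 4, 40), 1))  # ~11.4 tokens/sec ceiling
```

Real throughput lands below that ceiling, but it shows why 4-bit quantization (half the bytes of 8-bit) roughly doubles the speed limit on the same machine.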

I agree that it's not AGI, though. There might be some "sparks" of AGI in there (as some researchers have put it), but I don't think there's much evidence of self-awareness yet.

[–] [email protected] 5 points 8 months ago* (last edited 8 months ago) (1 children)

Which one is your favorite? I might buy some hardware soon to be able to run them (right now I only have a laptop, and it's not the greatest, but I'm willing to upgrade).

[–] [email protected] 8 points 8 months ago* (last edited 8 months ago) (1 children)

You might not even need to upgrade. I personally use GPT4All and like it for its simplicity. What are your laptop's specs? There are models that can run on a Raspberry Pi (slowly, of course 😅), so you should be able to find something that'll work with what you've got.

I hate to link the orange site, but this tutorial is comprehensive and educational: https://www.reddit.com/r/LocalLLaMA/comments/16y95hk/a_starter_guide_for_playing_with_your_own_local_ai/

The author recommends KoboldCPP for older machines: https://github.com/LostRuins/koboldcpp/wiki#quick-start

I haven't used that myself because I can run OpenOrca and Mistral 7B models pretty comfortably on my GPU, but it seems like a fine place to start! Nothing stopping you from downloading other models as well, to compare performance. TheBloke on Huggingface is a great resource for finding new models. The Reddit guide will help you figure out which models are most likely to work on your hardware, but if you're not sure of something just ask 😊 Can't guarantee a quick response though, took me five years to respond to a YouTube comment once...
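As a rough rule of thumb for matching models to your hardware, you can estimate a quantized model's file size from its parameter count and quantization level. The helper below is hypothetical and the per-quant numbers are approximations (effective bits per weight, plus ~10% for metadata and mixed-precision layers), but it encodes the general idea:

```python
def largest_quant_that_fits(params_billions, free_ram_gb):
    """Pick the highest-quality quantization whose file roughly fits in RAM.

    Bits-per-weight values are approximations, ordered best quality first.
    Returns None if even the smallest quantization is too big.
    """
    quants = [("Q8_0", 8.5), ("Q5_K_M", 5.7), ("Q4_K_M", 4.8),
              ("Q3_K_M", 3.9), ("Q2_K", 3.4)]
    for name, bits in quants:
        size_gb = params_billions * bits / 8 * 1.1  # ~10% overhead allowance
        if size_gb <= free_ram_gb:
            return name
    return None

print(largest_quant_that_fits(7, 8))  # → Q5_K_M (a Q8_0 file would be ~8.2 GB)
```

Treat it as a starting guess, not gospel; the Reddit guide's tables are the better reference.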

[–] [email protected] 2 points 8 months ago* (last edited 8 months ago) (1 children)

Thanks a lot, man. I will look into it, but I only have an on-board GPU... not a big deal if I need to upgrade (I spend more on hookers and blow weekly).

[–] [email protected] 2 points 8 months ago

It's OK if you don't have a discrete GPU; as long as you have at least 4 GB of RAM, you should be able to run some models.
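Rough numbers behind that claim (illustrative arithmetic only; `overhead_gb` is a guessed allowance for context/KV cache and the runtime itself):

```python
def fits_in_ram(params_billions, bits_per_weight, free_ram_gb, overhead_gb=1.0):
    """Weights plus a rough allowance for context and runtime overhead."""
    weights_gb = params_billions * bits_per_weight / 8
    return weights_gb + overhead_gb <= free_ram_gb

print(fits_in_ram(3, 4, 4))  # 3B model at 4-bit: ~1.5 GB of weights -> True
print(fits_in_ram(7, 4, 4))  # 7B at 4-bit: ~3.5 GB + overhead -> False
```

So on 4 GB you'd be looking at the ~3B-parameter models rather than the 7B ones.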

I can't comment on your other activities, but I guess you could maybe find some efficiencies if you buy the blow in bulk to get wholesale discounts and then pay the hookers in blow. Let's spreadsheet that later.

[–] [email protected] 3 points 8 months ago* (last edited 8 months ago) (2 children)

AGI

It depresses me that we have to find new silly acronyms to mean something we already had acronyms for in the first place, just because we are simply too stupid to use our vocabulary appropriately.

AI is what "AGI" means. Just fucking AI. It has been for more than half a century; it is sensible, and it is logical.

However, in spite of its name, the current technology is not really capable of generating information, so it isn't capable of actual "intelligence". It is pseudo-generation, which it achieves by sequencing and combining input (AKA training) data. So it does not generate new information, but rather new variations of existing information. For that reason, I would prefer the name "Artificial Adaptability" (or "AA", or "A2") to be used in lieu of "AI", or "Artificial Intelligence" (on the grounds that the latter means something else entirely).
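That "new variations of existing information" point can be illustrated (in grossly oversimplified form; real LLMs are far more capable than this toy) with a bigram sampler, which by construction can only ever emit word transitions it has already seen in its training text:

```python
import random
from collections import defaultdict

def train_bigrams(text):
    """Record which words follow which in the training data."""
    words = text.split()
    table = defaultdict(list)
    for a, b in zip(words, words[1:]):
        table[a].append(b)
    return table

def generate(table, start, n, seed=0):
    """Walk the bigram table: every emitted pair was seen during training."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(n):
        followers = table.get(out[-1])
        if not followers:
            break
        out.append(rng.choice(followers))
    return out

corpus = "the cat sat on the mat and the dog sat on the rug"
table = train_bigrams(corpus)
print(" ".join(generate(table, "the", 5)))  # a remix of seen pairs, never a new pair
```

Whether scaled-up next-token prediction still counts as "only recombination" is exactly the point people argue about, but this is the mechanism in miniature.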

Edit: to the people it may concern: stop replying with "Artifishual GeNeRaL intelligence". I know what AGI means. It takes all of 3 seconds to do an internet search, and it isn't even necessary: everyone has known for months. I did not bother to spell it out, because I did not imagine that anyone would be simple enough to take literally the first word starting with "g" from my comment and roll with it in a self-important diatribe about what they imagined I was wrong about. So if you feel the need to project what you imagine I meant, and then correct that, please don't. I'm sad enough already that humanity is failing; I do not need more evidence.

Edit 2: "your opinion only matters if you have published papers". No. That is a really stupid argument from authority. Besides, anyone with enough time on their hands can get papers published; it is not a guarantee of quality, merely proof that you LARPed in academia. The hard part isn't writing, it is thinking. And as I wrote before, I already know this; I need no more proof, thank you.

[–] [email protected] 0 points 8 months ago (1 children)

The fact that you think "AGI" is a new term, or the fact that you think the "G" stands for "Generative" shows how much you know about the field, so maybe you should go read up on literally any of it before you come at me with this attitude and your "due to this fact" pseudo-intellectual bullshit.

The "G" stands for "General", friend. It distinguishes an Artificial Intelligence that is narrow in the scope of its knowledge from intelligences like us that can adapt to new tasks and improve themselves. We do not have Artificial General Intelligence yet, but the ones we have are getting there, and faster than you could possibly imagine.

Tell me, oh Doctor Of Neuropsychology and Computer Science: how do people learn? How do people generate new information?

Actually, no, fuck that, I have a better question: define "intelligence". Let's hear it; I've wanted to act the Picard in a Data trial since I was a kid. Since around the time the term Artificial General Intelligence was coined, in fact: nineteen ninety fucking seven.

[–] [email protected] -5 points 8 months ago

the fact that you think the "G" stands for "Generative"

You've shown your IQ right there. No time to waste with you. Goodbye.

[–] [email protected] -1 points 8 months ago

If you haven't published a few papers then your preference in acronyms is irrelevant.

AI comprises everything from pattern recognition, like OCR and speech recognition, to the complex transformers we know now. All of these are specialized, in that they can only accomplish a single task, such as recognizing graffiti or generating graffiti. AGI, artificial general intelligence, would be flexible enough to do all of those things, and is currently considered the holy grail of AI.

[–] [email protected] 10 points 8 months ago (1 children)

But either way, I never considered LLMs to be A.I. even if they have the possibility to be great.

It doesn't matter what you consider, they are absolutely a form of AI. In both definition and practice.

[–] [email protected] 2 points 8 months ago (1 children)
[–] [email protected] 5 points 8 months ago* (last edited 8 months ago) (1 children)

sorry for all my spelling mistakes... I fixed a few. I think everyone should spee(a)k binary anyways.

[–] [email protected] 2 points 8 months ago

I’m sorry I’m a spelling spaz.