this post was submitted on 09 Feb 2024
461 points (96.0% liked)

Linux


If your IP (and possibly your browser) looks "suspicious" or has been used by other users before, you need to provide additional information to register on gitlab.com, including your mobile phone number and possibly credit card information. Since it is not possible to contribute to or even report issues on open source projects hosted there without doing so, I do not think any open source project should use this service until they change that.

Screenshot: https://i.ibb.co/XsfcfHf/gitlab.png

[–] [email protected] 68 points 7 months ago (7 children)

On a tangent, why are all of these companies pushing AI programming? This shit isn't nearly as functional as they make it seem, and all the beginners who try it are constantly asking why their generated code doesn't work.

[–] [email protected] 64 points 7 months ago (1 children)

We are in the hype cycle so everyone is going bananas and there's money to be made prior to the trough of disillusionment.

[–] [email protected] 5 points 7 months ago* (last edited 7 months ago)

Haha so true.

I tried to use ChatGPT to convert a monstrosity of a SQL query to a SQLAlchemy query, and it failed horribly.
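For anyone unfamiliar with what that kind of conversion involves, here is a minimal sketch (the table and column names are made up for illustration; the commenter's actual query isn't shown). A raw SQL aggregate query and its SQLAlchemy Core equivalent:

```python
import sqlalchemy as sa

# Hypothetical schema, just for the example
metadata = sa.MetaData()
users = sa.Table(
    "users", metadata,
    sa.Column("id", sa.Integer, primary_key=True),
    sa.Column("name", sa.String),
    sa.Column("active", sa.Boolean),
)
orders = sa.Table(
    "orders", metadata,
    sa.Column("id", sa.Integer, primary_key=True),
    sa.Column("user_id", sa.Integer, sa.ForeignKey("users.id")),
    sa.Column("total", sa.Numeric),
)

# Raw SQL being converted:
#   SELECT u.name, SUM(o.total) AS total_spent
#   FROM users u JOIN orders o ON o.user_id = u.id
#   WHERE u.active
#   GROUP BY u.name
#   HAVING SUM(o.total) > 100
stmt = (
    sa.select(users.c.name, sa.func.sum(orders.c.total).label("total_spent"))
    .join_from(users, orders, orders.c.user_id == users.c.id)
    .where(users.c.active)
    .group_by(users.c.name)
    .having(sa.func.sum(orders.c.total) > 100)
)
```

Even this toy case shows why a long hand-written SQL query can be painful to translate mechanically: every join condition, grouping, and aggregate has to be rebuilt against declared table objects.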

[–] [email protected] 42 points 7 months ago* (last edited 7 months ago)

It's their wet dream. Making software without programmers.

Execs have never cared about the technology or the engineering side of it. If you could make software by banging on a pot while dancing naked around the fire, they'd have been ok with that.

And now that AI has come along that's basically what it looks like to them.

[–] [email protected] 27 points 7 months ago

VC's and companies like OpenAI have done a really good job of propagandizing AI (LLMs). People think it's magical and the future, so there's money in saying you have it.

[–] [email protected] 25 points 7 months ago

Because it brings in mad VC funding

[–] [email protected] 17 points 7 months ago* (last edited 7 months ago) (2 children)

the beginners who try it are constantly asking questions about why their generated code doesn’t work

Because it ain't here to generate all their code for them. It's a glorified autocomplete and suggestion engine. When are people gonna get this? (not you, just in general)

I use CoPilot myself, but if you have absolutely no idea what you're doing yourself, you and CoPilot will both quickly hit a dead end together. It doesn't actually understand what you want the code to do. Only what is similar to what you have already written or prompted for, which may be some garbage picked up from a noob on the web somewhere. Books and research using your meatbrain are still very much needed.

[–] [email protected] 7 points 7 months ago

It's not in the techbros' interest to sell the new-age AI shit as something limited that can only do such small things. They need to hype the shit out of it to pull in all the crazy investor money from people who understand nothing about it but see AI buzzwords everywhere and feel they have to go for it now because of FOMO.

It's only gonna get much worse before it is toned down to appropriate usage.

[–] [email protected] 2 points 7 months ago* (last edited 7 months ago)

Don't even need to make it about code. I once asked what a term meant on a certain well-known FOSS application's benchmarks page. It gave me a lot of unrelated garbage because it made an assumption about the term, exactly the assumption I was trying to avoid. I tried to steer it away from that, and it failed to say anything coherent, then looped back and gave that initial attempt as the answer again. I was stuck, unable to stop it from hallucinating.

How? Why?

Basically, it was information you could only find by looking at the GitHub code, and it was pretty straightforward - but the LLM sees "benchmark" and it must therefore make a bajillion assumptions.

Even if asked not to.

I have a conclusion to make. It does the code thing too, and it is directly related. I once asked about a library, and it found a post where someone was ASKING if XYZ was what a piece of code was for - and it gave that out as if it were the answer. It wasn't. And this is the root of the problem:

AI's never say "I don't know".

It must ALWAYS know. It must ALWAYS assume something, anything, because not knowing is a crime and it won't commit it.

And that makes them shit.

[–] [email protected] 13 points 7 months ago* (last edited 7 months ago)

Because greedy investors are gullible and want to make money from the jobs they think AI will displace. They don't know that this shit doesn't work like they've been promised. The C-levels at GitLab want their money (gotta love publicly traded companies), and nobody is listening to the devs who are shouting that AI is great at writing security vulnerabilities or just like, totally nonfunctional code.

[–] [email protected] 1 points 7 months ago

I'm hyped about AI-assisted programming and even agent-driven projects (writing their own code, submitting pull requests, etc.), but I also agree that it seems just too early to actually put money behind it.

It's just so marginal so far: the UI/HMI still has too much friction, and the output without skilled programming assistance is too limited.