this post was submitted on 11 Sep 2023
402 points (94.3% liked)

https://archive.ph/hMZPi

Remember when tech workers dreamed of working for a big company for a few years, before striking out on their own to start their own company that would knock that tech giant over?

Then that dream shrank to: work for a giant for a few years, quit, do a fake startup, get acqui-hired by your old employer, as a complicated way of getting a bonus and a promotion.

Then the dream shrank further: work for a tech giant for your whole life, get free kombucha and massages on Wednesdays.

And now, the dream is over. All that’s left is: work for a tech giant until they fire your ass, like those 12,000 Googlers who got fired six months after a stock buyback that would have paid their salaries for the next 27 years.

We deserve better than this. We can get it.

[–] [email protected] 56 points 1 year ago (1 children)

Imagine first getting replaced by some kid out of a garage, then by Indian code farms, and now by AI developed by the grown-up kids from said garage and trained by Indian code farms.

[–] [email protected] 46 points 1 year ago (5 children)

So tired of this rhetoric. AI isn't replacing any software engineering jobs, nor could it. It's a joke, quite frankly.

[–] [email protected] 36 points 1 year ago (1 children)

They set up a ChatGPT based bot at my work just to help our support agents find information faster. It provides straight up factually false information 80% of the time. A solid 30% of the time, it says the opposite of the truth. It’s completely worthless at all times.

[–] [email protected] 15 points 1 year ago (2 children)

I was listening to a podcast about AI; I think it was one of Ezra Klein's. He was retelling a story he'd heard about those weird virtual reality games from the '90s or early aughts. People shat on those games because they were awful and clunky and not very good, so the shitting was well deserved. But one guy said, "Yeah, that's all true. But this is the worst it's going to be. The next iteration isn't going to be worse than this."

And that's where AI is now. Like, it's powerful and already a threat to certain jobs. GPT 3/4 may be useless to software engineering jobs now (I'd argue that it's not - I work in a related field and I use it about daily) but what about GPT 5? 6? 10?

I'm not as doom and gloom on AI as I was six months ago, but I think it's a bit silly to think that AI isn't going to cause massive upheaval across all industries in the medium to long term.

But also, for the record, I'm less worried about AI than I am about AI in the hands of Capitalism.

[–] [email protected] 1 points 1 year ago

I love how people always argue this point, "oh it sucks and can't replace x". Computer animation sucked once as well, but look at what can be done with it now.

AI sucks in its current state. It will evolve and improve, and put poor little uneducated admin people like myself completely out of work. I'm learning what I can, but I'm not very bright, and neither is my future!

[–] [email protected] 1 points 1 year ago

But also, for the record, I’m less worried about AI than I am about AI in the hands of Capitalism.

Let's just say it, AI in the hands of the 1% who use it to become the 0.001%.

[–] [email protected] 14 points 1 year ago (2 children)

It was impossible for a computer to be smart enough to beat grandmasters at chess, until it wasn't. It was impossible to beat Go Masters at Go, until it wasn't.

No software engineering jobs are getting replaced this year or next year. But considering the rapid pace of AI development, and considering how much code development is just straight up redundant... looking at 20 years from now, it's not so bright.

It would be way better to start putting AI legislation in place this year. That or it's time to start transitioning to UBI.

[–] [email protected] 12 points 1 year ago* (last edited 1 year ago) (3 children)

I am an actual (senior) software engineer, with a background in ML to boot.

I would start to worry if we were anywhere close to even dreaming of how AGI might actually work, but we're not. It's purely in the realm of science fiction. Until you meet the bar of AGI, there's absolutely no risk of software engineering jobs being replaced.

Go or Chess are games with a fixed and simple ruleset and are very suited to what computers are really good at. Software engineering is the art of making the ambiguous and ill-defined into something entirely unambiguous and precisely defined, and that is something we are so far from achieving in computers it's not even funny. ML is ultimately just applied statistics. It's not magic, and it's far from anything we would consider "intelligence".

I do think we need legislation targeting ML, but not because of "omg our jobs". Rather we need legislation to combat huge tech companies vacuuming any and all data on the general public and using that data to manipulate and control the public.

Also, LOL at "how much code development is straight up redundant". If you think development amounts to just writing a bunch of boilerplate as though we were some kind of assembly line putting together the same thing over and over again, you're sorely mistaken.

[–] [email protected] 9 points 1 year ago

I think you overestimate what the average software developer is doing.

Do I think that in 10 years AI will be patching the Linux kernel or optimizing AWS scaling functions? No. Do I think it will be creating functional CRUD apps with Django or Ruby on Rails? Yes, and I think that's what a large number of software developers are doing. Even if it's not a majority, a lot of the more precarious developers without a CS degree will probably lose their jobs. Not every developer is a senior engineer working on ML.
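To make the point concrete: the formulaic CRUD logic in question can be sketched as a minimal in-memory store in plain Python. This stands in for what a Django or Rails app would do against a database; the `Article` resource is hypothetical, and a real framework adds routing, validation, and persistence on top of essentially this shape.

```python
# Minimal in-memory CRUD store: the four operations most
# line-of-business endpoints boil down to.
class CrudStore:
    def __init__(self):
        self._rows = {}
        self._next_id = 1

    def create(self, data):
        row = {"id": self._next_id, **data}
        self._rows[self._next_id] = row
        self._next_id += 1
        return row

    def read(self, row_id):
        return self._rows.get(row_id)

    def update(self, row_id, data):
        if row_id in self._rows:
            self._rows[row_id].update(data)
            return self._rows[row_id]
        return None

    def delete(self, row_id):
        return self._rows.pop(row_id, None) is not None

# Hypothetical usage: an "articles" resource.
articles = CrudStore()
a = articles.create({"title": "Hello"})
articles.update(a["id"], {"title": "Hello, world"})
```

The argument above is precisely that code this mechanical is what generation tools are likely to get good at first.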

[–] [email protected] -1 points 1 year ago

It's purely in the realm of science fiction.

This isn't proof of anything, I would just like to point out that a lot of science fiction has become reality in the last few decades.

Go or Chess are games with a fixed and simple ruleset

At the end of the day, what is a computer except a machine with a fixed and simple ruleset: logic gates.

ambiguous and ill-defined into something entirely unambiguous and precisely defined, and that is something we are so far from achieving in computers it's not even funny

You don't need AI to write you perfect C or JavaScript or HTML. You just need it to create an interface for an end user to make the computer do what they want. I predict the AI itself won't write the languages, it will tend to replace the languages. Many orders of magnitude more computationally expensive, but the hardware is quickly becoming cheaper to buy than paying software engineers.

If you think development amounts to just writing a bunch of boilerplate as though we were some kind of assembly line putting together the same thing over and over again, you're sorely mistaken.

Obviously not, that's why libraries and OOP and frameworks exist, I'm aware, not pretending like I have anything to teach you about it either.

And I'll take the L if you have the insider knowledge that there's a requirement for massive creativity behind the scenes in widespread fundamental overhauls of the way software works. But afaik, the fundamentals of code haven't changed in decades. The way users interact has not changed much since smartphones became standard. I don't see a capitalistic incentive to pay for lots of new creativity, instead of just making usable products.

[–] [email protected] 2 points 1 year ago

It was impossible for computers to beat chess and Go masters when the computers were trying to play like humans: trying to model high-level understanding of strategy and abstract values. The computers started winning when they got fast enough to brute force games, to calculate all of the possible outcomes from all of the possible moves and choose the best one.
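The brute-force approach described above is essentially minimax search. A toy sketch, assuming a hand-built game tree rather than an actual chess engine (real engines add alpha-beta pruning and evaluation heuristics, but the core is this exhaustive recursion):

```python
# Toy minimax: exhaustively search a game tree and pick the move
# that gives the best worst-case outcome for the current player.
def minimax(state, maximizing, moves_fn, score_fn):
    moves = moves_fn(state)        # list of (move, child_state)
    if not moves:                  # leaf: game over, score it
        return score_fn(state), None
    best_move = None
    if maximizing:
        best = float("-inf")
        for move, child in moves:
            val, _ = minimax(child, False, moves_fn, score_fn)
            if val > best:
                best, best_move = val, move
    else:
        best = float("inf")
        for move, child in moves:
            val, _ = minimax(child, True, moves_fn, score_fn)
            if val < best:
                best, best_move = val, move
    return best, best_move
```

There is no strategic "understanding" anywhere in this loop, which is the commenter's point: speed, not insight, is what cracked these games.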

This is basically the same difference between LLMs and 'true' general AI. The LLMs are brute forcing the next line of a screenplay, with no way to incorporate abstract concepts like truth or logic. If you confuse an LLM for an AI, then you're going to be disappointed in its performance. If you accept that an LLM is a way to average past communications, and accept that a lot of its training set were fiction, then it's an amazing tool for generating consensus text (given that the consensus includes fantasies and lies). It's not going to write new code, but it will give you an approximation of all the existing examples of some algorithm. An approximation that may introduce errors, like copy-pasting sequential lines from every stackexchange answer.
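The "averaging past communications" idea can be shown with the crudest possible language model: a bigram counter that always emits the most common next word seen in training. This is a toy sketch, not how LLMs actually work internally (they use learned embeddings and attention), but the training objective, predicting the next token from past text, is the same in spirit:

```python
from collections import Counter, defaultdict

# Toy "language model": count which word follows which in the
# training text, then always predict the most frequent successor.
def train_bigrams(text):
    follows = defaultdict(Counter)
    words = text.split()
    for prev, nxt in zip(words, words[1:]):
        follows[prev][nxt] += 1
    return follows

def predict_next(follows, word):
    if word not in follows:
        return None
    return follows[word].most_common(1)[0][0]

model = train_bigrams("the cat sat on the mat the cat ran")
```

Nothing in the counts distinguishes fact from fiction; whatever dominated the training text dominates the output, which is exactly the consensus-including-lies behavior described above.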

Computer graphics, computer game opponents, they're still doing the same things they were doing decades ago, and the improvements are just doing it all faster. General AI needs to do something different than LLMs and most other ML algorithms.

[–] [email protected] 2 points 1 year ago (1 children)

Not yet, but would you agree that businesses desire the ability to automate software engineering and reduce developer headcount by demanding an AI supplemented development work flow?

[–] [email protected] 2 points 1 year ago (1 children)

Sure, just like businesses have always wanted "no-code" solutions to their problems to cut out the need for software engineers. We all know how that turned out. There was no threat then, and there's no threat now.

[–] [email protected] 2 points 1 year ago* (last edited 1 year ago) (1 children)

AI coding is just another tool developers have at their disposal now. It will just raise the bar for expected output. I expect within a few years it will be popular to describe a process, have an AI tool spit out some intern-grade hot mess that maybe compiles, then have a junior developer fix it, and a senior developer write the custom/complex parts. If the AI is good enough, it'll be a significant time saver for it to get you more than half way to done.

It could even be tamed with a test-driven development approach. Write a bunch of good tests and have the AI generate code that passes the tests. What could possibly go wrong... lol
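The test-first workflow sketched above, in miniature: the human writes the tests as the specification, and whatever the generator produces only ships if it passes them. The `slugify` function here is a hypothetical example, and the implementation is handwritten to stand in for the AI-generated part:

```python
# Tests written by the human first, acting as the specification.
def test_slugify():
    assert slugify("Hello World") == "hello-world"
    assert slugify("  Trim me  ") == "trim-me"
    assert slugify("Already-slugged") == "already-slugged"

# Candidate implementation: in the workflow above, this is the
# part an AI tool would generate and a developer would review.
def slugify(text):
    return "-".join(text.lower().split())

test_slugify()  # the generated code only ships if this passes
```

The catch, of course, is that tests only constrain the behavior they actually cover, which is the "what could possibly go wrong" part.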

[–] [email protected] 2 points 1 year ago

I find it highly overrated in terms of productivity in general, particularly when writing anything remotely non-trivial/company-specific.

There's also the absolutely massive issue of licensing/IP/etc. Any company that's not full of dumbasses should recognize the massive risk and liability involved and stay the fuck away.

[–] [email protected] 2 points 1 year ago (1 children)

Now that I use GitHub Copilot, I can work more quickly and learn new frameworks with less effort. Even in its current form, LLMs allow programmers to work more efficiently, and thus can replace jobs. Sure, you still need developers, but fewer of them.

[–] [email protected] 2 points 1 year ago

Learning frameworks has never been hard, and frankly does not make up the majority of a developer's job. Maybe you do it while onboarding. Big whoop. Any good developer can do that fairly easily, and LLMs are entirely superfluous. Worse yet, since they are so commonly confidently incorrect, you have to constantly check if it's even correct. I'd prefer to just read the documentation, thanks.

A mature engineering organization is not pumping out greenfield projects in new languages/frameworks all the time. Greenfield is usually pretty rare, and when you do get a greenfield project, it's supposed to be done using established tools that everyone already knows.

A tiny fraction of a developer's job is actually writing code. Most of it is the soft skills necessary to navigate ambiguous requirements and drive a project to completion. And when we do actually program, it's much more reading code than it is writing code, generally to gain enough understanding of the system in order to make a minor change.

LLMs are highly overrated. And even if it does manage to produce something useful, there's much more to a codebase itself. There's the socialization of knowledge around it and the thought process that went into it, none of which you gain when using an LLM. It's adequate for producing boilerplate no one reads anyway, but that's such a small fraction of what we even do (and hopefully, you can abstract away that boilerplate so you're not writing it over and over again anyway).