I always ask myself who will buy the products these companies produce if all the workers have been fired. Maybe inflation is just the natural ramp-up to McDonald's charging 5,000 dollars for automated chicken nuggets when there are only billionaires left with money lol.
🎵Dumb Dumb Dumb Dumb Dumb🎶
Surely, this can’t and won’t backfire… /s
I never had the impression that there were enough people for the amount of work anyway. I don't see jobs going away so much as shifting. Most developers will be fine, because the work never ends; AI is just a tool that speeds things up. And not even by that much: someone who is good with Google and git is only a bit slower to find the same answers. AI output needs verification too, even when it links you directly to the issue at hand via a source URL.
AI will create new issues of its own. Some low-requirement jobs will go, like first-level support, but only if you train the AI on your own data; otherwise it's too generic. We're not at the point where companies train their own LLMs yet, though some outliers are trying.
We have to understand that there's still a human layer, and a lot of people may prefer calling a human even if the result is worse, simply because we're social beings. Companies that believe they can just shove an AI in front of their customers may lose a lot of them.
No one really knows how good AI will get. As the technology advances, we keep finding hard-to-solve issues, for instance that AI makes things up, or gives wrong answers despite "knowing" the right one, if you pressure it hard enough.
Also, for security reasons you can't add AI everywhere, unless you want to send all your secrets directly to Microsoft, Google or Facebook.
My two cents.
After reading this article that got posted on Lemmy a few days ago, I honestly think we're approaching the soft cap for how good LLMs can get. Improving on the current state of the art would require feeding it more data, but that's not really feasible. We've already scraped pretty much the entire internet to get to where we are now, and it's nigh-impossible to manually curate a higher-quality dataset because of the sheer scale of the task involved.
We also can't ask AI to curate its own dataset, because that runs into model collapse issues. Even if we don't have AI explicitly curate its own dataset, it's highly likely going to be a problem in the near future with the tide of AI-generated spam. I have a feeling that companies like Reddit signing licensing deals with AI companies are going to find that they mostly want data from 2022 and earlier, similar to manufacturers looking for low-background steel to make particle detectors.
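The model-collapse worry above can be demonstrated with a toy statistical model. This is a hypothetical sketch (stdlib-only Python; a Gaussian stands in for an LLM's output distribution, and the sample size and generation count are arbitrary choices to exaggerate the effect): each "generation" is fitted only to the previous generation's synthetic samples, and the diversity of the data steadily evaporates.

```python
import random
import statistics

# Toy model-collapse sketch (hypothetical, not a real LLM):
# each "generation" fits a Gaussian to the previous generation's
# samples, then the next generation trains purely on its own
# synthetic output. The fitted spread drifts toward zero --
# recursive training on generated data loses the tails first.

random.seed(0)
SAMPLES_PER_GEN = 10   # tiny "dataset" exaggerates the effect
GENERATIONS = 300

# Generation 0 trains on "real" data with spread 10.
data = [random.gauss(0.0, 10.0) for _ in range(SAMPLES_PER_GEN)]

spreads = []
for _ in range(GENERATIONS):
    mu = statistics.fmean(data)      # "train": fit mean and spread
    sigma = statistics.stdev(data)
    spreads.append(sigma)
    # "Generate": the next dataset is entirely synthetic.
    data = [random.gauss(mu, sigma) for _ in range(SAMPLES_PER_GEN)]

print(f"spread at first generation: {spreads[0]:.3f}")
print(f"spread at last generation:  {spreads[-1]:.3f}")
```

Running it shows the final spread collapsing to a small fraction of the original: once the model only ever sees its own output, rare-but-real values stop appearing in the training data and can never come back.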
We also can't just throw more processing power at it because current LLMs are already nearly cost-prohibitive in terms of processing power per query (it's just being masked by VC money subsidizing the cost). Even if cost wasn't an issue, we're also starting to approach hard limits in physics like waste heat in terms of how much faster we can run current technology.
So we already have a pretty good idea what the answer to "how good AI will get" is, and it's "not very." At best, it'll get a little more efficient with AI-specific chips, and some specially-trained models may provide some decent results. But as it stands, pretty much any organization that tries to use AI in any public-facing role (including merely using AI to write code that is exposed to the public) is just asking for bad publicity when the AI inevitably makes a glaringly obvious error. It's marginally better than the old memes about "I trained an AI on X episodes of this show and asked it to make a script," but not by much.
As it stands, I only see two outcomes: 1) OpenAI manages to come up with a breakthrough--something game-changing, like a technique that drastically increases the efficiency of current models so they can be run cheaply, or something entirely new that could feasibly be called AGI; or 2) the AI companies hit a brick wall, and the flow of VC money gradually slows down, forcing the companies to raise prices and cut costs, resulting in a product that's even worse-performing and more expensive than what we have today. In the second case, the AI bubble will likely pop, and most people will abandon AI in general--the only people still using it at scale will be the ones trying to push disinfo (either in politics or in Google rankings), along with the odd person playing with image generation.
In the meantime, what I'm most worried for are the people working for idiot CEOs who buy into the hype, but most of all I'm worried for artists doing professional graphic design or video production--they're going to have their lunch eaten by Stable Diffusion and Midjourney taking all the bread-and-butter logo design jobs that many artists rely on for their living. But hey, they can always do furry porn instead, I've heard that pays well~
This is why not every business is successful, I guess.
Lol, this is how you enshittify the workforce.
That only shows what they hope will happen. In reality, menial tasks can be automated and humans can be shifted to more creative roles, which in all honesty means execs could be reduced and replaced by AI. They do nothing other than follow the trail of money and waste company resources, something AI can be much more efficient at.
humans can be shifted to more creative roles
It's a fallacy to assume that there will always be enough jobs for everyone who wants a job.
Also, I've met enough people with "ideas" that I reject the premise. Really creative, talented people are rare.
Agreed, specialist roles will survive this. Management roles might not.
This is the best summary I could come up with:
A survey of senior biz executives reveals that 41 percent expect to have a smaller workforce in five years due to the implementation of AI technologies.
The research from staffing provider and recruitment agency Adecco Group found a "buy mindset" around AI, which "could exacerbate skills scarcity and create a two-speed workforce."
The figure is highest in Germany and France, where 49 percent of respondents say their company will employ fewer people in five years because of AI.
Seventy-eight percent of respondents say GenAI will play a "critical role in providing upskilling and development opportunities."
"While there is no denying that commercial interest in AI has been driven by its ability to reduce headcounts, the disruption will be a positive one – these industries have been suffering from decades-long skills crises, short on talent due to the high barriers to entry.
"Robotic engineers, data governors, drug discovery analysts – these are the jobs tomorrow that rely on AI," she told us.
The original article contains 438 words, the summary contains 161 words. Saved 63%. I'm a bot and I'm open source!
Imo, when you make an industry easier for managers/CEOs by using AI and employing fewer workers, it also becomes easier for people to create competition in that industry... driving down prices.