this post was submitted on 19 Jan 2024
255 points (95.4% liked)

Technology


Summary: Meta, led by CEO Mark Zuckerberg, is investing billions in Nvidia's H100 graphics cards to build a massive compute infrastructure for AI research and projects. By end of 2024, Meta aims to have 350,000 of these GPUs, with total expenditures potentially reaching $9 billion. This move is part of Meta's focus on developing artificial general intelligence (AGI), competing with firms like OpenAI and Google's DeepMind. The company's AI and computing investments are a key part of its 2024 budget, emphasizing AI as their largest investment area.
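As a rough sanity check on the summary's figures, the numbers are mutually consistent with typical reported H100 pricing. This sketch assumes the $9 billion applies entirely to the GPUs, which the article does not actually state:

```python
# Rough sanity check on the summary's figures (assumption: the $9B
# covers only the GPUs themselves; the article doesn't break this down).
total_spend = 9_000_000_000   # potential total expenditure, USD
gpu_count = 350_000           # H100s targeted by end of 2024

price_per_gpu = total_spend / gpu_count
print(f"Implied price per H100: ${price_per_gpu:,.0f}")
```

That works out to roughly $25,700 per card, in line with commonly reported H100 street prices of $25,000–$40,000, so the two headline numbers hang together.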

[–] [email protected] 8 points 9 months ago

While I do work in the space, I'm more pessimistic. I think LLMs will let the big tech companies break through the plateaus they've hit with compositional models, but what we'll see is other companies catching up to GPT-4, perhaps surpassing it a little.

I won't pretend to be an expert on AI, but my view is that we're heading toward a future where multiple companies own comparable LLMs, and we won't see many improvements over what we have now. This is the pessimist in me again, but I think many of the benefits we saw from GPT-4 likely came from the fact that its training datasets contained an unbelievable amount of PII and stolen data. Without that data, we've seen ChatGPT get worse, and that's one way researchers and other tech firms have tried to explain the performance gap.