this post was submitted on 08 Nov 2023
128 points (97.8% liked)

Technology

top 16 comments
[–] [email protected] 31 points 1 year ago* (last edited 1 year ago) (1 children)

Of course, they must understand every mouse click and key stroke you make in Windows.

Migrate to Linux.

[–] [email protected] 9 points 1 year ago (1 children)

Er, Bing Chat is AI-assisted search, not analytics.

[–] [email protected] 21 points 1 year ago

The destination is ultimately the same: https://azure.microsoft.com/en-us/blog/introducing-microsoft-fabric-data-analytics-for-the-era-of-ai/

Without reading the privacy policy of Bing AI chat, I can already feel safe assuming it has provisions allowing the data to be stored, used, and analyzed to track users with something like Fabric.

[–] [email protected] 25 points 1 year ago (1 children)

Lol there is no way in hell this is sustainable

[–] [email protected] 27 points 1 year ago (1 children)

Environmentally or economically? Actually, it doesn't matter which one you mean, because it's neither:

[–] [email protected] 14 points 1 year ago

That second one is hilarious, because base Copilot is absolute garbage. Less helpful than IntelliSense.

I've had some very niche use cases for Copilot Chat, but even that is just hidden away gathering dust most of the time.

[–] [email protected] 18 points 1 year ago (1 children)

Yes. This seems VERY profitable and sustainable. Yes indeed.

[–] [email protected] 7 points 1 year ago (1 children)

It probably is very profitable, and it doesn't need to be sustainable. This is likely a short-term deal (three years is short).

This is the tip off for me:

In this case, Microsoft is using the system alongside its Azure Kubernetes Service to orchestrate Oracle's GPU nodes to keep up with what's said to be demand for Bing's AI features.

This doesn't really look like it's about Bing AI features. Rather, GPUs are in such demand that MS can "rent in bulk" from Oracle, and then rent out Azure's own GPUs to Azure customers willing to pay per second for GPU usage at retail prices.

[–] [email protected] 3 points 1 year ago (1 children)

I mean all the processing needed for AI in general. It's the new... "big thing". They'll find some use. Give it to everyone. Realize it won't make money if it's free. Try to find a way to charge. It won't be good enough for that. People will move on. It'll end up in its niches where it makes sense and is considered a cost savings, and the world will move on. A ton of firms will merge or go bankrupt, and then all the big tech companies will move on to the next "big thing".

[–] [email protected] 2 points 1 year ago

Give it to everyone. Realize it won't make money if it's free.

I don't think that's what's fully happening right now. With AI solutions given away for free, the users are doing free work training the AI. The users are the product, not the AI being used, in those cases.

It'll end up in its niches where it makes sense and is considered a cost savings, and the world will move on.

This is a lot of niches. I'm seeing quite a few jobs being replaced by AI, and it's going to have a massive impact on economies. It's even worse than just "jobs lost": it's lots of low- and middle-skill jobs that AI can replace today. It's going to eviscerate the pipeline of people who grow into higher positions. What happens when more of the most skilled are needed? They won't exist, because they weren't able to grow in the junior and mid-level jobs that AI replaced. I'm not sure what we can do about that.

[–] [email protected] 12 points 1 year ago

Hey, they can have my old GPU if they give me a new blank laptop.

I've always wanted to try Linux.

[–] [email protected] 5 points 1 year ago

This is the best summary I could come up with:


Demand for Microsoft's AI services is apparently so great – or Redmond's resources so tight – that the software giant plans to offload some of the machine-learning models used by Bing Search to Oracle's GPU supercluster as part of a multi-year agreement announced Tuesday.

The partnership essentially boils down to: Microsoft needs more compute resources to keep up with the alleged "explosive growth" of its AI services, and Oracle just happens to have tens of thousands of Nvidia A100s and H100 GPUs available for rent.

Microsoft was among the first to integrate a generative AI chatbot into its search engine with the launch of Bing Chat back in February.

You all know the drill by now: you can feed prompts, requests, or queries into Bing Chat, and it will try to look up information, write bad poetry, generate pictures and other content, and so on.

In this case, Microsoft is using the system alongside its Azure Kubernetes Service to orchestrate Oracle's GPU nodes to keep up with what's said to be demand for Bing's AI features.

Oracle claims its cloud super-clusters, which presumably Bing will use, can each scale to 32,768 Nvidia A100s or 16,384 H100 GPUs using an ultra-low-latency Remote Direct Memory Access (RDMA) network.


The original article contains 580 words, the summary contains 207 words. Saved 64%. I'm a bot and I'm open source!

[–] [email protected] 4 points 1 year ago

I heard rumors that Azure ran on Oracle; this is probably why. Microsoft's pursuit of advanced chatbot technology is surely going to be a loser in the long run.

[–] [email protected] 2 points 1 year ago (2 children)

So when will CPUs integrate the hardware necessary to compete with GPUs on these tasks? This situation is ridiculous: the device designed for this can't keep up with a device designed for something else entirely.

[–] [email protected] 24 points 1 year ago* (last edited 1 year ago)

You're looking at it wrong by taking the names too literally. GPUs are simply processing units optimized for parallel computation, and CPUs are processing units optimized for general-purpose sequential computation. These optimizations require architectural trade-offs, so to be efficient at both kinds of work you need both a CPU and a GPU.

So think of it this way: a CPU is really a General-purpose Sequential Processing Unit and a GPU is a Parallel Processing Unit, but renaming them would only add to the confusion.
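To make the sequential-vs-parallel distinction concrete, here's a rough sketch in Python, with NumPy's vectorized operations standing in for parallel hardware (the array size and the specific operation are made up for illustration, not from the article):

```python
import time
import numpy as np

n = 100_000
xs = np.random.rand(n)

# Sequential, CPU-style: process one element at a time.
start = time.perf_counter()
out_loop = np.array([x * 2.0 + 1.0 for x in xs])
loop_time = time.perf_counter() - start

# Parallel-friendly, GPU-style: express the same operation over the
# whole array at once, so the hardware is free to work on many
# elements simultaneously.
start = time.perf_counter()
out_vec = xs * 2.0 + 1.0
vec_time = time.perf_counter() - start

print(f"loop: {loop_time:.4f}s  vectorized: {vec_time:.4f}s")
assert np.allclose(out_loop, out_vec)  # same result either way
```

Same answer both ways; the second formulation is just the shape of work that parallel devices are built for, which is why ML frameworks express everything as array/tensor operations.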

[–] [email protected] 8 points 1 year ago

GPUs are a lot closer to AI processors (tensor cores and similar) than CPUs are. Graphics processing is about doing lots of simple computations simultaneously, which is what AI does - lots and lots of matrix maths. CPUs are more general-purpose but can't compete on raw throughput because of this (and some of the hacks used to squeeze out more speed have caused security problems).
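The "lots and lots of matrix maths" point can be illustrated with the core operation of most neural networks: a single dense layer is just one matrix multiply plus a bias (the layer sizes below are hypothetical, chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# One dense neural-network layer: y = x @ W + b
batch, d_in, d_out = 32, 512, 256
x = rng.standard_normal((batch, d_in))   # a batch of inputs
W = rng.standard_normal((d_in, d_out))   # the layer's weights
b = rng.standard_normal(d_out)           # the layer's bias

# 32 * 512 * 256 ≈ 4.2 million multiply-adds, all independent of
# each other - exactly the workload GPUs and tensor cores exist for.
y = x @ W + b
print(y.shape)  # (32, 256)
```

Stack hundreds of layers like this and run them billions of times, and the hardware question answers itself: the bottleneck is raw parallel matrix-multiply throughput, not sequential speed.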