this post was submitted on 13 Nov 2024
559 points (95.4% liked)

Science Memes


Welcome to c/science_memes @ Mander.xyz!

A place for majestic STEMLORD peacocking, as well as memes about the realities of working in a lab.



Rules

  1. Don't throw mud. Behave like an intellectual and remember the human.
  2. Keep it rooted (on topic).
  3. No spam.
  4. Infographics welcome, get schooled.

This is a science community. We use the Dawkins definition of meme.



 
top 21 comments
[–] [email protected] 3 points 5 hours ago (1 children)

A lot of new tech isn't more efficient than what it replaces at the get-go, or is only about on par with it. Learning how to properly implement and utilize it is part of the process.

Right now we are just throwing raw computing power at problems in ML form. As soon as it catches on and shows a little promise in an area, we can focus and refine. Sometimes you need to use the shotgun to see the rabbits, ya know?

[–] [email protected] 3 points 5 hours ago

Physicists abhor a black box. So long as it is an option, most will choose not to use AI to any great extent, and will chastise those who do.

[–] [email protected] 6 points 11 hours ago

There's plenty of stuff where ML algorithms are the state of the art. For example, the raw data from nanopore DNA sequencing machines is extremely noisy, and ML algorithms clean it up with much less error than the Markov chains used in previous years.
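To make the "ML cleans up the noisy signal" point concrete, here's a minimal, hypothetical sketch: a tiny 1D convolutional network trained to denoise a synthetic piecewise-constant "current trace." It isn't any real basecaller (those use much larger recurrent/transformer models trained on real reads); every name, shape, and parameter here is invented for illustration.

```python
# Toy illustration only (not a real basecaller): train a small 1D conv net to
# denoise a square-wave "current trace", the kind of stepwise signal a
# nanopore device produces.
import torch
import torch.nn as nn

torch.manual_seed(0)

def synthetic_trace(n_events=200, samples_per_event=10):
    """Piecewise-constant 'ideal' signal plus Gaussian noise (made-up levels)."""
    levels = torch.randint(0, 4, (n_events,)).float()      # 4 fake current levels
    clean = levels.repeat_interleave(samples_per_event)
    noisy = clean + 0.8 * torch.randn_like(clean)
    return noisy.unsqueeze(0).unsqueeze(0), clean.unsqueeze(0).unsqueeze(0)

model = nn.Sequential(                                      # tiny 1D conv denoiser
    nn.Conv1d(1, 16, kernel_size=9, padding=4), nn.ReLU(),
    nn.Conv1d(16, 16, kernel_size=9, padding=4), nn.ReLU(),
    nn.Conv1d(16, 1, kernel_size=9, padding=4),
)
opt = torch.optim.Adam(model.parameters(), lr=3e-3)

for step in range(1000):                                    # train on fresh synthetic traces
    noisy, clean = synthetic_trace()
    loss = nn.functional.mse_loss(model(noisy), clean)
    opt.zero_grad(); loss.backward(); opt.step()

noisy, clean = synthetic_trace()
with torch.no_grad():
    print("raw MSE:     ", nn.functional.mse_loss(noisy, clean).item())
    print("denoised MSE:", nn.functional.mse_loss(model(noisy), clean).item())
```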

[–] [email protected] 4 points 14 hours ago
[–] [email protected] 50 points 1 day ago (1 children)

Working with pretrained models implemented on FPGAs for particle identification and tracking. It's much faster and exactly as accurate. ¯\_(ツ)_/¯
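For a sense of what "a pretrained model implemented on an FPGA" amounts to: the trained float weights get quantized to fixed-point, and the forward pass becomes integer multiply-accumulates that HLS toolchains (e.g. hls4ml in high-energy physics) turn into FPGA logic. Below is a minimal sketch of just the quantization idea, with invented shapes and values, not anyone's actual pipeline.

```python
# Hypothetical sketch: quantize a trained layer's float weights to fixed-point
# integers and run an integer-only forward pass, rescaling at the end.
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(8, 16))          # pretend these are trained weights
x = rng.normal(size=16)               # one event's input features

def quantize(a, bits=8):
    """Symmetric fixed-point quantization: float array -> (int array, scale)."""
    scale = np.abs(a).max() / (2 ** (bits - 1) - 1)
    return np.round(a / scale).astype(np.int32), scale

Wq, w_scale = quantize(W)
xq, x_scale = quantize(x)

y_float = W @ x                               # reference float result
y_fixed = (Wq @ xq) * (w_scale * x_scale)     # integer MACs, rescaled once at the end

print("max abs error from quantization:", np.abs(y_float - y_fixed).max())
```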

[–] [email protected] 20 points 1 day ago* (last edited 1 day ago)

Run, the butlerian jihad is already going your way.

[–] [email protected] 25 points 1 day ago (1 children)

The actual model required for general-purpose use likely needs memory beyond the petabyte range.

These models are using gigabytes, and the trend indicates it's exponential. A couple more gigabytes isn't going to cut it. Adding layers can't expand the predictive capabilities without increasing the error. I'm sure a proof of that will be along within the next few years.

[–] [email protected] 5 points 1 day ago* (last edited 1 day ago)

"Come on man, I just need a couple more pets of your data and I will totally be able to predict you something useful!".
Its capacitors flip polarity in anticipation.

"I swear man! It's only a couple of orders of magnitude more, man! And all your dreams will come true. I'm sure I'll service you right!"

Well if it needs it, right?

[–] [email protected] 3 points 1 day ago (1 children)

AI sucks ass, stop using it

[–] [email protected] 6 points 20 hours ago

It doesn't. It's just overhyped.

[–] [email protected] 11 points 1 day ago (1 children)

"There is no free lunch.", is a saying in ML research.

[–] [email protected] 13 points 1 day ago

That's just a saying.

[–] [email protected] 12 points 1 day ago (1 children)
[–] [email protected] 10 points 1 day ago

GET YOUR SHIT TOGETHER, CORAL

[–] [email protected] 6 points 1 day ago

For the meme? The Walking Dead. For the content? No idea.

[–] [email protected] 4 points 1 day ago (1 children)

It's usually not even faster.

[–] [email protected] 15 points 1 day ago

And if it is faster, it just converges to the wrong answer faster.

[–] [email protected] 2 points 1 day ago

Pretty much the only thing it's even remotely good for is as a toy.

[–] [email protected] -2 points 1 day ago

So what you're saying, Dad, is it's nascent and already faster? Gotcha.