this post was submitted on 13 Nov 2023
192 points (93.2% liked)

Comic Strips

all 24 comments
[–] [email protected] 21 points 1 year ago

Relevant XKCD https://xkcd.com/1623/

Though don't get me started on how many things are called "holograms."

Pepper's ghost isn't a hologram. It's just a reflection with more steps.

[–] [email protected] 12 points 1 year ago (1 children)

It took me a few minutes to figure out what your Picassoesque reindeer was. I thought it was some kind of deformed moose.

I don't mean to be negative though ... the content and writing are great ... it's just that the image of the reindeer was distracting for a minute before I could look through the rest of the comic.

[–] [email protected] 10 points 1 year ago (1 children)
[–] [email protected] 9 points 1 year ago (2 children)

Back in my day, we didn't have no dang hoverboards, and hoverboards were a thing of the future from 2015, when Marty McFly would fly across the silver screen on his one true hoverboard. I spent the best years of my life working on anti-gravity technology with the hope that I too could one day fly like Marty on a real hoverboard, but now I just sit in my own graveyard of failed dreams while the kids are zoomin' around on their wheely-scooters and calling them hoverboards. Thanks Obama!

[–] [email protected] 2 points 1 year ago

Obama has probably received more thanks than any person since Jesus.

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago)

2016

They said 2015, but anyone with half a brain gets why they lied about exactly which year they were in. And if you got that, then: Cubs.

[–] [email protected] 5 points 1 year ago

What is this holo-deck they speak of?

[–] [email protected] 4 points 1 year ago

Well, it's not a surprise that the definition of "AI" isn't based on how it's represented in fiction. It shouldn't be.
But the definition of AI is still oddly broad and includes a lot of things that probably shouldn't be part of it.
On the other hand, when people talk about "AI", it's almost always about machine learning aimed at NLP or vision tasks, which is also inaccurate, as AI can do much more than that.

[–] [email protected] 4 points 1 year ago

Why would we base our definition of AI on fiction, rather than using the definition from computer science, which is where the term originated?

[–] [email protected] 3 points 1 year ago (1 children)

I really don't get the last panel. This comic fell flat for me.

[–] [email protected] 5 points 1 year ago (1 children)

I think the joke is just more normal shit being named after something fantastic from sci-fi even though it bears no resemblance to the namesake.

[–] [email protected] 0 points 1 year ago (1 children)

Maybe... Still doesn't do anything for me.

[–] [email protected] 3 points 1 year ago

Yeah, I mean it feels less like a joke and more like the creator ranting about something they find annoying.

[–] [email protected] 2 points 1 year ago (3 children)

If real holodecks were invented, what stupid brand name would the corporate entity that controls the rights to them give them?

[–] [email protected] 2 points 1 year ago

Probably something Meta...

[–] [email protected] 1 points 1 year ago

I'm gonna go with

REAL(tm)

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago) (1 children)

ChatGPT behaves very much like the AI we've seen in fiction. You can use it pretty much exactly like the crew of the Starship Enterprise uses the ship's computer.

Edit: star ship to starship

[–] [email protected] 8 points 1 year ago (1 children)

From personally trying to use ChatGPT 4 for a job task (programming), I would disagree strongly with this sentiment. I have yet to find a task where it doesn't partially fail due to having no notion of the concepts underlying the topic.

As an example, I asked it to write a class that reads a well-known file type. It had many correct ideas for certain operations (compiled from other sources, of course), but failed with the basic concept of class instantiation: it was calling class methods in the constructor, which is just not allowed in the language being used. I went through several iterations with it to no avail before just giving up on it.
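
Roughly the kind of thing I mean (the actual language and code aren't reproduced here, so treat this as a hypothetical sketch in Swift, where that sort of mistake happens to be a hard compile error):

```swift
// Hypothetical illustration only -- not the original task or language.
// In Swift, calling an instance method on `self` before every stored
// property is initialized is rejected by the compiler, which is the
// kind of "not allowed in the constructor" failure described above.
import Foundation

final class RecordFileReader {
    private let url: URL
    private let header: Data

    init(url: URL) throws {
        self.url = url
        // The ChatGPT-style version was effectively:
        //     self.header = try self.loadHeader()
        // error: 'self' used in method call before all stored
        // properties are initialized.
        // The version the compiler accepts does the work inline, so every
        // stored property is assigned before any method touches `self`:
        let handle = try FileHandle(forReadingFrom: url)
        defer { try? handle.close() }
        self.header = handle.readData(ofLength: 16)
    }

    /// Fine to call once the instance is fully constructed.
    func loadHeader() throws -> Data {
        let handle = try FileHandle(forReadingFrom: url)
        defer { try? handle.close() }
        return handle.readData(ofLength: 16)
    }
}
```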

In "normal" language tasks, it seems to be quirky, but passable. But if you give it a highly technical task where nuance and conceptual knowledge are needed? I have yet to see that work in any reliable capacity.

[–] [email protected] 2 points 1 year ago (1 children)

I use it for programming a lot too. You have to explain everything to it like you would to a brand new engineer, and then it's often wrong in certain parts, like you said. But if you know enough about coding to figure out where it's wrong and just write those parts yourself, it can still be a huge time saver.

[–] [email protected] 2 points 1 year ago* (last edited 1 year ago)

Yeah, I'd agree that with sufficient iterations and clarifying remarks ChatGPT can produce something close to functional. I was mostly disagreeing with the original comment's sentiment that it could be treated like the computer on the Enterprise. While they had several plot-specific flaws, the duotronic computers were generally competent and didn't need everything spelled out for them.