this post was submitted on 16 Dec 2024
37 points (75.3% liked)
Dad Jokes
ChatGPT is pretty helpful despite the hate, and I've found myself using it quite a bit recently. Situations like this one, where you don't get a joke, are particularly good use cases, since that's the kind of thing you might have struggled to figure out just by Googling before. However, you do need to be able to check the output to get value from it, and that's one of its limitations: you sometimes end up doing as much research or work verifying what it tells you as you tried to avoid by using it in the first place.
In this case, where it's not so much a question of facts as one of interpretation, a simple test of asking yourself "does this make sense?" could have tipped you off that ChatGPT was struggling. One of its problems is that it always tries to be helpful, and because of how it works, that often ends up favouring the production of some kind of response over an accurate response, even when it can't really produce an answer. It doesn't actually just magically know everything, and if you can't confidently explain the joke to someone else in your own words after reading its "explanation", the odds are good that it just fed you nonsense that superficially looked like it must mean something.
In this case, it seems the biggest problem was that the joke itself didn't entirely make sense on its own premise, so there wasn't really a correct answer, and ChatGPT just tried really hard to conjure one where it didn't exist.
I knew it didn't make that much sense. I just didn't care lol