this post was submitted on 22 Nov 2023
45 points (95.9% liked)

Asklemmy


I have a theory that it should have a very different "personality" (or at least writing style) depending on the language, because each language is an entirely different set of training data.

In English, ChatGPT is rather academic and has a recognisable style of writing; if you've used it a bit, you can usually get hints that something was written by it just by reading it.

Does it speak in a similar tone, with similar mannerisms, in other languages? (Where possible, obviously; some things don't translate.)

I don't know a second language well enough to hold a natural conversation, so I'm unable to test this myself, and I may have worded things awkwardly out of a lack of understanding.

[โ€“] [email protected] 5 points 11 months ago (1 children)

No single sentence is wrong, but overall it sounds unnatural and has none of the "flavor" of the language.

I've also found that it's often contextually wrong. Like it doesn't know what's going on around it or how to interpret the previous paragraph or even the previous sentence, let alone the sentence two pages back that was actually relevant to the sentence it's now working on.

[โ€“] [email protected] 1 points 11 months ago

Well, probably because it doesn't know what's going on around it. It only knows the words. It can't interpret the words, only guess the most likely answer one word at a time.
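To make "guessing word by word" concrete, here is a toy sketch: a bigram model that always emits the most frequent next word from a tiny corpus. This is a drastic simplification for illustration only (real models like ChatGPT predict over subword tokens with a neural network and a long context window, not single-word lookups); all names here are made up for the example.

```python
from collections import Counter, defaultdict

# Tiny toy corpus; the "model" only ever sees word-to-word statistics.
corpus = "the cat sat on the mat and the cat slept on the mat".split()

# Count how often each word follows each other word.
next_counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    next_counts[prev][nxt] += 1

def generate(start, length):
    """Greedily emit the most likely next word, one word at a time."""
    words = [start]
    for _ in range(length):
        followers = next_counts[words[-1]]
        if not followers:  # dead end: no word ever followed this one
            break
        words.append(followers.most_common(1)[0][0])
    return " ".join(words)

print(generate("the", 4))
```

The point of the sketch is the loop: each step looks only at the statistics of what came before and picks a likely continuation, with no representation of what the sentence means, which matches the complaint above about contextually wrong output.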