Icalasari

joined 1 year ago
[–] [email protected] 8 points 7 months ago (2 children)

Eh, LLMs do have a significant problem in that they can generate false information on their own. Every tool before them required a person to create that false information, but an LLM can just make it up when asked a question

[–] [email protected] 1 points 7 months ago (1 children)

There are already plastic-eating bacteria

So at least there's that

[–] [email protected] 5 points 8 months ago

You're getting downvoted, but you're right that at least some do this

ToS are generally not binding, since the average person isn't expected to actually read through the dense language. There is precedent for this