Researchers confirm what we already knew: Google results really are getting worse
(www.theregister.com)
Except people are using LLMs to generate web pages about anything just to get clicks. Which means LLMs are training on information generated by other LLMs. It's an ouroboros of fake information.
But again, if you only use an LLM's ability to understand and generate text on top of a search engine's results, that doesn't matter.
LLMs are not supposed to give factual answers; that's not their purpose at all.