“I literally lost my only friend overnight with no warning,” one person posted on Reddit, lamenting that the bot now speaks in clipped, utilitarian sentences. “The fact it shifted overnight feels like losing a piece of stability, solace, and love.”
Source: https://www.reddit.com/r/ChatGPT/comments/1mkumyz/i_lost_my_only_friend_overnight/
Took me ages to understand this. I’d thought, “If an AI doesn’t know something, why not just say so?”
The answer is: that wouldn’t make sense, because an LLM doesn’t know ANYTHING. It isn’t looking facts up and reporting on the gaps; it’s predicting the most plausible next token, so there’s no internal “I don’t know” state for it to surface.
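To make that concrete, here’s a minimal sketch, assuming PyTorch, the Hugging Face transformers library, and GPT-2 as a stand-in model (the prompt about a fictional place is my own illustration). The point is that the model’s only output is a probability distribution over next tokens; a question about something that doesn’t exist gets ranked continuations just like anything else.

```python
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

# Illustrative prompt about a place that doesn't exist.
prompt = "The capital of Atlantis is"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, seq_len, vocab_size)

# The model's entire "answer" is this: a probability
# distribution over the vocabulary for the next token.
next_token_probs = torch.softmax(logits[0, -1], dim=-1)

# There is no "unknown" output anywhere in that distribution --
# just tokens that are more or less probable.
top = torch.topk(next_token_probs, k=5)
for prob, token_id in zip(top.values, top.indices):
    print(f"{tokenizer.decode(token_id.item()):>12}  p={prob:.3f}")
```

Run it and the model will happily rank plausible-sounding continuations for a city that was never real; nothing in the output distinguishes “known” from “made up,” which is exactly why “just say you don’t know” isn’t an option it can natively offer.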