That’s the P in ChatGPT: Pre-trained. It has “learned” from the data it was trained on, but your prompts do not teach it anything new. Past prompts are kept as “memory” and influence the output of your future prompts, but the model does not actually learn from them.
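To illustrate the distinction, here is a minimal sketch of how chat “memory” typically works: old prompts are just stored text that gets re-fed as context, while the model weights stay frozen. The names here (generate, WEIGHTS, respond) are made up for illustration, not OpenAI’s actual API.

```python
# Hypothetical sketch: "memory" is saved text prepended to the context,
# not a weight update. All names below are illustrative, not a real API.

WEIGHTS = {"fixed": "at pre-training time"}  # never modified below

def generate(weights, context: str) -> str:
    # Stand-in for the real model: weights are only read, never written.
    return f"reply based on {len(context)} chars of context"

chat_history: list[str] = []  # the "memory": nothing but stored text

def respond(user_prompt: str) -> str:
    chat_history.append(user_prompt)
    context = "\n".join(chat_history)   # old prompts re-enter only as input tokens
    reply = generate(WEIGHTS, context)  # WEIGHTS untouched: no learning happens
    chat_history.append(reply)
    return reply

print(respond("Hello"))
print(respond("Do you remember me?"))  # "remembers" only via the context string
```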
GPT-5 can count the number of 'r’s, but that’s probably because it has been specifically trained to do so.
I would argue that the models do learn, but only over generations of the model: slowly, and only on what they’re specifically retrained on.
They definitely don’t learn intelligently.
The next generation of GPT will be trained on everyone’s past prompts (ever been A/B tested by OpenAI?). That’s what I mean by generational learning.