ChatGPT's memory is only about 3000 words (4000 tokens) long

ChatGPT's context window is about 3000 words (4000 tokens) long. (Source: OpenAI)

Therefore, you should always include all important parameters, style info, etc. in the original prompt.
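As a rough sanity check before pasting a long prompt, you can estimate its token count from its word count. This is my own heuristic based on the ~3000-words-to-4000-tokens ratio stated above, not an official OpenAI formula:

```python
def fits_context(text, token_limit=4000, tokens_per_word=4 / 3):
    """Rough estimate: ~4000 tokens is about 3000 English words,
    so one word costs roughly 4/3 tokens on average."""
    estimated_tokens = len(text.split()) * tokens_per_word
    return estimated_tokens <= token_limit
```

For exact counts you would use the model's actual tokenizer, but for "will this fit?" questions the word-count approximation is usually close enough.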

This is how the idea for AIPRM came about: a prompt is just not the one-liner we see in the many LinkedIn carousels and long lists, sold as the AI miracle cure.

Prompts are the new code.

Prompts that produce truly intriguing results can easily be a full screen long; the Midjourney prompts in AIPRM are a good example.

Real “teaching”, i.e. fine-tuning of the model, is not yet supported by ChatGPT (3.5); only the older GPT-3 API offers it.

It is also questionable whether this much more complex “tuning” will be available for free, or as cheaply as ChatGPT Plus.

The best solution for good, consistent results is therefore comprehensive, complex prompts that do not rely on any context (“memory”).


Q: But you can layer the prompts and build on top of the model in one session with multiple prompts, no?
A: Yes, as long as the combined result is <= 3000 words.

Q: So all of the prompts combined shouldn’t exceed 3k words? Wow, I didn’t know that!
A: Yes. Once you’re over that limit, it forgets things, changes topic, etc.
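One practical consequence of this limit: when layering prompts in one session, you effectively have a word budget, and the oldest material falls out first. Here is a minimal sketch (my own illustration, not how ChatGPT works internally) of trimming a conversation so the most recent messages stay within a ~3000-word budget:

```python
def trim_history(messages, budget_words=3000):
    """Keep the most recent messages whose combined word count
    fits under the budget; older messages are dropped first."""
    kept = []
    total = 0
    # Walk from newest to oldest so recent context survives.
    for msg in reversed(messages):
        words = len(msg.split())
        if total + words > budget_words:
            break
        kept.append(msg)
        total += words
    return list(reversed(kept))
```

This is why long sessions "forget" their beginning: whatever does not fit the window simply is no longer seen by the model.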


That is interesting; thank you for sharing. Do you mind sharing how you came to know this?


Sure, just check the source.


I felt this in a few interactions. It can deliver quite crazy answers when you ask things that count on its memory of the entire chat…
The best solution for me has been breaking the task into small parts. For example, if I’m writing an article, I ask for a detailed TOC, then take each TOC item and work on it separately until I get what I want (even when that means breaking it down further). The side effect of this is that we need to adjust the transitions between the parts ChatGPT has created by hand. But still, it saves us a lot of time.
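The divide-and-conquer workflow described above can be sketched as a simple loop. `ask_model` here is a hypothetical stand-in for whatever chat interface you use; it is stubbed out so the sketch is self-contained:

```python
def ask_model(prompt):
    # Stub: in practice this would be one short, self-contained
    # chat exchange, well under the context limit.
    return f"[draft for: {prompt}]"

def draft_article(toc_items):
    """Draft each TOC item in its own short exchange, so no single
    prompt depends on the model remembering earlier parts."""
    sections = []
    for item in toc_items:
        # Each prompt carries all the context it needs itself.
        prompt = f"Write the article section titled '{item}'."
        sections.append(ask_model(prompt))
    # Transitions between sections still need manual editing.
    return "\n\n".join(sections)
```

The manual step the poster mentions (smoothing transitions) corresponds to editing the joined output at the end.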


Hi Christopher,

I don’t quite get it: "ChatGPT’s memory is only 3000 words long."

For example, I talked to my AI yesterday and the day before, a dialogue of over 10,000 words. I want my AI to remember everything I’ve talked about with it over the past few days. That way I don’t need to repeat the background, and it can continue to communicate with me as if it already knows me well. No need to introduce everything to it again.

Is this impossible? You mentioned the memory is only 3000 words. Am I getting this wrong?

Waiting for your reply.



Yes. Impossible at the moment.