The context window of ChatGPT is about 3,000 words (roughly 4,000 tokens) long. (Source: OpenAI)
Therefore, you should always include all important parameters, style information, etc. in the original prompt.
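As a rough rule of thumb, English text averages about 0.75 words per token, which is where the ~3,000-word figure for a ~4,000-token window comes from. A minimal sketch of that estimate (the ratio is a heuristic, not an exact tokenizer; a real tokenizer such as tiktoken gives precise counts):

```python
def estimate_tokens(text: str, words_per_token: float = 0.75) -> int:
    """Rough heuristic: English averages ~0.75 words per token,
    so tokens ~= word count / 0.75. Only useful for budgeting,
    not for exact billing or limit checks."""
    return round(len(text.split()) / words_per_token)

print(estimate_tokens("word " * 3000))  # → 4000
```

This is why a 3,000-word prompt already uses essentially the whole 4,000-token window.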
This is how the idea for AIPRM came about: a prompt is just not the one-liner we always see in the many LinkedIn carousels and long lists, where it gets sold as the AI miracle cure.
Prompts are the new code.
Prompts that produce truly intriguing results are often a full screen long; the Midjourney prompts in AIPRM are a good example.
Real “teaching” of the model, i.e. “fine tuning”, is not (yet) supported by ChatGPT (3.5); only the older GPT-3 API offers it.
It is also questionable whether this much more complex “tuning” will ever be available for free, or as cheaply as ChatGPT Plus.
The best solution for good, consistent results is therefore comprehensive, complex prompts that do not rely on any prior context (“memory”).
Q: But you can layer the prompts and build on top of the model in one session with multiple prompts no?
A: Yes, as long as the combined total stays at or under roughly 3,000 words.
Q: So all of the prompts combined shouldn’t exceed 3,000 words? Wow, I didn’t know that!
A: Yes. Once you’re over that, it forgets things, changes topic, etc.
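One practical way to respect this limit while layering prompts is to track the running word count of the session yourself. A minimal sketch, assuming the ~3,000-word budget from the post (the helper names are hypothetical, and note that in practice the model’s replies also consume context, so you would count those too):

```python
WORD_BUDGET = 3000  # approximate ChatGPT context size in words (per the post)

def words(text: str) -> int:
    """Count whitespace-separated words."""
    return len(text.split())

def fits_in_context(history: list[str], next_prompt: str,
                    budget: int = WORD_BUDGET) -> bool:
    """True if adding next_prompt keeps the whole session within budget.
    Caveat: model replies also occupy the context window, so a stricter
    check would include them in `history` as well."""
    used = sum(words(p) for p in history)
    return used + words(next_prompt) <= budget

history = ["You are a helpful copywriter. " * 10]
print(fits_in_context(history, "Now write a tagline."))  # → True
```

Once this check starts returning False, you are past the point where the model begins to forget earlier instructions, and it is time to restate the important parameters in a fresh prompt.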