Best way to create a report / in-depth piece of writing out of a thorough AIPRM-generated prompt

I’ve used the ‘improve my prompt’ prompt to generate a wonderful, in-depth prompt for a report I want to produce. The prompt consists of a full description/overview of what is needed and a taxonomy/structure for the report.

My question is: what is now the best way to create this report? I know I could copy and paste the main prompt and then each subtitle individually into ChatGPT, and I have done this, but is there a better way? I’d be really grateful if anyone can suggest something better than just copying and pasting an extensive prompt into regular ChatGPT. I’ve tried a couple of the article-writing prompts on AIPRM, but after it finished writing the first section I typed ‘continue’, and it replied about the word ‘continue’ instead of continuing the way regular ChatGPT does.

I should add that I’m not expecting to press one button and have it spew out a 100,000-word report. I’ve already written a long report on the topic myself, and I’m looking to use AIPRM/ChatGPT to see what I’ve missed, to make it better, and to learn for future reference. Happy to do the work 🙂

I’d be grateful for any tips or advice on this! Thank you 🙂


You can prompt it with “continue your last reply from where it stopped” instead of using the continue button.


Yes, I’d tried that. It normally works, but here it doesn’t change anything! Thanks anyhow.


By conscious and deliberate design, mostly for safety and to preserve the integrity of the AI, ChatGPT imposes limits on the ‘working memory’ available to its prompts, which prevents it from being ‘trained’ by those prompts into doing things it is not supposed to.

Using GPT-3.5, that limit is just 4,000 tokens, which equates to around 3,000 words in total: your prompts and its responses combined, of course. 4,000 tokens of total working memory. Obviously, once you reach that limit it cannot continue from material outside of (before) that memory, because it has forgotten it, or at least parts of it, to bring the conversation down to the size of its memory.
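To get a feel for when you are approaching that limit, you can estimate the token count of your text before pasting it in. The heuristic below is my own rough sketch (not an official tokenizer): for English text, a token is commonly approximated as about 4 characters or about 0.75 words, so averaging the two gives a ballpark figure.

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate for English text.

    Uses two common rules of thumb (~4 characters per token and
    ~0.75 words per token) and averages them. This is only a ballpark;
    a real tokenizer will give a different, exact count.
    """
    by_chars = len(text) / 4
    by_words = len(text.split()) / 0.75
    return int((by_chars + by_words) / 2)

# A 3,000-word draft lands close to the 4,000-token GPT-3.5 limit:
draft = "word " * 3000
print(estimate_tokens(draft))  # -> 3875, i.e. dangerously near 4,000
```

If the estimate comes out near the limit, that is your cue to split the report into sections and feed them in separately rather than expect one long continuous session.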

Using GPT-4 gives a much larger token allowance, though I believe its more detailed and granular system means more tokens are processed per 1,000 words, so 4,000 tokens in GPT-4 may cover noticeably fewer words, because it builds more understanding and connections around them. Even so, it is definitely a far more generous memory allowance, and it can handle tasks that GPT-3.5 really couldn’t.

There are no prompts that work around this memory limit. Some prompts attempt to shorten and compress earlier prompts into briefer, less detailed summaries, but, just as with lossy compression in images, you lose granularity and detail in the process.
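The forgetting behaviour described above is essentially a sliding window: when the conversation exceeds the token budget, the oldest material is dropped first. A minimal sketch of that idea, assuming the same rough characters-per-token estimate as before (the budget and estimator here are illustrative, not ChatGPT’s actual internals):

```python
def trim_history(messages: list[str], budget: int) -> list[str]:
    """Keep the most recent messages whose combined estimated
    token cost fits within `budget`, dropping the oldest first.

    Token cost is approximated as ~4 characters per token; this
    mirrors the 'forgets the start of the conversation' effect.
    """
    kept: list[str] = []
    used = 0
    for msg in reversed(messages):          # newest first
        cost = len(msg) // 4
        if used + cost > budget:
            break                           # everything older is forgotten
        kept.append(msg)
        used += cost
    return list(reversed(kept))             # restore chronological order

history = ["intro " * 100, "section one " * 100, "section two " * 100]
recent = trim_history(history, budget=400)
# The oldest message(s) fall out of the window first.
```

This is why summarising earlier sections yourself, then pasting only the summary plus the current section, tends to work better than hoping ‘continue’ will reach back past the window.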
