This is ultimately a ‘wrong tool’ issue, @Wladimir_J_Alonso.
It’s as if you have a huge pile of logs and the perfect plans to build a cabin out of them, but the only tool you have is a Swiss Army knife. It’s a really expensive, top-of-the-line Swiss Army knife, with the scissors, a little saw-tooth blade, assorted screwdrivers, and more attachments and gizmos than you can easily name, but it is still just a penknife.
ChatGPT is a mass-market product built on top of the larger GPT models. It has deliberate limits of all kinds that reduce its capacities, and thus its system demands, so that it can serve many people at once. The prompt size limit, for example, reflects the model’s fixed context window: each conversation only ‘remembers’ what fits inside that window, and ChatGPT does not learn from or get retrained by any one user’s chats in any case (that would spoil its general-purpose behaviour for all other users).
Chatbots are, ultimately, designed for chat: casual, easy prompting and back-and-forth. They can help with many projects and tasks, but a chatbot is NOT an ‘industrial’ product. For your more extensive, non-chatty kind of application, you could hunt for little workarounds, just as someone with a penknife could slowly whittle their way through a log, but it is still the wrong tool for the job.
You might find that you can get what you want from the GPT-4 API itself (the raw model endpoint, not the chat application with its added limits), as that gives you a lot more leeway in prompt size, programmatic batching, and, for some models, fine-tuning. Note that you don’t get the model weights to run on your own hardware; access is billed per token, which is very reasonable for light use but adds up quickly for large workloads. (If you genuinely need to further train and run your own model, only open-weight LLMs allow that, and the hardware requirements are extensive.) Still worth looking into, but potentially outside the price range you’d find acceptable for the value of the project.
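To make the ChatGPT-vs-API distinction concrete, here is a minimal sketch of what calling the model directly looks like. It uses only the Python standard library and the public chat-completions endpoint; the model name, token cap, and `OPENAI_API_KEY` environment variable are illustrative assumptions, not a prescription, and current details live in OpenAI’s API reference.

```python
import json
import os
import urllib.request

# Sketch only: endpoint and payload shape follow OpenAI's public chat
# completions API; model names and limits change over time.
API_URL = "https://api.openai.com/v1/chat/completions"

def build_request(prompt: str, model: str = "gpt-4") -> dict:
    """Assemble the JSON payload for one chat-completion call."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        # max_tokens caps only the response; the model's context window
        # (prompt + response together) is a hard architectural limit.
        "max_tokens": 512,
    }

def send(payload: dict, api_key: str) -> dict:
    """POST the request; needs a valid API key and is billed per token."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

payload = build_request("Summarize this report section by section.")
key = os.environ.get("OPENAI_API_KEY")  # hypothetical setup; set your own key
if key:
    reply = send(payload, key)
    print(reply["choices"][0]["message"]["content"])
```

The point of the sketch is the control you gain: you choose the model, set token budgets per call, and script thousands of requests, none of which the chat interface exposes.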