I was merrily uploading a major report that I had split into 300-plus chunks. Around chunk 210, it told me, in so many words, “sorry, you need to start a new conversation; this one is too full.” Shame. If anybody knows how I can complete my uploads, please advise. Regardless, this is a “look out for” warning that you might get “chunked.” Has anybody had a similar experience?
To a very real extent, you already know that you were using a workaround to get past limits that OpenAI set on their product deliberately and with reason. They are a business, despite their ‘Limited Profit’ status (and even a non-profit is still run as a business and has to balance the books).
The fact is that they sell access to GPT-3, and will doubtless sell access to GPT-4, which, while expensive, is specifically suited to the kind of purpose you describe: training GPT on your own large corpus of data, rather than the small, chatty use that ChatGPT is built for. They don’t want ChatGPT cannibalizing the market for the products built for those specific purposes.
If you can afford to wait a few weeks, I’d certainly suggest seeing what the prices for GPT-4 access are like and considering that as your way forward. It will certainly cost more, but that’s exactly why they price it that way.
Otherwise you are just going to have to get more creative about how you break your data down into smaller chunks: for example, splitting it into 300 chunks and then summarizing the output of every 10 chunks or so, slowly boiling it down and reducing the total volume you need to deal with.
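To make the “boil it down” idea concrete, here is a minimal sketch of that rolling-summarization loop: batch the chunks in groups of 10, summarize each batch, and repeat on the summaries until everything fits. The `summarize` function here is just a placeholder (simple truncation); in practice you would replace it with a real call to the model.

```python
def summarize(text: str, max_chars: int = 500) -> str:
    """Placeholder summarizer: a real version would call a language model."""
    return text[:max_chars]

def batch(items: list[str], size: int) -> list[list[str]]:
    """Group a list of chunks into batches of `size`."""
    return [items[i:i + size] for i in range(0, len(items), size)]

def reduce_corpus(chunks: list[str], batch_size: int = 10,
                  target_chunks: int = 1) -> str:
    """Repeatedly summarize batches of chunks until few enough remain."""
    while len(chunks) > target_chunks:
        chunks = [summarize("\n".join(b)) for b in batch(chunks, batch_size)]
    return "\n".join(chunks)

# 300 chunks -> 30 batch summaries -> 3 -> 1 final digest.
report_chunks = [f"Section {i} text..." for i in range(300)]
digest = reduce_corpus(report_chunks)
```

Note that each summarization pass loses detail, so this is a lossy workaround, not a substitute for a model with a larger context window.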
I appreciate your quick response.
I did not know that I was doing anything of the kind that you are claiming. I have read a lot of content about testing, including uploading tremendous amounts of material: depositions, reports, and so on. I saw they entered into an arrangement with a company to offer legal-service info, and I wanted to test GPT’s accuracy, as it currently stands, for that specific purpose.
You make many false assumptions, such as “and considering that as your way forward.” I don’t in any way begrudge them for wanting to monetize their efforts and resources. I pay them the $20, and I appreciate the awesomeness of the technology. For people like myself who have not analyzed the business angle of ChatGPT or similar systems, as apparently you have, this situation could be a big surprise, as it was to me. I did tell OpenAI they should warn users if they felt their system was being overloaded, and I suggested they provide some gauge showing when a limit is being approached. I read about the splitter tool on this forum and thought it was a good one. Again, what I was doing was completely exploratory, nothing more.
Again, I appreciate your response and thank you for emphasizing their business necessities. And yes, I look forward to GPT-4.
Please accept my apologies if my concise style of attempting to put a lot of clear info into as few words as necessary sometimes seems a little terse. And I absolutely apologise if you took any offense at my assumption that you’d know many of the same things I did. Personally, I’ve always hated when people assume I know less than I do and treat me like I’m dumb, so I have always tried to communicate with people as peers myself. Absolutely no offense was intended.
For the record, I have never felt that the integration of chatbots into search engines was the big turning point, nor the end-game. The real power of AI lies in our ability to turn it to our own specific uses, not just those set out by others, but by using the language models themselves: training them on our own chosen corpus of data, and tuning them to the exact purposes we require.
That’s something GPT-3 and GPT-3.5 already make possible, of course, even though they are much smaller and less sophisticated models than what is coming with GPT-4 and Bard, and with the hundreds and hundreds of other applications in the pipeline.
If you search these forums, you’ll find some mentions of Pinecone.io by @RealityMoez and how, in combination with GPT-3, you can have a theoretically endless or bottomless memory system (hardware will be the only limit), and thus something that could handle an entire book, or indeed an entire library of thousands of books. It’s fascinating stuff, and these are still just the first tentative ‘toddler steps’ of what is to come.
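For anyone curious how that kind of “bottomless memory” works in principle, here is a tiny self-contained sketch of the retrieval idea: embed every passage as a vector, store them all, and fetch the nearest matches for each query. The hash-based “embedding” below is a deterministic placeholder for illustration only; a real system would use a learned embedding model and an external vector database such as Pinecone.

```python
import numpy as np

def embed(text: str, dim: int = 64) -> np.ndarray:
    """Placeholder embedding: bag-of-words hashed into a fixed-size vector.
    A real system would call an embedding model instead."""
    v = np.zeros(dim)
    for word in text.lower().split():
        v[hash(word) % dim] += 1.0
    norm = np.linalg.norm(v)
    return v / norm if norm else v

class MemoryStore:
    """Grow-as-you-go vector memory: capacity is bounded only by hardware."""

    def __init__(self) -> None:
        self.texts: list[str] = []
        self.vectors: list[np.ndarray] = []

    def add(self, text: str) -> None:
        """Store a passage along with its embedding."""
        self.texts.append(text)
        self.vectors.append(embed(text))

    def query(self, question: str, top_k: int = 3) -> list[str]:
        """Return the `top_k` stored passages most similar to the question."""
        q = embed(question)
        scores = np.array([v @ q for v in self.vectors])
        best = np.argsort(scores)[::-1][:top_k]
        return [self.texts[i] for i in best]
```

At query time you would pass the retrieved passages to GPT as context, which is how the model appears to “remember” an entire library while only ever seeing a few relevant snippets at once.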
Thank you very much. Yes, communicating online, whether via email, chat, forum, or video is always a challenge.
I agree with your analysis of chatbots and search engines. Indeed, I have spent a tremendous amount of time testing and experimenting to understand how this can be used, and every time I do, I am more and more blown away. I have tested many of the new tools, extensions, and apps related to ChatGPT and AI. I probably have three such apps on my phone; the latest one uses ChatGPT and voice. My list of extensions is longer than it has ever been as I test with Chrome and Edge. Every time a new ChatGPT-like system emerges, I want to test it and compare it. It is all so very exciting to see how we can take advantage of this marvelous tool that I personally never thought I would witness in my lifetime.
I will check those forum threads. Complicating the memory-system needs, I believe, is the confluence of voice and video. It is all so amazing.