I’m pretty confident that I have been using the “Plugins” model with a very long context window. I was copy-pasting entire 500-line source files and asking questions about them, so I assume I’m getting the 32k context window.
How many characters is your 500-line source file? It probably fits in 8k tokens; you can check with OpenAI’s tokenizer tool.
The entire conversation is over 60,000 characters according to wc. OpenAI’s tool won’t even let me compute the token count if I paste more than ~50k characters, but when I deleted some of it, it reported over 18,000 tokens.
I’m not sure if or when ChatGPT starts to forget earlier parts of the chat history (i.e., when they drop out of the context window), but it still seemed to remember the first file after a long, winding discussion.
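When the web tokenizer rejects a long paste, a rough count can be estimated locally. A minimal sketch, assuming the common rule of thumb of ~4 characters per token for English text (real counts from an actual tokenizer such as tiktoken will differ, especially for code-heavy text, which tends to be denser in tokens):

```python
# Rough local token estimate when the web tokenizer caps out on long pastes.
# Assumes ~4 characters per token (a common rule of thumb, not exact).

def estimate_tokens(text: str, chars_per_token: float = 4.0) -> int:
    """Return a rough token count from character length alone."""
    return round(len(text) / chars_per_token)

# A 60,000-character conversation under this heuristic:
print(estimate_tokens("x" * 60_000))  # 15000
```

The heuristic undershoots the ~18,000 tokens reported above, which is consistent with source code tokenizing at fewer characters per token than ordinary prose.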