
any query gives response This model's maximum context length is 16385 tokens. [...] #427

Open
sh-cau opened this issue Apr 11, 2024 · 2 comments



sh-cau commented Apr 11, 2024

The full error reads:
This model's maximum context length is 16385 tokens. However, your messages resulted in 16680 tokens. Please reduce the length of the messages.

This happens for every request, including ones that previously executed successfully.

I tried setting max_tokens to 20000 and other arbitrary values, but to no avail.
Using the chatgpt CLI, I have no problems with the same queries.

What am I missing?
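
For context: the 16,385-token limit covers the input messages plus the requested completion, so raising max_tokens cannot fix this; the error says the messages alone are already 16,680 tokens. A minimal sketch of how you might check what the plugin is sending, using the tiktoken library (the model name is an assumption inferred from the 16,385 limit, which matches gpt-3.5-turbo):

```python
# Sketch: count the tokens in a chat payload before sending it.
# Assumes the model is gpt-3.5-turbo (16,385-token context window,
# matching the limit quoted in the error message).
import tiktoken

MODEL = "gpt-3.5-turbo"
CONTEXT_WINDOW = 16385

enc = tiktoken.encoding_for_model(MODEL)

def count_message_tokens(messages):
    """Rough token count for a list of {role, content} chat messages.
    Ignores the few tokens of per-message overhead the API adds."""
    return sum(len(enc.encode(m["content"])) for m in messages)

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "...buffer contents the plugin sends..."},
]

used = count_message_tokens(messages)
print(f"{used} tokens used; {CONTEXT_WINDOW - used} left for the completion")
```

If the count for the messages alone exceeds the window, the only fix is to shrink the input, not to adjust max_tokens.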


hrllk commented May 20, 2024

Would you try entering it in normal mode?


rafaelleru commented Jun 4, 2024

Same here. I wrote a comment, entered normal mode, and tried to run ChatGPTCompleteCode, but I always get the error.

It does not happen when I run it in a new file, so I am guessing complete code is sending too much context in the request.
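
If complete code is indeed sending the whole buffer, one mitigation is to trim the prompt so that the input plus the reserved completion fits in the window. This is a sketch only, not the plugin's actual code; the trim_to_fit helper is hypothetical:

```python
# Sketch: trim a prompt so prompt + completion fits the context window.
# Hypothetical helper, not part of ChatGPT.nvim; it illustrates why the
# request fails when the editor sends too much buffer content.
import tiktoken

CONTEXT_WINDOW = 16385  # gpt-3.5-turbo, per the error message (assumed)
MAX_COMPLETION = 1024   # tokens reserved for the model's answer

enc = tiktoken.encoding_for_model("gpt-3.5-turbo")

def trim_to_fit(prompt: str, reserve: int = MAX_COMPLETION) -> str:
    """Keep only the trailing tokens of the prompt that fit in the window.
    Keeping the tail preserves the code nearest the cursor."""
    budget = CONTEXT_WINDOW - reserve
    tokens = enc.encode(prompt)
    if len(tokens) <= budget:
        return prompt
    return enc.decode(tokens[-budget:])
```

That would also explain why a fresh, nearly empty file works: its buffer easily fits in the budget.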
