The full error reads:

> This model's maximum context length is 16385 tokens. However, your messages resulted in 16680 tokens. Please reduce the length of the messages.

This happens for every request, including ones that previously executed successfully.

I tried setting `max_tokens` to 20000 and other arbitrary values, but to no avail. Using the chatgpt CLI, I have no problems with the same queries.

What am I missing?
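For context (not an official fix): with the OpenAI chat API, `max_tokens` only caps the *completion* length; the 16385-token limit applies to the prompt and completion combined, so raising `max_tokens` cannot help and can even make the error more likely. The usual workaround is to trim older messages until the conversation fits. A minimal sketch, using a rough 4-characters-per-token estimate (a real implementation would count tokens with a tokenizer such as tiktoken; all names here are hypothetical):

```python
# Hypothetical helper: drop the oldest non-system messages until the
# estimated prompt size plus a reserved reply budget fits the window.

CONTEXT_LIMIT = 16385  # model context window (prompt + completion)
REPLY_BUDGET = 1024    # tokens reserved for the model's answer

def estimate_tokens(messages):
    # Crude estimate: ~4 characters per token, plus per-message overhead.
    return sum(len(m["content"]) // 4 + 4 for m in messages)

def trim_to_fit(messages, limit=CONTEXT_LIMIT, reserve=REPLY_BUDGET):
    trimmed = list(messages)
    while estimate_tokens(trimmed) + reserve > limit and len(trimmed) > 1:
        # Keep the system prompt (index 0) if present; drop the oldest
        # conversational message instead.
        drop_at = 1 if trimmed[0]["role"] == "system" else 0
        trimmed.pop(drop_at)
    return trimmed

history = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "x" * 80000},  # oversized old message
    {"role": "user", "content": "latest question"},
]
fitted = trim_to_fit(history)  # oversized message is dropped first
```

The trimmed list always keeps the system prompt and the most recent message, so the latest query still reaches the model.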