Is your feature request related to a problem? Please describe.
chatGPTBox currently can't use OpenAI's o1 series reasoning models due to some API spec differences. One key issue is that the `max_tokens` parameter has been deprecated and replaced by `max_completion_tokens`, which leads to incompatibility with the new models.
Describe the solution you'd like
To support the o1 series models, the following changes and considerations should be made, based on the OpenAI API and the current beta limitations (a request-building sketch follows the list):
- Replace `max_tokens` with `max_completion_tokens`.
- Limitations during beta:
  - Only `user` and `assistant` message types are allowed (no `system` messages).
  - Streaming, tools, and function calling are not supported.
  - Parameters such as `temperature`, `top_p`, and `n` are fixed at 1.
  - `presence_penalty` and `frequency_penalty` are fixed at 0.
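For illustration, here is a minimal TypeScript sketch (not taken from chatGPTBox's source) of how a Chat Completions request body could be adjusted for these models. The `isO1Model` helper, the `o1-preview` model name, and the non-o1 defaults are assumptions for the example; the o1-specific behaviour follows the limitations listed above.

```ts
interface ChatMessage {
  role: 'user' | 'assistant' | 'system'
  content: string
}

function isO1Model(model: string): boolean {
  // Hypothetical check; a real implementation might keep an explicit model list.
  return model.startsWith('o1')
}

function buildRequestBody(model: string, messages: ChatMessage[], maxTokens: number) {
  if (!isO1Model(model)) {
    // Illustrative existing behaviour for other chat models.
    return { model, messages, max_tokens: maxTokens, stream: true, temperature: 0.7 }
  }

  return {
    model,
    // Drop system messages: only `user` and `assistant` roles are accepted during the beta.
    messages: messages.filter((m) => m.role !== 'system'),
    // `max_tokens` is not accepted for o1 models; use `max_completion_tokens` instead.
    max_completion_tokens: maxTokens,
    // Streaming is not supported during the beta, so request a single response.
    stream: false,
    // temperature/top_p/n are fixed at 1 and the penalties at 0,
    // so they are simply omitted rather than overridden.
  }
}

// Example usage (assumed endpoint and auth handling):
// const body = buildRequestBody('o1-preview', messages, 4096)
// await fetch('https://api.openai.com/v1/chat/completions', {
//   method: 'POST',
//   headers: { 'Content-Type': 'application/json', Authorization: `Bearer ${apiKey}` },
//   body: JSON.stringify(body),
// })
```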
Thanks for pointing that out! I should probably update the title to be more precise. This issue is specifically about accessing the o1 series models via the API; these problems don't occur in the web interface because the access method is different.