Bug description
MetaGPT exited halfway with an exception caused by the max_tokens setting: the prompt tokens plus max_tokens exceeded the model's 16,000-token context limit.
Bug solved method
Environment information
LLM type and model name: yi-lightning
System version: ubuntu 22.04
Python version: 3.11
MetaGPT version or branch: main
packages version: 0.8.1
installation method: pip
Screenshots or logs
2024-11-22 17:18:08.597 | INFO | metagpt.actions.write_code_review:run:185 - Code review and rewrite main.py: 1/2 | len(iterative_code)=12485, len(self.i_context.code_doc.content)=12485
2024-11-22 17:18:31.001 | WARNING | metagpt.utils.common:wrapper:673 - There is a exception in role's execution, in order to resume, we delete the newest role communication message in the role's memory.
2024-11-22 17:18:31.002 | ERROR | metagpt.utils.common:wrapper:655 - Exception occurs, start to serialize the project, exp:
Traceback (most recent call last):
File "/root/anaconda3/envs/metagpt/lib/python3.11/site-packages/metagpt/utils/common.py", line 650, in wrapper
result = await func(self, *args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/anaconda3/envs/metagpt/lib/python3.11/site-packages/metagpt/team.py", line 134, in run
await self.env.run()
openai.BadRequestError: Error code: 400 - {'error': {'code': 'bad_request', 'message': 'The total number of tokens in the prompt and the max_tokens must be less than or equal to 16000. The prompt has 13284 tokens and the max_tokens is 4096 tokens.', 'type': 'invalid_request_error', 'param': None}}
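The arithmetic in the error message points at the cause: with 13,284 prompt tokens and max_tokens=4096, the request asks for 17,380 tokens against a 16,000-token limit. A minimal sketch of the clamping logic (the function and constant names here are hypothetical, not MetaGPT's actual API) shows how large max_tokens is allowed to be for this prompt:

```python
# Hypothetical helper: clamp max_tokens so prompt + completion fit within the
# model's total context limit (16,000 tokens for yi-lightning, per the error).
CONTEXT_LIMIT = 16000

def clamp_max_tokens(prompt_tokens: int, requested_max_tokens: int) -> int:
    """Return the largest max_tokens that keeps prompt + completion <= CONTEXT_LIMIT."""
    available = CONTEXT_LIMIT - prompt_tokens
    if available <= 0:
        raise ValueError(
            f"Prompt alone ({prompt_tokens} tokens) exceeds the {CONTEXT_LIMIT}-token limit"
        )
    return min(requested_max_tokens, available)

# The failing request from the log: 13,284 prompt tokens, max_tokens=4096.
print(clamp_max_tokens(13284, 4096))  # 2716, since 13284 + 2716 == 16000
```

In practice, lowering the max-tokens value in MetaGPT's LLM configuration (e.g. in config2.yaml) to a value at or below this bound should avoid the 400 error; the exact config key may vary by MetaGPT version.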