
Both cli_demo.py and web_demo.py fail with "AttributeError: 'bool' object has no attribute 'encode'" #424

Open
Leo20100307 opened this issue Nov 2, 2024 · 1 comment


@Leo20100307

(baichuan2-13b) root@ubuntu:~/Baichuan2# python cli_demo.py
init model ...
You are using an old version of the checkpointing format that is deprecated (We will also silently ignore gradient_checkpointing_kwargs in case you passed it).Please update to the new format on your modeling file. To use the new format, you need to completely remove the definition of the method _set_gradient_checkpointing in your model.
Loading checkpoint shards: 100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████| 3/3 [00:13<00:00, 4.51s/it]
We've detected an older driver with an RTX 4000 series GPU. These drivers have issues with P2P. This can affect the multi-gpu inference when using accelerate device_map.Please make sure to update your driver to the latest version which resolves this.
Welcome to the Baichuan large model. Type to chat; enter vim for multi-line input, clear to clear the history, CTRL+C to interrupt generation, stream to toggle streaming generation, exit to quit.

User: 你好 (hello)

Baichuan 2:Traceback (most recent call last):
File "/root/Baichuan2/cli_demo.py", line 87, in
main()
File "/root/Baichuan2/cli_demo.py", line 69, in main
for response in model.chat(tokenizer, messages, stream=True):
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/.cache/huggingface/modules/transformers_modules/Baichuan2-13B-Chat/modeling_baichuan.py", line 816, in chat
input_ids = build_chat_input(self, tokenizer, messages, generation_config.max_new_tokens)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/.cache/huggingface/modules/transformers_modules/Baichuan2-13B-Chat/generation_utils.py", line 31, in build_chat_input
system_tokens = tokenizer.encode(system)
^^^^^^^^^^^^^^^^
AttributeError: 'bool' object has no attribute 'encode'
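
The traceback shows that the object reaching tokenizer.encode() inside build_chat_input is a bool rather than the tokenizer, which means whatever is passed as the tokenizer to model.chat() is already wrong. Below is a minimal diagnostic sketch to narrow that down; it assumes the usual demo-style setup (AutoTokenizer/AutoModelForCausalLM with trust_remote_code), and MODEL_PATH is a placeholder, not taken from this log:

```python
# Minimal diagnostic sketch, not a fix: assumes a stock Baichuan2-style setup.
# MODEL_PATH is a placeholder for wherever the Baichuan2-13B-Chat weights live.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, GenerationConfig

MODEL_PATH = "baichuan-inc/Baichuan2-13B-Chat"  # placeholder / assumption

tokenizer = AutoTokenizer.from_pretrained(
    MODEL_PATH, use_fast=False, trust_remote_code=True
)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_PATH, torch_dtype=torch.float16, device_map="auto", trust_remote_code=True
)
model.generation_config = GenerationConfig.from_pretrained(MODEL_PATH)

# The traceback says a bool reaches tokenizer.encode(), so check what is
# actually being passed before calling model.chat().
print(type(tokenizer))
assert hasattr(tokenizer, "encode"), "tokenizer has no encode(); check init_model()"

messages = [{"role": "user", "content": "你好"}]
for response in model.chat(tokenizer, messages, stream=True):
    print(response, end="", flush=True)
```

If type(tokenizer) already prints bool here, the problem is in how the tokenizer is created or returned (for example, the return order of init_model() in the demo script); if the tokenizer looks fine, a transformers version newer than the one pinned in this repo's requirements.txt is worth ruling out, since the checkpointing deprecation warning above suggests a version mismatch.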

@Leo20100307 (Author)

[screenshot attached]
