Model Series
Qwen2.5
What are the models used?
Qwen2.5-7B-Instruct
What is the scenario where the problem happened?
Translation
Is this badcase known and can it be solved using available techniques?
Information about environment
Operating system: Windows 10
Python version: Python 3.11
GPU: 2080 Ti
NVIDIA driver: 560.94 (NVIDIA-SMI 560.94, Driver Version 560.94)
CUDA compiler: 12.6
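As a quick sanity check of this environment (an editorial sketch, not part of the original report), one can confirm that PyTorch sees the GPU, its CUDA build, and the available VRAM, which matters when loading a 7B model in 8-bit on an 11 GB card:

import torch

# Minimal environment check using standard PyTorch CUDA APIs.
print(torch.__version__, torch.version.cuda)   # PyTorch build and the CUDA version it was compiled with
print(torch.cuda.is_available())               # True if the driver and runtime are usable
print(torch.cuda.get_device_name(0))           # e.g. the 2080 Ti reported above
print(torch.cuda.get_device_properties(0).total_memory / 1024**3, "GiB VRAM")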
Description
I used the code below to test whether Qwen2.5-7B-Instruct translation is really as strong as claimed, and some of the texts could not be translated at all. The texts below were taken at random from a few games.
German:
Italian:
....... there are many more, but I won't test them all.
The code is:

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_cache_dir = r"C:\model_cache"
model_name = "Qwen/Qwen2.5-7B-Instruct"
tokenizer = AutoTokenizer.from_pretrained(model_name, cache_dir=model_cache_dir)
quantization_config = BitsAndBytesConfig(
    load_in_8bit=True
)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    cache_dir=model_cache_dir,
    quantization_config=quantization_config,  # quantization settings passed via the newer argument
    device_map="auto"
)
# .................
device = torch.device("cuda:0")
# input_str holds the target language, input_text the text to translate
messages = [
    {"role": "system", "content": "You are a translator who can only translate the content I provide and won't have other chatting ideas.Only the translation content will be returned, no prompt.If it cannot be translated or there are translation errors, please return the original content intact without changing it."},
    {"role": "user", "content": f"Translate into {input_str}: {input_text}"}
]
text = tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True
)
inputs = tokenizer([text], return_tensors="pt").to(device)
outputs = model.generate(
    **inputs,
    max_new_tokens=512  # ,max_length=10000
)
decoded_output = tokenizer.decode(outputs[0])  # note: includes the prompt and special tokens
My question is: are there plans to improve language translation?
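A side note on the reproduction above (an editorial sketch, not part of the reporter's code): tokenizer.decode(outputs[0]) returns the chat-template prompt together with the reply, which can make the translation itself hard to see. Assuming the inputs and outputs variables from the code above, only the newly generated tokens can be decoded like this:

# Drop the prompt tokens so only the model's reply is decoded.
prompt_length = inputs.input_ids.shape[1]
generated_ids = outputs[0][prompt_length:]
translation = tokenizer.decode(generated_ids, skip_special_tokens=True)
print(translation)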