Llama 2 model outputs weird symbols when running on device mps


import os

from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "meta-llama/Llama-2-7b-chat-hf"


tokenizer = AutoTokenizer.from_pretrained(model_id, token=os.environ['HF_TOKEN'])
model = AutoModelForCausalLM.from_pretrained(model_id, device_map='auto', token=os.environ['HF_TOKEN'])

text = "Instruct: Quote: Imagination is more. From:"
device = 'mps'
model.to(device)
inputs = tokenizer(text, return_tensors="pt").to(device)
print(inputs)
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
print(outputs)

>>> Instruct: Quote: Imagination is more. From: RalphЉЪ,2ЪurlsO0\\.ЋO0OЉOO


I get weird symbols like "RalphЉЪ,2ЪurlsO0\\.ЋO0OЉOO" when running on mps, but the same code works fine on cpu. I was running on a Mac Studio with an M2 Ultra and 128 GB of RAM.
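As a sanity check, the cpu/mps mismatch can be reproduced with deterministic greedy decoding and a token-level comparison (a minimal sketch; it uses the small `gpt2` checkpoint as a stand-in so no access token is needed — swap in the Llama-2 model id to match the setup above):

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "gpt2"  # small stand-in; replace with "meta-llama/Llama-2-7b-chat-hf" if you have access
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

text = "Imagination is more"
inputs = tokenizer(text, return_tensors="pt")

# Greedy decoding is deterministic, so cpu and mps should produce the
# same token ids (up to floating-point noise at logit ties).
out_cpu = model.generate(**inputs, max_new_tokens=10, do_sample=False)

if torch.backends.mps.is_available():
    model = model.to("mps")
    inputs_mps = {k: v.to("mps") for k, v in inputs.items()}
    out_mps = model.generate(**inputs_mps, max_new_tokens=10, do_sample=False)
    # False here confirms the garbling is device-specific, not a tokenizer issue
    print(out_cpu[0].tolist() == out_mps[0].cpu().tolist())
```

If the two token sequences diverge, the tokenizer and prompt can be ruled out and the problem narrowed to the MPS backend or the environment it runs in.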


There is 1 answer below

Answered by user23719417:

It turns out the problem was with Anaconda: after switching to a venv created with the system Python, it works properly now.
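The fix above can be sketched as follows (a minimal sketch; the venv path and package list are illustrative assumptions, not from the original post):

```shell
# Create a virtual environment with the system Python instead of Anaconda
python3 -m venv ~/llama-venv
source ~/llama-venv/bin/activate

# Install the stack fresh inside the venv
pip install --upgrade pip
pip install torch transformers accelerate

# Confirm the MPS backend is visible from this environment
python -c "import torch; print(torch.backends.mps.is_available())"
```

Re-running the generation script from inside this environment isolates it from Anaconda's own PyTorch/BLAS builds, which is what resolved the garbled output here.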