Locally Not Working

#40
by tynsharp - opened

Hello, I have installed c4ai-command-r-plus locally, but whenever I run the following code, it only prints the warning 'Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained' and produces no other output. How can I solve this issue?

Python code:

from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "CohereForAI/c4ai-command-r-plus"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

prompt = "Hello!"
generated_text = model.generate(tokenizer.encode(prompt, return_tensors="pt"))
print(generated_text[0])

Can anyone help me? :(
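
For reference (not from the thread): the special-tokens warning is normally just informational, and with a model of this size the load and generation steps can simply take a very long time or exhaust memory before anything is printed. Below is a minimal sketch of a test run that decodes the generated tokens and caps the generation length; the torch.bfloat16 and device_map="auto" settings are assumptions about the available hardware and require the accelerate package.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "CohereForAI/c4ai-command-r-plus"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.bfloat16,  # half precision to reduce memory use
    device_map="auto",           # spread weights across available devices (needs accelerate)
)

prompt = "Hello!"
input_ids = tokenizer.encode(prompt, return_tensors="pt").to(model.device)

# Keep the first test generation short so it finishes quickly.
output_ids = model.generate(input_ids, max_new_tokens=32)

# Decode the token IDs back into text instead of printing the raw tensor.
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))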

Sir, without the output of the run it is hard for me to help you.

It's like if I were dying right now but only told you that I was dying, without explaining why; then you would have no way to help me.

[",我草泥马的,你在做什么马?"]
