Error while running simple example PLEASE help
Hi guys,
I have looked everywhere but cannot seem to resolve this. My code is:
```python
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bigscience/bloom-560m")
model = AutoModel.from_pretrained("bigscience/bloom-560m")

prompt = "It was a dark and stormy night"
result_length = 50
inputs = tokenizer(prompt, return_tensors="pt")
tokenout = model.generate(inputs["input_ids"], max_length=result_length)[0]
```
I get the error below on the `model.generate(...)` line:

```
Exception has occurred: AttributeError
'BaseModelOutputWithPastAndCrossAttentions' object has no attribute 'logits'
  File "C:\TF_Test\BLOOM\brewstory.py", line 13, in <module>
    tokenout = model.generate(inputs["input_ids"], max_length=result_length)[0]
```
Any suggestions are welcome. The same code works just fine with the GPT-2 model, so I assume there are some additional parameters I'm not setting properly?
For generation, you want to use the model with the language modelling head 😇 `AutoModel` loads the bare transformer, whose forward pass returns hidden states but no `logits` — which is exactly what the error is complaining about. Use `AutoModelForCausalLM` instead:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("bigscience/bloom-560m")
model = AutoModelForCausalLM.from_pretrained("bigscience/bloom-560m")

prompt = "It was a dark and stormy night"
result_length = 50
inputs = tokenizer(prompt, return_tensors="pt")
tokenout = model.generate(inputs["input_ids"], max_length=result_length)
```
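To actually read the continuation, you still need to turn the generated token IDs back into text with the tokenizer. Here's a minimal end-to-end sketch of the fixed version (same model as above; `skip_special_tokens` is optional and just drops EOS/padding tokens from the decoded string):

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("bigscience/bloom-560m")
model = AutoModelForCausalLM.from_pretrained("bigscience/bloom-560m")

inputs = tokenizer("It was a dark and stormy night", return_tensors="pt")

# generate() returns a tensor of token IDs, shape (batch, sequence_length)
tokenout = model.generate(inputs["input_ids"], max_length=50)

# Decode the first (and only) sequence in the batch back into a string.
# With a causal LM the output starts with the prompt itself.
text = tokenizer.decode(tokenout[0], skip_special_tokens=True)
print(text)
```

Note that `generate()` works here precisely because the causal-LM head produces `logits` over the vocabulary at each step, which the bare `AutoModel` does not.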