Can't get model to run with example on model card

#2
by erlingh - opened

I'm having some issues getting the model to run; the example usage on the model card does not work. Aside from the import error (it imports from modeling_nor*bert* instead of modeling_nor*t5*), there also seems to be something wrong with how the script interfaces with the standard Hugging Face API.

I get the message AttributeError: 'function' object has no attribute 'forward'. I believe this is because the get_encoder function in NorT5Model returns the get_encoder_output function rather than a module object. I've tried writing an encoder wrapper around the Encoder class that implements get_encoder_output as its forward function (roughly like the sketch below), but then I run into a shape problem with input_shape, so maybe this is the wrong approach. Any help would be appreciated.
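For reference, this is roughly what my wrapper looked like. It's only a minimal sketch: get_encoder_output comes from modeling_nort5, but the forward signature (input_ids, attention_mask) is my assumption about what generate() passes in.

import torch.nn as nn

class EncoderWrapper(nn.Module):
    def __init__(self, model):
        super().__init__()
        self.model = model  # the underlying NorT5Model

    def forward(self, input_ids=None, attention_mask=None, **kwargs):
        # Delegate to the function that get_encoder() currently returns,
        # so that generate() sees a module with a real forward method
        return self.model.get_encoder_output(input_ids, attention_mask)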

Here is the code I'm running, copied from the example.

import torch
from transformers import AutoTokenizer
from modeling_nort5 import NorT5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("path/to/model")
model = NorT5ForConditionalGeneration.from_pretrained("path/to/model")

# LANGUAGE MODELING

sentence = "Brukseksempel: Elektrisk oppvarming. Definisjonen på ordet oppvarming er[MASK_0]."

input_tensor = tokenizer(sentence, return_tensors="pt").input_ids
output_tensor = model.generate(input_tensor, decoder_start_token_id=7, eos_token_id=8)
print(tokenizer.decode(output_tensor.squeeze(), skip_special_tokens=True))
# should output:  å varme opp

Hei, I have tried the same thing and I'm getting the same error message.
AttributeError: 'function' object has no attribute 'forward'

Language Technology Group (University of Oslo) org

Hi, thanks a lot for raising this issue! There was indeed a compatibility problem with the newest Hugging Face version; it should be fixed now :)
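To pick up the fix, update your local copy of modeling_nort5.py from the repository. Alternatively, loading through the Auto classes with trust_remote_code=True should also work — a sketch, assuming the repository registers its custom classes for the Auto API:

from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# trust_remote_code=True lets transformers load the repo's own NorT5 code,
# so the local "from modeling_nort5 import ..." line is no longer needed
tokenizer = AutoTokenizer.from_pretrained("path/to/model")
model = AutoModelForSeq2SeqLM.from_pretrained("path/to/model", trust_remote_code=True)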

davda54 changed discussion status to closed
