Problems with questions that are too long

#16
by AxelPATRON - opened

With this code:

from transformers import BlenderbotTokenizer, BlenderbotForConditionalGeneration, Conversation, pipeline

model_name = 'facebook/blenderbot-400M-distill'
tokenizer = BlenderbotTokenizer.from_pretrained(model_name)
model = BlenderbotForConditionalGeneration.from_pretrained(model_name)
chat_pipeline = pipeline('conversational', model=model, tokenizer=tokenizer)
question = "my question"
conversation = Conversation(question)
chat_pipeline([conversation], max_length=512)
response = conversation.generated_responses[-1]

I get "sequence length is longer than the specified maximum sequence length for this model (146 > 128). Running this sequence through the model will result in indexing errors" when my question is too long. I tried changing max_length in chat_pipeline, but I'm not sure that's the right parameter to change. How could I change the code so it can handle very long questions?
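For context, the warning appears because blenderbot-400M-distill's encoder only has 128 trained position slots, so the usual fix is to truncate the input before it reaches the model (when calling the tokenizer directly, something like tokenizer(question, truncation=True, max_length=128, return_tensors='pt') would do it). A minimal sketch of the truncation idea, keeping the most recent tokens so the end of the question survives — truncate_to_limit is a hypothetical helper for illustration, not a transformers API:

```python
def truncate_to_limit(token_ids, max_len=128):
    """Keep at most max_len token ids, dropping the oldest ones first."""
    if len(token_ids) <= max_len:
        return token_ids
    return token_ids[-max_len:]


# e.g. a 146-token input (the length from the warning) gets cut to 128,
# keeping the last 128 tokens
trimmed = truncate_to_limit(list(range(146)))
```

With the pipeline as written, the same effect can be had by pre-truncating the question text (or its token ids) before building the Conversation.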

I changed it to:

config = BlenderbotConfig.from_pretrained(model_name, max_position_embeddings=1024)
tokenizer = BlenderbotTokenizer.from_pretrained(model_name)
model = BlenderbotForConditionalGeneration.from_pretrained(model_name, ignore_mismatched_sizes=True, config=config)

But the responses are no longer even intelligible. (I also switched to blenderbot-3B, by the way, but it's the same with both.)
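One likely reason for the gibberish: with ignore_mismatched_sizes=True, the enlarged position-embedding tensor no longer matches the checkpoint, so the mismatched weights are re-initialized randomly and the model sees noise for every position. A toy sketch of what growing an embedding table involves (pure Python, purely illustrative; even copying the last trained row instead of using random values still leaves positions past 128 untrained):

```python
import random


def grow_embedding_table(table, new_len, random_init=True):
    """Extend a position-embedding table (a list of row vectors) to new_len rows.

    random_init=True mimics the ignore_mismatched_sizes behavior: the extra
    rows are random noise the model was never trained on, which is why
    generation quality collapses.
    """
    dim = len(table[0])
    n_extra = new_len - len(table)
    if random_init:
        extra = [[random.gauss(0.0, 0.02) for _ in range(dim)]
                 for _ in range(n_extra)]
    else:
        # Slightly less destructive: repeat the last trained row.
        extra = [list(table[-1]) for _ in range(n_extra)]
    return table + extra
```

Either way the new positions carry no trained signal, which is consistent with the unintelligible responses; truncating the input to the trained 128-token limit avoids the problem entirely.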
