Increase max_position_embeddings?
#5 · opened by schneeman
I have some longer texts that I'd like to embed, and I'm finding that I'm bumping up against the default 512-token limit. I'm trying to configure the model like so:
```python
from transformers import AlignModel, AlignProcessor

processor = AlignProcessor.from_pretrained("kakaobrain/align-base")
model = AlignModel.from_pretrained("kakaobrain/align-base")
# Attempt to raise the text tower's position limit after loading
model.config.text_config.max_position_embeddings = 2048
```
and I'm finding that the 2048 is not honored, which tells me I'm probably doing it wrong. Specifically, the error I get is:
```
    embeddings = inputs_embeds + token_type_embeddings
    if self.position_embedding_type == "absolute":
        position_embeddings = self.position_embeddings(position_ids)
>       embeddings += position_embeddings
E   RuntimeError: The size of tensor a (704) must match the size of tensor b (512) at non-singleton dimension 1
```
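For context on why setting the attribute after loading has no effect: the position-embedding table is built at `from_pretrained` time from the checkpoint's config (512 rows), so changing `max_position_embeddings` afterwards doesn't resize the already-created embedding layer. A minimal plain-Python sketch of the mismatch (the 704 here is the tokenized length from the error above; no transformers needed):

```python
# The pretrained table has max_position_embeddings = 512 rows; position ids
# run 0..seq_len-1, so a 704-token input indexes past the end of the table.
max_position_embeddings = 512   # rows in the pretrained position table
seq_len = 704                   # tokenized length of the long text

position_ids = list(range(seq_len))
out_of_range = [i for i in position_ids if i >= max_position_embeddings]
print(len(out_of_range))  # 192 position ids have no pretrained embedding
```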
Ultimately, I'd like to use the pretrained kakaobrain/align-base with a longer context size. Is this possible?
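One way to at least get the model to *load* with a larger table is to modify the config before `from_pretrained` and pass `ignore_mismatched_sizes=True` so the 512-row checkpoint weights don't block instantiation. This is a hedged sketch, not an endorsed recipe: the rows beyond 512 are randomly initialized, so embeddings for long inputs will be unreliable without further fine-tuning.

```python
from transformers import AlignConfig, AlignModel, AlignProcessor

# Load the config first and enlarge the text tower's position table
config = AlignConfig.from_pretrained("kakaobrain/align-base")
config.text_config.max_position_embeddings = 2048

processor = AlignProcessor.from_pretrained("kakaobrain/align-base")
# ignore_mismatched_sizes=True lets the 512-row pretrained table be skipped
# for the now-2048-row layer; positions >= 512 are randomly initialized.
model = AlignModel.from_pretrained(
    "kakaobrain/align-base",
    config=config,
    ignore_mismatched_sizes=True,
)
```

The alternative people usually reach for with BERT-style absolute position embeddings is chunking long texts to 512 tokens and pooling the per-chunk embeddings, which keeps the pretrained weights intact.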
schneeman changed discussion status to closed