Added optional input embeddings to bypass NeoBERT.encoder
#8 by Lolalb - opened
No description provided.
Lolalb changed pull request status to merged
@Lolalb In your latest change you added the inputs_embeds parameter to the base class, but NeoBERTForSequenceClassification also needs to be updated: its forward method calls the base model's forward with positional arguments, so the new inputs_embeds field now receives a boolean value!
def forward(
    self,
    input_ids: Optional[torch.Tensor] = None,
    position_ids: torch.Tensor = None,
    max_seqlen: int = None,
    cu_seqlens: torch.Tensor = None,
    attention_mask: torch.Tensor = None,
    output_hidden_states: bool = False,
    output_attentions: bool = False,
    labels: Optional[torch.Tensor] = None,
    return_dict: Optional[bool] = None,
):
    output = self.model.forward(
        input_ids,
        position_ids,
        max_seqlen,
        cu_seqlens,
        attention_mask,
        output_hidden_states,  # <-- inputs_embeds is missing here!
        output_attentions,
    )
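To spell out what goes wrong (a minimal illustration, assuming the new inputs_embeds parameter was inserted right before output_hidden_states in the base model's forward signature, as the arrow above suggests): the positional call skips the new slot, so output_hidden_states, a plain bool, lands in inputs_embeds.

import torch

input_ids = torch.tensor([[1, 2, 3]])  # the caller did pass token ids
inputs_embeds = False                  # output_hidden_states shifted into this slot

# False is "not None", so the exclusive-or guard in the base model evaluates
# to True and raises, even though the caller never supplied embeddings.
print((input_ids is None) ^ (inputs_embeds is not None))  # True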
Because of this, the classification task always fails with:

if (input_ids is None) ^ (inputs_embeds is not None):
    raise ValueError("You must specify exactly one of input_ids or inputs_embeds")
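One way to resolve it (just a sketch, not the actual patch; it assumes the base model's forward now accepts inputs_embeds as a keyword argument) would be to accept inputs_embeds in NeoBERTForSequenceClassification.forward as well and call the base model with keyword arguments, so a signature change can no longer shift arguments into the wrong slots:

from typing import Optional
import torch

# Sketch of NeoBERTForSequenceClassification.forward with inputs_embeds
# passed through and the inner call made keyword-only.
def forward(
    self,
    input_ids: Optional[torch.Tensor] = None,
    position_ids: torch.Tensor = None,
    max_seqlen: int = None,
    cu_seqlens: torch.Tensor = None,
    attention_mask: torch.Tensor = None,
    inputs_embeds: Optional[torch.Tensor] = None,
    output_hidden_states: bool = False,
    output_attentions: bool = False,
    labels: Optional[torch.Tensor] = None,
    return_dict: Optional[bool] = None,
):
    output = self.model.forward(
        input_ids=input_ids,
        position_ids=position_ids,
        max_seqlen=max_seqlen,
        cu_seqlens=cu_seqlens,
        attention_mask=attention_mask,
        inputs_embeds=inputs_embeds,
        output_hidden_states=output_hidden_states,
        output_attentions=output_attentions,
    )
    # ... classification head and loss computation unchanged ...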