---
pipeline_tag: zero-shot-classification
tags:
  - zero-shot-classification
  - nli
language:
  - es
datasets:
  - hackathon-pln-es/nli-es
widget:
  - text: >-
      El autor se perfila, a los 50 años de su muerte, como uno de los grandes
      de su siglo
    candidate_labels: cultura, sociedad, economia, salud, deportes
---

This is a zero-shot classifier based on bertin-roberta-base-finetuning-esnli.

## Usage (HuggingFace Transformers)

You can use this model directly with the `zero-shot-classification` pipeline from Transformers:

```python
from transformers import pipeline

classifier = pipeline(
    "zero-shot-classification",
    model="hackathon-pln-es/bertin-roberta-base-zeroshot-esnli",
)

classifier(
    "El autor se perfila, a los 50 años de su muerte, como uno de los grandes de su siglo",
    candidate_labels=["cultura", "sociedad", "economia", "salud", "deportes"],
    hypothesis_template="Este ejemplo es {}."
)
```

The `hypothesis_template` parameter is important and should be written in Spanish. In the widget on the right, this parameter is left at its default English value, "This example is {}.", so the results shown there may differ.
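For example, the two calls below (a minimal sketch reusing the `classifier` defined above) differ only in the template; the exact scores will vary, but the Spanish template keeps every hypothesis a well-formed Spanish sentence, which is closer to the model's NLI training data:

```python
# Reusing the `classifier` created above.

# Default template: hypotheses read "This example is cultura.", mixing
# English and Spanish. This is what the widget uses.
result_default = classifier(
    "El autor se perfila, a los 50 años de su muerte, como uno de los grandes de su siglo",
    candidate_labels=["cultura", "sociedad", "economia", "salud", "deportes"],
)

# Spanish template: hypotheses read "Este ejemplo es cultura.", etc.
result_es = classifier(
    "El autor se perfila, a los 50 años de su muerte, como uno de los grandes de su siglo",
    candidate_labels=["cultura", "sociedad", "economia", "salud", "deportes"],
    hypothesis_template="Este ejemplo es {}.",
)

# The pipeline returns a dict with "sequence", "labels" (sorted by score) and "scores".
print(result_default["labels"][0], result_es["labels"][0])
```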

## Training

### Dataset

We used a collection of Natural Language Inference (NLI) datasets as training data:

  - ESXNLI, only the Spanish part
  - SNLI, automatically translated
  - MultiNLI, automatically translated

The whole combined dataset is available on the Hub as hackathon-pln-es/nli-es.
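If you want to inspect the training data, it can be loaded with the `datasets` library (a minimal sketch; the configuration and column names are assumptions, check the dataset card for the exact schema):

```python
from datasets import load_dataset

# Load the combined Spanish NLI corpus (assumption: default configuration).
nli_es = load_dataset("hackathon-pln-es/nli-es")

# Inspect the available splits and a sample row; column names such as
# premise/hypothesis are assumptions, see the dataset card for the exact schema.
print(nli_es)
print(next(iter(nli_es.values()))[0])
```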

## Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 514, 'do_lower_case': False}) with Transformer model: RobertaModel
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
)
```
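For reference, a model with this architecture can be assembled from `sentence-transformers` building blocks (a minimal sketch; loading this checkpoint name into `models.Transformer` is an assumption, and it only reproduces the encoder and pooling layers, not the zero-shot pipeline usage above):

```python
from sentence_transformers import SentenceTransformer, models

# RoBERTa encoder with the max_seq_length shown above (assumption: this
# checkpoint can be loaded as a plain encoder).
word_embedding_model = models.Transformer(
    "hackathon-pln-es/bertin-roberta-base-zeroshot-esnli",
    max_seq_length=514,
)

# Mean pooling over the 768-dimensional contextualized token embeddings.
pooling_model = models.Pooling(
    word_embedding_model.get_word_embedding_dimension(),
    pooling_mode_mean_tokens=True,
    pooling_mode_cls_token=False,
    pooling_mode_max_tokens=False,
)

model = SentenceTransformer(modules=[word_embedding_model, pooling_model])
print(model)  # prints an architecture summary like the one above
```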

## Authors