Entailment Detection by Fine-tuning BERT


  • The model in this repository is fine-tuned from BERT, Google's encoder-only transformer model.
  • New York University's Multi-NLI dataset is used for fine-tuning.
  • Accuracy achieved: ~74%


  • Notebook used for fine-tuning: here
  • N.B.: Due to computational resource constraints, only 11K samples were used for fine-tuning. Accuracy could likely improve if the model were trained on all 390K samples available in the dataset.
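
A minimal inference sketch for the model described above. This is not from the repository's notebook: it assumes the checkpoint uses the standard BERT sequence-classification head, and the three-label order shown is the usual Multi-NLI convention, which should be verified against the model's `config.json`.

```python
# Hedged sketch: entailment inference with the fine-tuned checkpoint.
# Assumes a standard sequence-classification head and Multi-NLI label order.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_ID = "ArghaKamalSamanta/ema_task_entailment"
LABELS = ["entailment", "neutral", "contradiction"]  # assumed label order


def classify(premise: str, hypothesis: str) -> str:
    """Return the predicted NLI label for a premise/hypothesis pair."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID)
    # BERT encodes the pair as [CLS] premise [SEP] hypothesis [SEP].
    inputs = tokenizer(premise, hypothesis, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits
    return LABELS[logits.argmax(dim=-1).item()]


if __name__ == "__main__":
    print(classify("A soccer game with multiple males playing.",
                   "Some men are playing a sport."))
```

The premise/hypothesis pair is passed as two arguments to the tokenizer so that BERT sees the segment boundary; passing a single concatenated string would drop that signal.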