Model Card for distilbert-mnli

Model Details

Model Description

A fine-tuned version of distilbert/distilbert-base-uncased, trained on the nyu-mll/multi_nli dataset for natural language inference (three-way classification: entailment, neutral, contradiction).

  • Developed by: Karl Weinmeister
  • Language(s) (NLP): en
  • License: apache-2.0
  • Finetuned from model: distilbert/distilbert-base-uncased
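Given the dataset's label convention (in nyu-mll/multi_nli, 0 = entailment, 1 = neutral, 2 = contradiction), here is a minimal sketch of turning the classifier's raw logits into a prediction. The logit values are illustrative placeholders, not real model output, and the label order should be verified against the model's `config.id2label` before use:

```python
import math

# Assumed label order following the nyu-mll/multi_nli convention;
# confirm against the model's config.id2label.
ID2LABEL = {0: "entailment", 1: "neutral", 2: "contradiction"}

def softmax(logits):
    """Convert raw logits to probabilities (numerically stable)."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def predict_label(logits):
    """Return (label, probability) for the highest-scoring class."""
    probs = softmax(logits)
    best = max(range(len(probs)), key=probs.__getitem__)
    return ID2LABEL[best], probs[best]

# Illustrative logits for one premise/hypothesis pair.
label, prob = predict_label([3.1, -0.4, -1.2])
```

In practice the logits would come from running the tokenized premise/hypothesis pair through the model; the mapping above is the only model-specific part.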

Training Hyperparameters

  • Training regime: The model was fine-tuned for 5 epochs with a batch size of 128.
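For context, assuming the full nyu-mll/multi_nli train split (392,702 examples) and the batch size above, the optimizer-step counts work out as follows. This is a back-of-the-envelope sketch, not logged training output:

```python
import math

TRAIN_EXAMPLES = 392_702  # size of the nyu-mll/multi_nli train split
BATCH_SIZE = 128          # per the hyperparameters above
EPOCHS = 5

steps_per_epoch = math.ceil(TRAIN_EXAMPLES / BATCH_SIZE)  # last batch is partial
total_steps = steps_per_epoch * EPOCHS
```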
Model Size

  • 67M parameters (F32 tensors, Safetensors format)


Dataset used to train kweinmeister/distilbert-mnli

  • nyu-mll/multi_nli
