Model Card for kweinmeister/distilbert-mnli

Model Details

Model Description

A fine-tuned version of distilbert/distilbert-base-uncased, trained on the nyu-mll/multi_nli dataset.

  • Developed by: Karl Weinmeister
  • Language(s) (NLP): en
  • License: apache-2.0
  • Finetuned from model: distilbert/distilbert-base-uncased
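The model can be used for natural language inference with the transformers library. The sketch below is an assumption about standard usage, not taken from the card: it loads the checkpoint named above and maps the highest-scoring logit to a label via the model's `id2label` config. The fallback label order shown in the comment follows the usual MultiNLI convention and may differ for this checkpoint.

```python
def logits_to_label(logits, id2label):
    """Map raw classifier logits to the highest-scoring label name."""
    best = max(range(len(logits)), key=lambda i: logits[i])
    return id2label[best]

def predict_nli(premise, hypothesis, model_name="kweinmeister/distilbert-mnli"):
    # Heavy dependencies imported lazily so logits_to_label stays dependency-free.
    from transformers import AutoModelForSequenceClassification, AutoTokenizer
    import torch

    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForSequenceClassification.from_pretrained(model_name)
    inputs = tokenizer(premise, hypothesis, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits[0].tolist()
    # id2label is stored in the model config, e.g.
    # {0: "entailment", 1: "neutral", 2: "contradiction"} for MNLI.
    return logits_to_label(logits, model.config.id2label)
```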

Training Hyperparameters

  • Training regime: The model was trained for 5 epochs with batch size 128.
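A minimal sketch of the fine-tuning run described above. Only the epoch count and batch size come from this card; the tokenization and Trainer wiring are standard transformers usage and are assumptions, as are all other hyperparameters (learning rate, optimizer, etc.), which the card does not state.

```python
# Hyperparameters stated on the card: 5 epochs, batch size 128.
HYPERPARAMS = {"num_train_epochs": 5, "per_device_train_batch_size": 128}

def fine_tune(output_dir="distilbert-mnli"):
    # Heavy dependencies imported lazily.
    from datasets import load_dataset
    from transformers import (AutoModelForSequenceClassification,
                              AutoTokenizer, Trainer, TrainingArguments)

    tokenizer = AutoTokenizer.from_pretrained("distilbert/distilbert-base-uncased")
    dataset = load_dataset("nyu-mll/multi_nli")

    def tokenize(batch):
        # NLI inputs are (premise, hypothesis) sentence pairs.
        return tokenizer(batch["premise"], batch["hypothesis"], truncation=True)

    tokenized = dataset.map(tokenize, batched=True)
    model = AutoModelForSequenceClassification.from_pretrained(
        "distilbert/distilbert-base-uncased", num_labels=3)  # MNLI has 3 classes
    trainer = Trainer(
        model=model,
        args=TrainingArguments(output_dir=output_dir, **HYPERPARAMS),
        train_dataset=tokenized["train"],
        eval_dataset=tokenized["validation_matched"],
        tokenizer=tokenizer,
    )
    trainer.train()
```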
Model Size

  • 67M parameters, F32 tensors (Safetensors format)
