# Model Card for distilbert-snli
## Model Details

### Model Description
A fine-tuned version of distilbert/distilbert-base-uncased for natural language inference, trained on the stanford-nlp/snli dataset.
- Developed by: Karl Weinmeister
- Language(s) (NLP): en
- License: apache-2.0
- Finetuned from model: distilbert/distilbert-base-uncased
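Since the card does not include a usage snippet, here is a minimal inference sketch. It assumes the checkpoint is published on the Hub as `kweinmeister/distilbert-snli` and uses the standard three-way SNLI label scheme (entailment / neutral / contradiction); the label order shown follows the stanford-nlp/snli dataset but should be checked against the model's own config.

```python
from transformers import pipeline

# Label ids as defined in the stanford-nlp/snli dataset; verify against
# this model's id2label config before relying on the order.
SNLI_ID2LABEL = {0: "entailment", 1: "neutral", 2: "contradiction"}

def predict(premise: str, hypothesis: str,
            model: str = "kweinmeister/distilbert-snli"):
    clf = pipeline("text-classification", model=model)
    # Passing the pair as text / text_pair lets the tokenizer insert the
    # [SEP] token between premise and hypothesis, matching NLI training.
    return clf({"text": premise, "text_pair": hypothesis})

if __name__ == "__main__":
    print(predict("A man is playing a guitar.", "A person makes music."))
```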
## Training Hyperparameters
- Training regime: the model was trained for 5 epochs with a batch size of 128.
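As a back-of-the-envelope check, the stated regime implies the following optimizer step count. The SNLI train split size of 550,152 pairs is taken from the dataset's published statistics; the effective count may differ slightly if examples without a gold label were filtered out.

```python
import math

TRAIN_EXAMPLES = 550_152  # SNLI train split (published figure; may vary after filtering)
BATCH_SIZE = 128
EPOCHS = 5

steps_per_epoch = math.ceil(TRAIN_EXAMPLES / BATCH_SIZE)
total_steps = steps_per_epoch * EPOCHS
print(steps_per_epoch, total_steps)  # 4299 steps per epoch, 21495 total
```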