---
language: en
thumbnail:
tags:
- pytorch
- text-classification
datasets:
- MNLI
---

# distilbert-base-uncased finetuned on MNLI

## Model Details and Training Data

We used the pretrained [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) model and fine-tuned it on the [MultiNLI](https://cims.nyu.edu/~sbowman/multinli/) dataset.

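To try the model, something like the following minimal sketch should work; the repo id below is a placeholder for wherever this fine-tuned checkpoint is hosted, and the label names are read from the model's own config.

```python
# Minimal inference sketch (the repo id is a placeholder for this model's
# actual location on the Hub or on disk).
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "your-username/distilbert-base-uncased-mnli"  # placeholder

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id).eval()

premise = "A soccer game with multiple males playing."
hypothesis = "Some men are playing a sport."

# Premise/hypothesis pairs are truncated to the 128-token limit used in training.
inputs = tokenizer(premise, hypothesis, truncation=True, max_length=128, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

print(model.config.id2label[logits.argmax(dim=-1).item()])
```
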
The training parameters were the same as in [Devlin et al., 2019](https://arxiv.org/abs/1810.04805): learning rate = 2e-5, training epochs = 3, max_sequence_len = 128, and batch_size = 32.

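A hedged sketch of that setup with the `Trainer` API is shown below. Column names follow the Hugging Face `multi_nli` dataset; the actual training script behind this card may have differed in details such as warmup, evaluation strategy, or seed.

```python
# Fine-tuning sketch using the hyperparameters stated above
# (lr = 2e-5, 3 epochs, max_sequence_len = 128, batch_size = 32).
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=3  # entailment / neutral / contradiction
)

dataset = load_dataset("multi_nli")

def tokenize(batch):
    # Encode premise/hypothesis pairs, truncated to 128 tokens.
    return tokenizer(batch["premise"], batch["hypothesis"], truncation=True, max_length=128)

tokenized = dataset.map(tokenize, batched=True)

args = TrainingArguments(
    output_dir="distilbert-mnli",
    learning_rate=2e-5,
    num_train_epochs=3,
    per_device_train_batch_size=32,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["validation_matched"],
    tokenizer=tokenizer,
)
trainer.train()
```
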
## Evaluation Results

Evaluation results on the matched and mismatched sections of MNLI are shown in the table below.

| Test Corpus | Accuracy |
|:-----------:|:--------:|
| Matched     | 0.8223   |
| Mismatched  | 0.8216   |
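
As a rough way to reproduce numbers of this kind, the sketch below scores the fine-tuned checkpoint on the labeled `validation_matched` and `validation_mismatched` splits of the Hugging Face `multi_nli` dataset. The repo id is again a placeholder, and the checkpoint's label order is assumed to match the dataset's.

```python
# Accuracy on the matched / mismatched MNLI splits (placeholder repo id).
import torch
from datasets import load_dataset
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "your-username/distilbert-base-uncased-mnli"  # placeholder
device = "cuda" if torch.cuda.is_available() else "cpu"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id).to(device).eval()

def accuracy(split):
    data = load_dataset("multi_nli", split=split)
    correct = 0
    for start in range(0, len(data), 32):
        batch = data[start : start + 32]
        enc = tokenizer(
            batch["premise"], batch["hypothesis"],
            truncation=True, max_length=128, padding=True, return_tensors="pt",
        ).to(device)
        with torch.no_grad():
            preds = model(**enc).logits.argmax(dim=-1).cpu()
        correct += (preds == torch.tensor(batch["label"])).sum().item()
    return correct / len(data)

print("matched:", accuracy("validation_matched"))
print("mismatched:", accuracy("validation_mismatched"))
```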