tlapusan committed
Commit ef6aef1 · 1 Parent(s): 4e64a1a

update model card README.md

Files changed (1)
  1. README.md +6 -8
README.md CHANGED
@@ -2,8 +2,6 @@
 license: apache-2.0
 tags:
 - generated_from_trainer
-datasets:
-- imdb
 model-index:
 - name: distilbert-base-uncased-finetuned-imdb
   results: []
@@ -14,9 +12,9 @@ should probably proofread and complete it, then remove this comment. -->

 # distilbert-base-uncased-finetuned-imdb

-This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the imdb dataset.
+This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the None dataset.
 It achieves the following results on the evaluation set:
-- Loss: 2.4028
+- Loss: 2.1639

 ## Model description

@@ -48,14 +46,14 @@ The following hyperparameters were used during training:

 | Training Loss | Epoch | Step | Validation Loss |
 |:-------------:|:-----:|:----:|:---------------:|
-| 2.6323        | 1.0   | 313  | 2.4334          |
-| 2.5176        | 2.0   | 626  | 2.3852          |
-| 2.4864        | 3.0   | 939  | 2.3920          |
+| 2.7695        | 1.0   | 90   | 2.3614          |
+| 2.3627        | 2.0   | 180  | 2.1959          |
+| 2.227         | 3.0   | 270  | 2.1313          |


 ### Framework versions

-- Transformers 4.26.0
+- Transformers 4.26.1
 - Pytorch 1.13.1+cu116
 - Datasets 2.9.0
 - Tokenizers 0.13.2
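
The card above describes a masked-language-model fine-tune of distilbert-base-uncased, so the checkpoint can be queried with a fill-mask pipeline. A minimal sketch follows; the repo id `tlapusan/distilbert-base-uncased-finetuned-imdb` is an assumption pieced together from the committer name and the model name in the card, not something the diff confirms.

```python
# Minimal sketch: load the fine-tuned masked-LM checkpoint and run fill-mask inference.
# NOTE: the repo id below is an assumption (committer namespace + model name from the card).
from transformers import pipeline

fill_mask = pipeline(
    "fill-mask",
    model="tlapusan/distilbert-base-uncased-finetuned-imdb",
)

# DistilBERT (uncased) uses [MASK] as its mask token.
for prediction in fill_mask("This movie was an absolute [MASK]."):
    print(prediction["token_str"], round(prediction["score"], 4))
```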