Forturne committed on
Commit f4430bd
1 Parent(s): 71ac7c9

update model card README.md

Files changed (1)
  1. README.md +6 -5
README.md CHANGED
@@ -1,5 +1,6 @@
 ---
 license: cc-by-sa-4.0
+base_model: klue/bert-base
 tags:
 - generated_from_trainer
 datasets:
@@ -16,7 +17,7 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [klue/bert-base](https://huggingface.co/klue/bert-base) on the klue dataset.
 It achieves the following results on the evaluation set:
-- Loss: 1.2016
+- Loss: 1.0738
 
 ## Model description
 
@@ -36,8 +37,8 @@ More information needed
 
 The following hyperparameters were used during training:
 - learning_rate: 2e-05
-- train_batch_size: 64
-- eval_batch_size: 64
+- train_batch_size: 16
+- eval_batch_size: 16
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
@@ -47,12 +48,12 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss |
 |:-------------:|:-----:|:----:|:---------------:|
-| 1.7356        | 1.0   | 588  | 1.2016          |
+| 1.1427        | 1.0   | 2350 | 1.0738          |
 
 
 ### Framework versions
 
-- Transformers 4.30.2
+- Transformers 4.31.0
 - Pytorch 2.0.1+cu118
 - Datasets 2.13.1
 - Tokenizers 0.13.3
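
For reference, a minimal sketch (not part of the commit) of how the post-commit hyperparameters might be expressed with the Hugging Face `Trainer` API. The output directory, the task head (sequence classification), and the epoch count are assumptions; the diff does not name the task or the full training configuration:

```python
# Sketch only: mirrors the hyperparameters listed in the updated model card.
# Repo/output names and the classification head are assumptions, not stated in the diff.
from transformers import (
    AutoTokenizer,
    AutoModelForSequenceClassification,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("klue/bert-base")
model = AutoModelForSequenceClassification.from_pretrained("klue/bert-base")

args = TrainingArguments(
    output_dir="bert-base-finetuned-klue",   # assumed name
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=1,                       # the results table reports epoch 1.0
)
```

The optimizer line in the card ("Adam with betas=(0.9,0.999) and epsilon=1e-08") matches the `Trainer` default AdamW settings, so no explicit optimizer argument should be needed under these assumptions.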