mpsb00 committed on
Commit ef2126d
1 Parent(s): 4954ee5

update model card README.md

Files changed (1):
  1. README.md +6 -8
README.md CHANGED
@@ -16,9 +16,9 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [prajjwal1/bert-tiny](https://huggingface.co/prajjwal1/bert-tiny) on the lex_glue dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.2669
-- Macro-f1: 0.4193
-- Micro-f1: 0.5573
+- Loss: 0.2487
+- Macro-f1: 0.4052
+- Micro-f1: 0.5660
 
 ## Model description
 
@@ -43,17 +43,15 @@ The following hyperparameters were used during training:
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
-- num_epochs: 2
+- num_epochs: 1
 - mixed_precision_training: Native AMP
 
 ### Training results
 
 | Training Loss | Epoch | Step | Validation Loss | Macro-f1 | Micro-f1 |
 |:-------------:|:-----:|:----:|:---------------:|:--------:|:--------:|
-| 0.2127        | 0.44  | 500  | 0.3104          | 0.3108   | 0.4418   |
-| 0.1808        | 0.89  | 1000 | 0.2961          | 0.3452   | 0.5022   |
-| 0.1762        | 1.33  | 1500 | 0.2807          | 0.3945   | 0.5273   |
-| 0.16          | 1.78  | 2000 | 0.2669          | 0.4193   | 0.5573   |
+| 0.2056        | 0.44  | 500  | 0.2846          | 0.3335   | 0.4763   |
+| 0.1698        | 0.89  | 1000 | 0.2487          | 0.4052   | 0.5660   |
 
 
 ### Framework versions
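
The card reports both Macro-f1 and Micro-f1. As a side note on how the two averages differ: a minimal pure-Python sketch (an illustration, assuming a single-label classification task for simplicity — some lex_glue tasks are multi-label, where the counts are pooled over label indicators instead):

```python
from collections import Counter

def micro_macro_f1(y_true, y_pred):
    """Compute micro- and macro-averaged F1 for single-label predictions.

    Micro-F1 pools true/false positive counts across classes (for a
    single-label task it equals accuracy); macro-F1 averages per-class
    F1 scores, so rare classes weigh as much as frequent ones.
    """
    labels = sorted(set(y_true) | set(y_pred))
    tp, fp, fn = Counter(), Counter(), Counter()
    for t, p in zip(y_true, y_pred):
        if t == p:
            tp[t] += 1
        else:
            fp[p] += 1
            fn[t] += 1
    per_class = []
    for c in labels:
        prec = tp[c] / (tp[c] + fp[c]) if tp[c] + fp[c] else 0.0
        rec = tp[c] / (tp[c] + fn[c]) if tp[c] + fn[c] else 0.0
        per_class.append(2 * prec * rec / (prec + rec) if prec + rec else 0.0)
    total_tp, total_fp, total_fn = sum(tp.values()), sum(fp.values()), sum(fn.values())
    micro_p = total_tp / (total_tp + total_fp) if total_tp + total_fp else 0.0
    micro_r = total_tp / (total_tp + total_fn) if total_tp + total_fn else 0.0
    micro = 2 * micro_p * micro_r / (micro_p + micro_r) if micro_p + micro_r else 0.0
    macro = sum(per_class) / len(per_class) if per_class else 0.0
    return micro, macro

# Toy example: class 0 dominates, class 1 is rare.
micro, macro = micro_macro_f1([0, 0, 0, 1], [0, 0, 1, 1])
```

That macro trails micro on an imbalanced label distribution (as in the table above) is expected: the rare classes' lower per-class F1 pulls the unweighted macro average down.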
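
The `lr_scheduler_type: linear` hyperparameter refers to a schedule that decays the learning rate linearly from its initial value to zero over the course of training. A minimal sketch of that decay, assuming no warmup steps and using a hypothetical initial rate and step count purely for illustration:

```python
def linear_lr(step: int, total_steps: int, base_lr: float) -> float:
    """Linearly decay base_lr to 0 over total_steps (no warmup assumed)."""
    remaining = max(0.0, (total_steps - step) / total_steps)
    return base_lr * remaining

# Hypothetical values for illustration only.
base_lr = 5e-5
lr_start = linear_lr(0, 1000, base_lr)    # full rate at the start
lr_mid = linear_lr(500, 1000, base_lr)    # half the rate midway
lr_end = linear_lr(1000, 1000, base_lr)   # decayed to zero at the end
```

In the common Transformers setup the same shape is produced by `get_linear_schedule_with_warmup`, which additionally ramps the rate up from zero over an initial warmup window before the linear decay begins.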