vinaykudari committed
Commit: 68a256d
Parent: 7d0eb9d

update model card README.md

Files changed (1):
  1. README.md +15 -4
README.md CHANGED
@@ -15,6 +15,8 @@ should probably proofread and complete it, then remove this comment. -->
 # t5-ft-billsum
 
 This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on the billsum dataset.
+It achieves the following results on the evaluation set:
+- Loss: 2.2752
 
 ## Model description
 
@@ -34,19 +36,28 @@ More information needed
 
 The following hyperparameters were used during training:
 - learning_rate: 2e-05
-- train_batch_size: 8
-- eval_batch_size: 8
+- train_batch_size: 10
+- eval_batch_size: 10
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
-- num_epochs: 1
+- num_epochs: 10
 - mixed_precision_training: Native AMP
 
 ### Training results
 
 | Training Loss | Epoch | Step | Validation Loss |
 |:-------------:|:-----:|:----:|:---------------:|
-| No log        | 1.0   | 124  | 2.6302          |
+| No log        | 1.0   | 99   | 2.6250          |
+| No log        | 2.0   | 198  | 2.4587          |
+| No log        | 3.0   | 297  | 2.3865          |
+| No log        | 4.0   | 396  | 2.3431          |
+| No log        | 5.0   | 495  | 2.3226          |
+| 2.7775        | 6.0   | 594  | 2.3019          |
+| 2.7775        | 7.0   | 693  | 2.2882          |
+| 2.7775        | 8.0   | 792  | 2.2802          |
+| 2.7775        | 9.0   | 891  | 2.2764          |
+| 2.7775        | 10.0  | 990  | 2.2752          |
 
 
 ### Framework versions
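
The updated hyperparameters map directly onto a `Seq2SeqTrainingArguments` configuration in 🤗 Transformers. Below is a minimal sketch of a training script consistent with the card; the billsum split (`ca_test` with an 80/20 train/test split), the `text`/`summary` column names, the sequence lengths, the output directory name, and per-epoch evaluation are assumptions for illustration, not details taken from the commit.

```python
# A minimal sketch of a fine-tuning setup consistent with the updated card.
# Split choice, column names, max lengths, and output_dir are assumptions.
from datasets import load_dataset
from transformers import (
    AutoModelForSeq2SeqLM,
    AutoTokenizer,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

checkpoint = "t5-small"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

# Assumed: the small "ca_test" subset of billsum, split 80/20 for train/eval.
billsum = load_dataset("billsum", split="ca_test").train_test_split(test_size=0.2, seed=42)

def preprocess(batch):
    # t5 is a text-to-text model and expects a task prefix for summarization.
    inputs = ["summarize: " + doc for doc in batch["text"]]
    model_inputs = tokenizer(inputs, max_length=1024, truncation=True)
    labels = tokenizer(text_target=batch["summary"], max_length=128, truncation=True)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized = billsum.map(preprocess, batched=True, remove_columns=billsum["train"].column_names)

args = Seq2SeqTrainingArguments(
    output_dir="t5-ft-billsum",
    learning_rate=2e-5,              # learning_rate: 2e-05
    per_device_train_batch_size=10,  # train_batch_size: 10 (single device assumed)
    per_device_eval_batch_size=10,   # eval_batch_size: 10
    num_train_epochs=10,             # num_epochs: 10
    seed=42,                         # seed: 42
    lr_scheduler_type="linear",      # lr_scheduler_type: linear
    fp16=True,                       # mixed_precision_training: Native AMP
    evaluation_strategy="epoch",     # per-epoch eval, matching the results table
                                     # (newer Transformers releases rename this to eval_strategy)
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the Trainer defaults.
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["test"],
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
    tokenizer=tokenizer,
)
trainer.train()
```

As a sanity check on the step counts: 99 optimization steps per epoch at batch size 10 implies roughly 990 training examples, and the earlier run's 124 steps at batch size 8 (⌈990 / 8⌉ = 124) points to the same data, which is consistent with an 80/20 split of billsum's `ca_test` subset; the card itself does not state which split was used.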