nishita committed on
Commit 1ce3094
1 Parent(s): 4cf090f

update model card README.md

Files changed (1)
  1. README.md +8 -8
README.md CHANGED
@@ -1,5 +1,5 @@
 ---
-license: apache-2.0
+license: mit
 tags:
 - generated_from_trainer
 model-index:
@@ -12,7 +12,7 @@ should probably proofread and complete it, then remove this comment. -->
 
 # results
 
-This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on an unknown dataset.
+This model is a fine-tuned version of [gagan3012/k2t](https://huggingface.co/gagan3012/k2t) on an unknown dataset.
 
 ## Model description
 
@@ -32,8 +32,8 @@ More information needed
 
 The following hyperparameters were used during training:
 - learning_rate: 2e-05
-- train_batch_size: 16
-- eval_batch_size: 16
+- train_batch_size: 32
+- eval_batch_size: 32
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
@@ -41,14 +41,14 @@ The following hyperparameters were used during training:
 
 ### Training results
 
-| Training Loss | Epoch | Step | Validation Loss |
-|:-------------:|:-----:|:----:|:---------------:|
-| No log        | 1.0   | 7    | 2.8480          |
+| Training Loss | Epoch | Step | Validation Loss | Rouge1  | Rouge2  | Rougel  | Rougelsum | Gen Len |
+|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:-------:|
+| No log        | 1.0   | 4    | 0.5771          | 64.8908 | 45.4389 | 57.3295 | 58.0193   | 17.72   |
 
 
 ### Framework versions
 
 - Transformers 4.20.1
-- Pytorch 1.12.0+cpu
+- Pytorch 1.12.0+cu113
 - Datasets 2.3.2
 - Tokenizers 0.12.1
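
For reference, the sketch below shows one way the hyperparameters listed in the updated card could map onto a `Seq2SeqTrainingArguments` setup in Transformers 4.20. The actual training script and dataset are not part of this commit, so the `output_dir`, the dataset wiring, and `num_train_epochs=1` (inferred from the results table showing epoch 1.0) are assumptions for illustration only.

```python
# Hedged sketch of a training setup matching the card's hyperparameters.
# Everything not listed in the card (output_dir, datasets, epoch count)
# is a placeholder or an assumption, not the author's actual script.
from transformers import (
    AutoTokenizer,
    AutoModelForSeq2SeqLM,
    Seq2SeqTrainingArguments,
)

model_name = "gagan3012/k2t"  # base checkpoint named in the updated card
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

args = Seq2SeqTrainingArguments(
    output_dir="results",              # placeholder; matches the card's title only
    learning_rate=2e-5,
    per_device_train_batch_size=32,    # train_batch_size: 32
    per_device_eval_batch_size=32,     # eval_batch_size: 32
    seed=42,
    adam_beta1=0.9,                    # Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,                 # and epsilon=1e-08
    lr_scheduler_type="linear",
    num_train_epochs=1,                # assumed from the results table (epoch 1.0)
    predict_with_generate=True,        # needed for ROUGE / Gen Len style evaluation
)

# A Seq2SeqTrainer would then be constructed with these args plus the
# (unspecified) train and eval datasets before calling trainer.train().
```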