Zeta611 committed
Commit 183a2b3
1 Parent(s): 68005e3

update model card README.md

Files changed (1):
1. README.md +20 -14
README.md CHANGED
@@ -16,9 +16,9 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [facebook/nllb-200-distilled-1.3B](https://huggingface.co/facebook/nllb-200-distilled-1.3B) on the None dataset.
 It achieves the following results on the evaluation set:
- - Loss: 3.4596
- - Bleu: 2.6322
- - Gen Len: 5.8199
+ - Loss: 3.2720
+ - Bleu: 3.3083
+ - Gen Len: 5.882
 
 ## Model description
 
@@ -43,23 +43,29 @@ The following hyperparameters were used during training:
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
- - num_epochs: 10
+ - num_epochs: 16
 - mixed_precision_training: Native AMP
 
 ### Training results
 
 | Training Loss | Epoch | Step | Validation Loss | Bleu | Gen Len |
 |:-------------:|:-----:|:----:|:---------------:|:------:|:-------:|
- | No log | 1.0 | 31 | 4.9810 | 0.0 | 9.4099 |
- | No log | 2.0 | 62 | 4.4936 | 0.0 | 8.7516 |
- | No log | 3.0 | 93 | 4.0142 | 1.017 | 7.1801 |
- | No log | 4.0 | 124 | 3.6795 | 1.7949 | 6.2733 |
- | No log | 5.0 | 155 | 3.5729 | 2.5943 | 5.9255 |
- | No log | 6.0 | 186 | 3.5270 | 2.6178 | 5.882 |
- | No log | 7.0 | 217 | 3.4960 | 2.92 | 5.8075 |
- | No log | 8.0 | 248 | 3.4751 | 2.6168 | 5.8447 |
- | No log | 9.0 | 279 | 3.4634 | 2.6063 | 5.8385 |
- | No log | 10.0 | 310 | 3.4596 | 2.6322 | 5.8199 |
+ | No log | 1.0 | 31 | 4.9650 | 0.0 | 9.3975 |
+ | No log | 2.0 | 62 | 4.4353 | 0.1522 | 8.6957 |
+ | No log | 3.0 | 93 | 3.8967 | 1.2792 | 6.8137 |
+ | No log | 4.0 | 124 | 3.6053 | 2.6004 | 6.0062 |
+ | No log | 5.0 | 155 | 3.5239 | 2.9339 | 5.8571 |
+ | No log | 6.0 | 186 | 3.4692 | 2.6031 | 5.8261 |
+ | No log | 7.0 | 217 | 3.4244 | 2.6536 | 5.795 |
+ | No log | 8.0 | 248 | 3.3865 | 2.6445 | 5.8509 |
+ | No log | 9.0 | 279 | 3.3555 | 2.5482 | 5.9193 |
+ | No log | 10.0 | 310 | 3.3325 | 3.087 | 5.913 |
+ | No log | 11.0 | 341 | 3.3141 | 3.3511 | 5.9006 |
+ | No log | 12.0 | 372 | 3.2986 | 3.3511 | 5.8944 |
+ | No log | 13.0 | 403 | 3.2871 | 3.9871 | 5.8758 |
+ | No log | 14.0 | 434 | 3.2787 | 3.3083 | 5.882 |
+ | No log | 15.0 | 465 | 3.2738 | 3.3083 | 5.882 |
+ | No log | 16.0 | 496 | 3.2720 | 3.3083 | 5.882 |
 
 
 ### Framework versions
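
The hyperparameters in the diff correspond to a standard `Seq2SeqTrainer` setup in Hugging Face Transformers. The sketch below is a minimal, hypothetical reconstruction of that configuration: only the values visible in this diff (seed, Adam betas/epsilon, linear scheduler, 16 epochs, native AMP) come from the card itself; `output_dir`, the learning rate, batch sizes, the datasets, and the metric function are placeholders that are not shown in the changed hunks.

```python
# Minimal sketch of a Trainer setup matching the hyperparameters in this card.
# Assumptions: output_dir and the dataset/metric wiring are placeholders; only
# the values listed in the diff (seed, Adam betas/epsilon, linear scheduler,
# 16 epochs, native AMP) are taken from the card.
from transformers import (
    AutoModelForSeq2SeqLM,
    AutoTokenizer,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

# Base checkpoint named in the card; the fine-tune starts from these weights.
tokenizer = AutoTokenizer.from_pretrained("facebook/nllb-200-distilled-1.3B")
model = AutoModelForSeq2SeqLM.from_pretrained("facebook/nllb-200-distilled-1.3B")

args = Seq2SeqTrainingArguments(
    output_dir="nllb-200-distilled-1.3B-finetuned",  # placeholder name
    num_train_epochs=16,            # num_epochs: 16 (was 10 before this commit)
    seed=42,                        # seed: 42
    adam_beta1=0.9,                 # optimizer: Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,              # epsilon=1e-08
    lr_scheduler_type="linear",     # lr_scheduler_type: linear
    fp16=True,                      # mixed_precision_training: Native AMP
    predict_with_generate=True,     # required for BLEU / Gen Len during eval
    evaluation_strategy="epoch",    # assumption: per-epoch eval, as in the results table
)

# Trainer wiring left commented out because the card lists the dataset as "None":
# trainer = Seq2SeqTrainer(
#     model=model,
#     args=args,
#     train_dataset=train_dataset,      # placeholder dataset objects
#     eval_dataset=eval_dataset,
#     tokenizer=tokenizer,
#     compute_metrics=compute_metrics,  # e.g. sacrebleu-based BLEU + generation length
# )
# trainer.train()
```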