leadawon committed
Commit a44f284
1 parent: a65d7fb

update model card README.md

Files changed (1):
1. README.md (+120 lines)

README.md ADDED
---
tags:
- generated_from_trainer
model-index:
- name: jeju-ko-nmt-v6-word-v1
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# jeju-ko-nmt-v6-word-v1

This model is a fine-tuned version of [leadawon/jeju-ko-nmt-v6](https://huggingface.co/leadawon/jeju-ko-nmt-v6) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6390

## Model description

More information needed

## Intended uses & limitations

More information needed

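The card does not yet include a usage snippet, so below is a minimal inference sketch rather than an official example. It assumes the checkpoint loads as a standard Transformers seq2seq translation model (Jeju dialect to standard Korean); the model class, example sentence, and generation settings are illustrative assumptions and may need adjustment.

```python
# Minimal sketch: load the checkpoint and translate one sentence.
# Assumption: the model is a standard encoder-decoder (seq2seq) checkpoint.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "leadawon/jeju-ko-nmt-v6-word-v1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Well-known Jeju greeting, used purely as an illustrative input.
text = "혼저옵서예"
inputs = tokenizer(text, return_tensors="pt")
# Beam search settings are arbitrary placeholders, not values from the card.
output_ids = model.generate(**inputs, num_beams=4, max_length=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```
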
## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (they are restated in the sketch after this list):
- learning_rate: 5e-06
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 2
- mixed_precision_training: Native AMP

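For readers who want to set up a comparable run, the sketch below restates the hyperparameters above using `Seq2SeqTrainingArguments`. It is a hedged reconstruction, not the original training script: the output directory, evaluation/logging cadence, and the per-device interpretation of the batch size are assumptions.

```python
# Hypothetical reconstruction of the reported hyperparameters with the
# Hugging Face Trainer API; only the values listed above come from the card.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="jeju-ko-nmt-v6-word-v1",  # placeholder path
    learning_rate=5e-6,
    per_device_train_batch_size=128,      # card reports train_batch_size: 128
    per_device_eval_batch_size=128,       # card reports eval_batch_size: 128
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=2,
    fp16=True,                            # "Native AMP" mixed precision
    evaluation_strategy="steps",          # assumed: matches the 500-step eval cadence below
    eval_steps=500,
    logging_steps=500,
)
```
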
### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| 0.8194 | 0.03 | 500 | 0.7499 |
| 0.7814 | 0.06 | 1000 | 0.7211 |
| 0.7559 | 0.09 | 1500 | 0.7036 |
| 0.7441 | 0.12 | 2000 | 0.7017 |
| 0.741 | 0.15 | 2500 | 0.6865 |
| 0.7329 | 0.18 | 3000 | 0.6825 |
| 0.7341 | 0.21 | 3500 | 0.6754 |
| 0.7358 | 0.24 | 4000 | 0.6671 |
| 0.723 | 0.28 | 4500 | 0.6641 |
| 0.7229 | 0.31 | 5000 | 0.6639 |
| 0.726 | 0.34 | 5500 | 0.6642 |
| 0.7219 | 0.37 | 6000 | 0.6577 |
| 0.7213 | 0.4 | 6500 | 0.6552 |
| 0.7192 | 0.43 | 7000 | 0.6627 |
| 0.7154 | 0.46 | 7500 | 0.6582 |
| 0.7108 | 0.49 | 8000 | 0.6570 |
| 0.7154 | 0.52 | 8500 | 0.6530 |
| 0.7178 | 0.55 | 9000 | 0.6601 |
| 0.7046 | 0.58 | 9500 | 0.6480 |
| 0.7127 | 0.61 | 10000 | 0.6507 |
| 0.6985 | 0.64 | 10500 | 0.6493 |
| 0.7093 | 0.67 | 11000 | 0.6500 |
| 0.7055 | 0.7 | 11500 | 0.6469 |
| 0.7061 | 0.73 | 12000 | 0.6534 |
| 0.7018 | 0.76 | 12500 | 0.6458 |
| 0.7027 | 0.8 | 13000 | 0.6424 |
| 0.7068 | 0.83 | 13500 | 0.6387 |
| 0.7044 | 0.86 | 14000 | 0.6441 |
| 0.7028 | 0.89 | 14500 | 0.6413 |
| 0.6992 | 0.92 | 15000 | 0.6359 |
| 0.7007 | 0.95 | 15500 | 0.6479 |
| 0.6887 | 0.98 | 16000 | 0.6416 |
| 0.7053 | 1.01 | 16500 | 0.6381 |
| 0.6795 | 1.04 | 17000 | 0.6391 |
| 0.6812 | 1.07 | 17500 | 0.6392 |
| 0.6826 | 1.1 | 18000 | 0.6405 |
| 0.6775 | 1.13 | 18500 | 0.6391 |
| 0.6798 | 1.16 | 19000 | 0.6378 |
| 0.6895 | 1.19 | 19500 | 0.6359 |
| 0.687 | 1.22 | 20000 | 0.6364 |
| 0.6887 | 1.25 | 20500 | 0.6357 |
| 0.6758 | 1.28 | 21000 | 0.6356 |
| 0.6732 | 1.32 | 21500 | 0.6368 |
| 0.6832 | 1.35 | 22000 | 0.6381 |
| 0.6805 | 1.38 | 22500 | 0.6347 |
| 0.6821 | 1.41 | 23000 | 0.6373 |
| 0.6828 | 1.44 | 23500 | 0.6401 |
| 0.678 | 1.47 | 24000 | 0.6402 |
| 0.6892 | 1.5 | 24500 | 0.6358 |
| 0.6855 | 1.53 | 25000 | 0.6339 |
| 0.6748 | 1.56 | 25500 | 0.6363 |
| 0.6734 | 1.59 | 26000 | 0.6361 |
| 0.6748 | 1.62 | 26500 | 0.6348 |
| 0.6812 | 1.65 | 27000 | 0.6355 |
| 0.6844 | 1.68 | 27500 | 0.6380 |
| 0.6794 | 1.71 | 28000 | 0.6393 |
| 0.6834 | 1.74 | 28500 | 0.6390 |
| 0.6843 | 1.77 | 29000 | 0.6411 |
| 0.6732 | 1.8 | 29500 | 0.6414 |
| 0.6758 | 1.84 | 30000 | 0.6395 |
| 0.6782 | 1.87 | 30500 | 0.6407 |
| 0.6787 | 1.9 | 31000 | 0.6402 |
| 0.6752 | 1.93 | 31500 | 0.6393 |
| 0.6769 | 1.96 | 32000 | 0.6392 |
| 0.6768 | 1.99 | 32500 | 0.6390 |

### Framework versions

- Transformers 4.26.0
- Pytorch 1.13.1+cu116
- Tokenizers 0.13.2

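To reproduce results against the same environment, a quick check such as the sketch below (illustrative, not part of the original card) can confirm the installed packages match the versions listed above.

```python
# Print installed versions to compare against those reported in this card.
import torch
import transformers
import tokenizers

print("transformers:", transformers.__version__)  # card reports 4.26.0
print("torch:", torch.__version__)                # card reports 1.13.1+cu116
print("tokenizers:", tokenizers.__version__)      # card reports 0.13.2
```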