pocper1 committed
Commit a92d389
Parent: c5c5985

End of training

Files changed (3)
  1. README.md +133 -0
  2. generation_config.json +5 -0
  3. pytorch_model.bin +1 -1
README.md ADDED
@@ -0,0 +1,133 @@
---
license: gpl-3.0
base_model: ckiplab/bert-base-chinese
tags:
- generated_from_trainer
model-index:
- name: bert_model-1
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# bert_model-1

This model is a fine-tuned version of [ckiplab/bert-base-chinese](https://huggingface.co/ckiplab/bert-base-chinese) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 1.4135
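
Any usage example is necessarily speculative here, since the card does not state the training objective or where the checkpoint is hosted. A minimal loading sketch, assuming the hypothetical Hub id `pocper1/bert_model-1` and a masked-language-modeling head (neither is confirmed by the card):

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline

# Hypothetical repo id; substitute the actual Hub path of this checkpoint.
repo_id = "pocper1/bert_model-1"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForMaskedLM.from_pretrained(repo_id)  # assumes an MLM head

# Fill-mask is only meaningful if the fine-tuning objective really was MLM,
# which this card does not confirm.
fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)
print(fill_mask("今天天氣真[MASK]。"))
```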

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (reconstructed as a `TrainingArguments` sketch after this list):
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 100
- num_epochs: 1
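
A minimal sketch mapping these values onto `transformers.TrainingArguments`; `output_dir` and the evaluation cadence are assumptions (the 100-step eval interval is inferred from the results table below), not values stated by the card:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="bert_model-1",       # assumed; not stated in the card
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_steps=100,
    num_train_epochs=1,
    evaluation_strategy="steps",     # inferred from the per-100-step losses below
    eval_steps=100,
)
```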

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 2.3994 | 0.01 | 100 | 1.6845 |
| 1.8101 | 0.03 | 200 | 1.6963 |
| 1.7742 | 0.04 | 300 | 1.6679 |
| 1.8425 | 0.05 | 400 | 1.6657 |
| 1.8452 | 0.06 | 500 | 1.6369 |
| 1.8109 | 0.08 | 600 | 1.6471 |
| 1.8469 | 0.09 | 700 | 1.6350 |
| 1.7709 | 0.1 | 800 | 1.6302 |
| 1.7848 | 0.12 | 900 | 1.6346 |
| 1.7955 | 0.13 | 1000 | 1.6345 |
| 1.79 | 0.14 | 1100 | 1.6356 |
| 1.7655 | 0.16 | 1200 | 1.6116 |
| 1.7826 | 0.17 | 1300 | 1.6248 |
| 1.7651 | 0.18 | 1400 | 1.6262 |
| 1.7639 | 0.19 | 1500 | 1.6078 |
| 1.7743 | 0.21 | 1600 | 1.6105 |
| 1.7672 | 0.22 | 1700 | 1.5910 |
| 1.7054 | 0.23 | 1800 | 1.6060 |
| 1.6777 | 0.25 | 1900 | 1.6253 |
| 1.748 | 0.26 | 2000 | 1.5970 |
| 1.7503 | 0.27 | 2100 | 1.5893 |
| 1.7329 | 0.29 | 2200 | 1.5883 |
| 1.6826 | 0.3 | 2300 | 1.5781 |
| 1.7237 | 0.31 | 2400 | 1.5716 |
| 1.7358 | 0.32 | 2500 | 1.5671 |
| 1.7093 | 0.34 | 2600 | 1.5689 |
| 1.6771 | 0.35 | 2700 | 1.5654 |
| 1.6924 | 0.36 | 2800 | 1.5729 |
| 1.6768 | 0.38 | 2900 | 1.5545 |
| 1.7158 | 0.39 | 3000 | 1.5471 |
| 1.6808 | 0.4 | 3100 | 1.5415 |
| 1.6547 | 0.42 | 3200 | 1.5444 |
| 1.6557 | 0.43 | 3300 | 1.5400 |
| 1.6491 | 0.44 | 3400 | 1.5358 |
| 1.6757 | 0.45 | 3500 | 1.5244 |
| 1.6473 | 0.47 | 3600 | 1.5268 |
| 1.5987 | 0.48 | 3700 | 1.5201 |
| 1.6386 | 0.49 | 3800 | 1.5121 |
| 1.6568 | 0.51 | 3900 | 1.5004 |
| 1.6454 | 0.52 | 4000 | 1.4895 |
| 1.6175 | 0.53 | 4100 | 1.4974 |
| 1.6036 | 0.55 | 4200 | 1.4964 |
| 1.5785 | 0.56 | 4300 | 1.4882 |
| 1.6009 | 0.57 | 4400 | 1.4858 |
| 1.5723 | 0.58 | 4500 | 1.4755 |
| 1.6133 | 0.6 | 4600 | 1.4751 |
| 1.5683 | 0.61 | 4700 | 1.4692 |
| 1.5773 | 0.62 | 4800 | 1.4677 |
| 1.6005 | 0.64 | 4900 | 1.4645 |
| 1.5812 | 0.65 | 5000 | 1.4596 |
| 1.577 | 0.66 | 5100 | 1.4506 |
| 1.591 | 0.68 | 5200 | 1.4507 |
| 1.5609 | 0.69 | 5300 | 1.4474 |
| 1.5437 | 0.7 | 5400 | 1.4441 |
| 1.5535 | 0.71 | 5500 | 1.4430 |
| 1.5882 | 0.73 | 5600 | 1.4398 |
| 1.5731 | 0.74 | 5700 | 1.4328 |
| 1.5511 | 0.75 | 5800 | 1.4280 |
| 1.5455 | 0.77 | 5900 | 1.4358 |
| 1.5194 | 0.78 | 6000 | 1.4321 |
| 1.5524 | 0.79 | 6100 | 1.4207 |
| 1.5406 | 0.81 | 6200 | 1.4215 |
| 1.4811 | 0.82 | 6300 | 1.4293 |
| 1.5117 | 0.83 | 6400 | 1.4282 |
| 1.5197 | 0.84 | 6500 | 1.4109 |
| 1.558 | 0.86 | 6600 | 1.4241 |
| 1.5277 | 0.87 | 6700 | 1.4116 |
| 1.5346 | 0.88 | 6800 | 1.4190 |
| 1.4974 | 0.9 | 6900 | 1.4105 |
| 1.5345 | 0.91 | 7000 | 1.4163 |
| 1.5578 | 0.92 | 7100 | 1.4099 |
| 1.496 | 0.94 | 7200 | 1.4120 |
| 1.5192 | 0.95 | 7300 | 1.4073 |
| 1.456 | 0.96 | 7400 | 1.4105 |
| 1.4821 | 0.97 | 7500 | 1.4175 |
| 1.5331 | 0.99 | 7600 | 1.4135 |

### Framework versions

- Transformers 4.32.1
- Pytorch 2.1.0
- Datasets 2.12.0
- Tokenizers 0.13.3
generation_config.json ADDED
@@ -0,0 +1,5 @@
{
  "_from_model_config": true,
  "pad_token_id": 0,
  "transformers_version": "4.32.1"
}
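
This is the standard `transformers` generation config; `"_from_model_config": true` records that it was derived automatically from the model config rather than set by hand, and the only substantive value it pins is `pad_token_id`. If needed at inference time, it can be loaded on its own (repo id hypothetical, as above):

```python
from transformers import GenerationConfig

# Hypothetical repo id; substitute the actual Hub path of this checkpoint.
gen_config = GenerationConfig.from_pretrained("pocper1/bert_model-1")
print(gen_config.pad_token_id)  # 0
```
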
pytorch_model.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:1dc12e471364e9bc4eb7cfa0f2f1f9a5ade39f0dd5b9f145f020a2b5c9f181ec
+oid sha256:039059219a1cd573c7de1dd3dca0af7e938332a1522233223ad2d197feeb839f
 size 409230642
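
Git LFS keeps only this pointer in the repo: a spec version, the SHA-256 of the real file, and its byte size. Only the OID changed here, so the weights were replaced by a retrained file of identical size. A small sketch for verifying a downloaded copy against the pointer (the file path is an assumption):

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream the file in 1 MiB chunks so large weights don't load into RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# After downloading, this should print the new pointer's OID:
# 039059219a1cd573c7de1dd3dca0af7e938332a1522233223ad2d197feeb839f
print(sha256_of("pytorch_model.bin"))
```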