bigmorning committed on
Commit
d5e12e0
1 Parent(s): 8fa3548
Files changed (3)
  1. README.md +132 -0
  2. config.json +23 -0
  3. tf_model.h5 +3 -0
README.md ADDED
@@ -0,0 +1,132 @@
+ ---
+ license: apache-2.0
+ tags:
+ - generated_from_keras_callback
+ model-index:
+ - name: distilbert_oscarth_0080
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information Keras had access to. You should
+ probably proofread and complete it, then remove this comment. -->
+
+ # distilbert_oscarth_0080
+
+ This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
+ It achieves the following results after 80 epochs of training:
+ - Train Loss: 1.1236
+ - Validation Loss: 1.0821
+ - Epoch: 79
+
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
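+
+ The checkpoint is a TensorFlow `DistilBertForMaskedLM` (see `config.json` in this commit), so masked-token filling is its natural use. Below is a minimal sketch, not a confirmed usage example: the repo id `bigmorning/distilbert_oscarth_0080` is assumed from the commit author and model name, and a matching tokenizer is assumed to be published in the same repo.
+
+ ```python
+ from transformers import pipeline
+
+ # framework="tf" loads the TensorFlow weights (tf_model.h5) added in this commit.
+ fill_mask = pipeline(
+     "fill-mask",
+     model="bigmorning/distilbert_oscarth_0080",  # assumed repo id
+     framework="tf",
+ )
+
+ # DistilBERT's mask token is [MASK].
+ print(fill_mask("The capital of France is [MASK]."))
+ ```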
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training (a code sketch follows the list):
+ - optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
+ - training_precision: float32
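+
+ The serialized optimizer above matches the `AdamWeightDecay` optimizer shipped with `transformers` (Adam with decoupled weight decay). A minimal sketch of rebuilding it from the listed values; the `decay: 0.0` entry is the Keras base-class learning-rate decay and is left at its default:
+
+ ```python
+ from transformers import AdamWeightDecay
+
+ # Values copied from the hyperparameter list above.
+ optimizer = AdamWeightDecay(
+     learning_rate=2e-05,
+     beta_1=0.9,
+     beta_2=0.999,
+     epsilon=1e-07,
+     amsgrad=False,
+     weight_decay_rate=0.01,
+ )
+ ```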
+
+ ### Training results
+
+ | Train Loss | Validation Loss | Epoch |
+ |:----------:|:---------------:|:-----:|
+ | 4.1327 | 2.9983 | 0 |
+ | 2.7813 | 2.4562 | 1 |
+ | 2.4194 | 2.2066 | 2 |
+ | 2.2231 | 2.0562 | 3 |
+ | 2.0894 | 1.9450 | 4 |
+ | 1.9905 | 1.8621 | 5 |
+ | 1.9148 | 1.7941 | 6 |
+ | 1.8508 | 1.7363 | 7 |
+ | 1.7976 | 1.6909 | 8 |
+ | 1.7509 | 1.6488 | 9 |
+ | 1.7126 | 1.6124 | 10 |
+ | 1.6764 | 1.5835 | 11 |
+ | 1.6450 | 1.5521 | 12 |
+ | 1.6175 | 1.5282 | 13 |
+ | 1.5919 | 1.5045 | 14 |
+ | 1.5679 | 1.4833 | 15 |
+ | 1.5476 | 1.4627 | 16 |
+ | 1.5271 | 1.4498 | 17 |
+ | 1.5098 | 1.4270 | 18 |
+ | 1.4909 | 1.4161 | 19 |
+ | 1.4760 | 1.3995 | 20 |
+ | 1.4609 | 1.3864 | 21 |
+ | 1.4475 | 1.3717 | 22 |
+ | 1.4333 | 1.3590 | 23 |
+ | 1.4203 | 1.3478 | 24 |
+ | 1.4093 | 1.3403 | 25 |
+ | 1.3980 | 1.3296 | 26 |
+ | 1.3875 | 1.3176 | 27 |
+ | 1.3773 | 1.3094 | 28 |
+ | 1.3674 | 1.3011 | 29 |
+ | 1.3579 | 1.2920 | 30 |
+ | 1.3497 | 1.2826 | 31 |
+ | 1.3400 | 1.2764 | 32 |
+ | 1.3326 | 1.2694 | 33 |
+ | 1.3236 | 1.2635 | 34 |
+ | 1.3169 | 1.2536 | 35 |
+ | 1.3096 | 1.2477 | 36 |
+ | 1.3024 | 1.2408 | 37 |
+ | 1.2957 | 1.2364 | 38 |
+ | 1.2890 | 1.2296 | 39 |
+ | 1.2818 | 1.2236 | 40 |
+ | 1.2751 | 1.2168 | 41 |
+ | 1.2691 | 1.2126 | 42 |
+ | 1.2644 | 1.2044 | 43 |
+ | 1.2583 | 1.2008 | 44 |
+ | 1.2529 | 1.1962 | 45 |
+ | 1.2473 | 1.1919 | 46 |
+ | 1.2416 | 1.1857 | 47 |
+ | 1.2365 | 1.1812 | 48 |
+ | 1.2318 | 1.1765 | 49 |
+ | 1.2273 | 1.1738 | 50 |
+ | 1.2224 | 1.1672 | 51 |
+ | 1.2177 | 1.1673 | 52 |
+ | 1.2132 | 1.1595 | 53 |
+ | 1.2084 | 1.1564 | 54 |
+ | 1.2033 | 1.1518 | 55 |
+ | 1.1993 | 1.1481 | 56 |
+ | 1.1966 | 1.1445 | 57 |
+ | 1.1924 | 1.1412 | 58 |
+ | 1.1876 | 1.1378 | 59 |
+ | 1.1834 | 1.1340 | 60 |
+ | 1.1806 | 1.1329 | 61 |
+ | 1.1783 | 1.1289 | 62 |
+ | 1.1739 | 1.1251 | 63 |
+ | 1.1705 | 1.1223 | 64 |
+ | 1.1669 | 1.1192 | 65 |
+ | 1.1628 | 1.1172 | 66 |
+ | 1.1599 | 1.1140 | 67 |
+ | 1.1570 | 1.1084 | 68 |
+ | 1.1526 | 1.1081 | 69 |
+ | 1.1496 | 1.1043 | 70 |
+ | 1.1463 | 1.0999 | 71 |
+ | 1.1438 | 1.1006 | 72 |
+ | 1.1397 | 1.0964 | 73 |
+ | 1.1378 | 1.0918 | 74 |
+ | 1.1347 | 1.0917 | 75 |
+ | 1.1319 | 1.0889 | 76 |
+ | 1.1296 | 1.0855 | 77 |
+ | 1.1271 | 1.0848 | 78 |
+ | 1.1236 | 1.0821 | 79 |
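+
+ Assuming the reported losses are mean token-level cross-entropy in nats (the usual masked-language-modeling loss), the final validation loss corresponds to a perplexity of roughly exp(1.0821) ≈ 2.95, down from exp(2.9983) ≈ 20.1 after the first epoch.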
+
+ ### Framework versions
+
+ - Transformers 4.20.1
+ - TensorFlow 2.8.2
+ - Datasets 2.3.2
+ - Tokenizers 0.12.1
config.json ADDED
@@ -0,0 +1,23 @@
+ {
+   "_name_or_path": "distilbert-base-uncased",
+   "activation": "gelu",
+   "architectures": [
+     "DistilBertForMaskedLM"
+   ],
+   "attention_dropout": 0.1,
+   "dim": 768,
+   "dropout": 0.1,
+   "hidden_dim": 3072,
+   "initializer_range": 0.02,
+   "max_position_embeddings": 512,
+   "model_type": "distilbert",
+   "n_heads": 12,
+   "n_layers": 6,
+   "pad_token_id": 0,
+   "qa_dropout": 0.1,
+   "seq_classif_dropout": 0.2,
+   "sinusoidal_pos_embds": false,
+   "tie_weights_": true,
+   "transformers_version": "4.20.1",
+   "vocab_size": 27672
+ }
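Note that `vocab_size` is 27672 rather than the 30522 of stock `distilbert-base-uncased`, which suggests the embeddings were resized for a custom vocabulary; the matching tokenizer is therefore required at inference time. A minimal sketch of inspecting the config (repo id assumed as above):

```python
from transformers import DistilBertConfig

# Repo id assumed from the commit author and model name.
config = DistilBertConfig.from_pretrained("bigmorning/distilbert_oscarth_0080")
print(config.vocab_size)  # 27672 -- differs from distilbert-base-uncased's 30522
```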
tf_model.h5 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:bf8746cadded96a778a92f6477dddc7e3808536da0c991dc02169a664fb211aa
+ size 431026784
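The three lines above are a Git LFS pointer, not the weights themselves: the actual ~431 MB `tf_model.h5`, identified by its SHA-256 oid, is fetched from LFS storage on checkout. A sketch of fetching just this file with `huggingface_hub` (repo id assumed as above):

```python
from huggingface_hub import hf_hub_download

# Resolves the LFS pointer and downloads the actual weights file to the local cache.
path = hf_hub_download(
    repo_id="bigmorning/distilbert_oscarth_0080",  # assumed repo id
    filename="tf_model.h5",
)
print(path)
```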