tomaarsen committed
Commit e484e47
1 Parent(s): ccce246

Upload model
README.md ADDED
@@ -0,0 +1,369 @@
+ ---
+ language:
+ - en
+ - multilingual
+ license: cc-by-sa-4.0
+ library_name: span-marker
+ tags:
+ - span-marker
+ - token-classification
+ - ner
+ - named-entity-recognition
+ - generated_from_span_marker_trainer
+ datasets:
+ - DFKI-SLT/few-nerd
+ metrics:
+ - precision
+ - recall
+ - f1
+ widget:
+ - text: The WPC led the international peace movement in the decade after the Second
+     World War, but its failure to speak out against the Soviet suppression of the
+     1956 Hungarian uprising and the resumption of Soviet nuclear tests in 1961 marginalised
+     it, and in the 1960s it was eclipsed by the newer, non-aligned peace organizations
+     like the Campaign for Nuclear Disarmament.
+ - text: Most of the Steven Seagal movie "Under Siege "(co-starring Tommy Lee Jones)
+     was filmed on the, which is docked on Mobile Bay at Battleship Memorial Park and
+     open to the public.
+ - text: 'The Central African CFA franc (French: "franc CFA "or simply "franc ", ISO
+     4217 code: XAF) is the currency of six independent states in Central Africa: Cameroon,
+     Central African Republic, Chad, Republic of the Congo, Equatorial Guinea and Gabon.'
+ - text: Brenner conducted post-doctoral research at Brandeis University with Gregory
+     Petsko and then took his first academic position at Thomas Jefferson University
+     in 1996, moving to Dartmouth Medical School in 2003, where he served as Associate
+     Director for Basic Sciences at Norris Cotton Cancer Center.
+ - text: On Friday, October 27, 2017, the Senate of Spain (Senado) voted 214 to 47
+     to invoke Article 155 of the Spanish Constitution over Catalonia after the Catalan
+     Parliament declared the independence.
+ pipeline_tag: token-classification
+ co2_eq_emissions:
+   emissions: 572.6675932546113
+   source: codecarbon
+   training_type: fine-tuning
+   on_cloud: false
+   cpu_model: 13th Gen Intel(R) Core(TM) i7-13700K
+   ram_total_size: 31.777088165283203
+   hours_used: 3.867
+   hardware_used: 1 x NVIDIA GeForce RTX 3090
+ base_model: bert-base-multilingual-cased
+ model-index:
+ - name: SpanMarker with bert-base-multilingual-cased on FewNERD
+   results:
+   - task:
+       type: token-classification
+       name: Named Entity Recognition
+     dataset:
+       name: FewNERD
+       type: DFKI-SLT/few-nerd
+       split: test
+     metrics:
+     - type: f1
+       value: 0.7006507253689264
+       name: F1
+     - type: precision
+       value: 0.7040676584045078
+       name: Precision
+     - type: recall
+       value: 0.6972667978051558
+       name: Recall
+ ---
+
+ # SpanMarker with bert-base-multilingual-cased on FewNERD
+
+ This is a [SpanMarker](https://github.com/tomaarsen/SpanMarkerNER) model trained on the [FewNERD](https://huggingface.co/datasets/DFKI-SLT/few-nerd) dataset that can be used for Named Entity Recognition. This SpanMarker model uses [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) as the underlying encoder.
+
+ ## Model Details
+
+ ### Model Description
+ - **Model Type:** SpanMarker
+ - **Encoder:** [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased)
+ - **Maximum Sequence Length:** 256 tokens
+ - **Maximum Entity Length:** 8 words
+ - **Training Dataset:** [FewNERD](https://huggingface.co/datasets/DFKI-SLT/few-nerd)
+ - **Languages:** en, multilingual
+ - **License:** cc-by-sa-4.0
+
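+ Both length limits are stored on the model configuration; a minimal sketch for checking them after loading, assuming the attribute names used in this repository's config.json:
+
+ ```python
+ from span_marker import SpanMarkerModel
+
+ model = SpanMarkerModel.from_pretrained("tomaarsen/span-marker-mbert-base-fewnerd-fine-super")
+ # These mirror the values listed above and in config.json
+ print(model.config.model_max_length)   # 256 tokens
+ print(model.config.entity_max_length)  # 8 words
+ ```
+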
+ ### Model Sources
+
+ - **Repository:** [SpanMarker on GitHub](https://github.com/tomaarsen/SpanMarkerNER)
+ - **Thesis:** [SpanMarker For Named Entity Recognition](https://raw.githubusercontent.com/tomaarsen/SpanMarkerNER/main/thesis.pdf)
+
+ ### Model Labels
+ | Label | Examples |
+ |:------|:---------|
+ | art-broadcastprogram | "Corazones", "Street Cents", "The Gale Storm Show : Oh , Susanna" |
+ | art-film | "L'Atlantide", "Bosch", "Shawshank Redemption" |
+ | art-music | "Atkinson , Danko and Ford ( with Brockie and Hilton )", "Hollywood Studio Symphony", "Champion Lover" |
+ | art-other | "Aphrodite of Milos", "The Today Show", "Venus de Milo" |
+ | art-painting | "Production/Reproduction", "Touit", "Cofiwch Dryweryn" |
+ | art-writtenart | "The Seven Year Itch", "Time", "Imelda de ' Lambertazzi" |
+ | building-airport | "Luton Airport", "Newark Liberty International Airport", "Sheremetyevo International Airport" |
+ | building-hospital | "Hokkaido University Hospital", "Yeungnam University Hospital", "Memorial Sloan-Kettering Cancer Center" |
+ | building-hotel | "Flamingo Hotel", "The Standard Hotel", "Radisson Blu Sea Plaza Hotel" |
+ | building-library | "British Library", "Bayerische Staatsbibliothek", "Berlin State Library" |
+ | building-other | "Communiplex", "Henry Ford Museum", "Alpha Recording Studios" |
+ | building-restaurant | "Fatburger", "Carnegie Deli", "Trumbull" |
+ | building-sportsfacility | "Sports Center", "Glenn Warner Soccer Facility", "Boston Garden" |
+ | building-theater | "Sanders Theatre", "Pittsburgh Civic Light Opera", "National Paris Opera" |
+ | event-attack/battle/war/militaryconflict | "Vietnam War", "Jurist", "Easter Offensive" |
+ | event-disaster | "1693 Sicily earthquake", "the 1912 North Mount Lyell Disaster", "1990s North Korean famine" |
+ | event-election | "March 1898 elections", "1982 Mitcham and Morden by-election", "Elections to the European Parliament" |
+ | event-other | "Eastwood Scoring Stage", "Masaryk Democratic Movement", "Union for a Popular Movement" |
+ | event-protest | "Russian Revolution", "Iranian Constitutional Revolution", "French Revolution" |
+ | event-sportsevent | "Stanley Cup", "World Cup", "National Champions" |
+ | location-GPE | "Mediterranean Basin", "Croatian", "the Republic of Croatia" |
+ | location-bodiesofwater | "Norfolk coast", "Atatürk Dam Lake", "Arthur Kill" |
+ | location-island | "Staten Island", "Laccadives", "new Samsat district" |
+ | location-mountain | "Miteirya Ridge", "Ruweisat Ridge", "Salamander Glacier" |
+ | location-other | "Victoria line", "Cartuther", "Northern City Line" |
+ | location-park | "Painted Desert Community Complex Historic District", "Shenandoah National Park", "Gramercy Park" |
+ | location-road/railway/highway/transit | "Friern Barnet Road", "Newark-Elizabeth Rail Link", "NJT" |
+ | organization-company | "Church 's Chicken", "Dixy Chicken", "Texas Chicken" |
+ | organization-education | "MIT", "Barnard College", "Belfast Royal Academy and the Ulster College of Physical Education" |
+ | organization-government/governmentagency | "Supreme Court", "Diet", "Congregazione dei Nobili" |
+ | organization-media/newspaper | "TimeOut Melbourne", "Clash", "Al Jazeera" |
+ | organization-other | "IAEA", "Defence Sector C", "4th Army" |
+ | organization-politicalparty | "Al Wafa ' Islamic", "Kenseitō", "Shimpotō" |
+ | organization-religion | "Christian", "UPCUSA", "Jewish" |
+ | organization-showorganization | "Lizzy", "Mr. Mister", "Bochumer Symphoniker" |
+ | organization-sportsleague | "China League One", "NHL", "First Division" |
+ | organization-sportsteam | "Luc Alphand Aventures", "Tottenham", "Arsenal" |
+ | other-astronomything | "`` Caput Larvae ''", "Algol", "Zodiac" |
+ | other-award | "GCON", "Order of the Republic of Guinea and Nigeria", "Grand Commander of the Order of the Niger" |
+ | other-biologything | "BAR", "Amphiphysin", "N-terminal lipid" |
+ | other-chemicalthing | "sulfur", "uranium", "carbon dioxide" |
+ | other-currency | "Travancore Rupee", "$", "lac crore" |
+ | other-disease | "bladder cancer", "hypothyroidism", "French Dysentery Epidemic of 1779" |
+ | other-educationaldegree | "Master", "Bachelor", "BSc ( Hons ) in physics" |
+ | other-god | "Fujin", "Raijin", "El" |
+ | other-language | "Latin", "English", "Breton-speaking" |
+ | other-law | "Thirty Years ' Peace", "United States Freedom Support Act", "Leahy–Smith America Invents Act ( AIA" |
+ | other-livingthing | "monkeys", "insects", "patchouli" |
+ | other-medical | "Pediatrics", "amitriptyline", "pediatrician" |
+ | person-actor | "Edmund Payne", "Ellaline Terriss", "Tchéky Karyo" |
+ | person-artist/author | "George Axelrod", "Hicks", "Gaetano Donizett" |
+ | person-athlete | "Tozawa", "Neville", "Jaguar" |
+ | person-director | "Richard Quine", "Frank Darabont", "Bob Swaim" |
+ | person-other | "Richard Benson", "Campbell", "Holden" |
+ | person-politician | "Rivière", "William", "Emeric" |
+ | person-scholar | "Wurdack", "Stedman", "Stalmine" |
+ | person-soldier | "Joachim Ziegler", "Krukenberg", "Helmuth Weidling" |
+ | product-airplane | "Luton", "Spey-equipped FGR.2s", "EC135T2 CPDS" |
+ | product-car | "Corvettes - GT1 C6R", "Phantom", "100EX" |
+ | product-food | "V. labrusca", "yakiniku", "red grape" |
+ | product-game | "Airforce Delta", "Hardcore RPG", "Splinter Cell" |
+ | product-other | "PDP-1", "Fairbottom Bobs", "X11" |
+ | product-ship | "HMS `` Chinkara ''", "Congress", "Essex" |
+ | product-software | "Apdf", "Wikipedia", "AmiPDF" |
+ | product-train | "Royal Scots Grey", "High Speed Trains", "55022" |
+ | product-weapon | "AR-15 's", "ZU-23-2M Wróbel", "ZU-23-2MR Wróbel II" |
+
+ ## Evaluation
+
+ ### Metrics
+ | Label | Precision | Recall | F1 |
+ |:------|:----------|:-------|:---|
+ | **all** | 0.7041 | 0.6973 | 0.7007 |
+ | art-broadcastprogram | 0.5863 | 0.6252 | 0.6051 |
+ | art-film | 0.7779 | 0.7520 | 0.7647 |
+ | art-music | 0.8014 | 0.7570 | 0.7786 |
+ | art-other | 0.4209 | 0.3221 | 0.3649 |
+ | art-painting | 0.5938 | 0.6667 | 0.6281 |
+ | art-writtenart | 0.6854 | 0.6415 | 0.6628 |
+ | building-airport | 0.8197 | 0.8242 | 0.8219 |
+ | building-hospital | 0.7215 | 0.8187 | 0.7671 |
+ | building-hotel | 0.7233 | 0.6906 | 0.7066 |
+ | building-library | 0.7588 | 0.7268 | 0.7424 |
+ | building-other | 0.5842 | 0.5855 | 0.5848 |
+ | building-restaurant | 0.5567 | 0.4871 | 0.5195 |
+ | building-sportsfacility | 0.6512 | 0.7690 | 0.7052 |
+ | building-theater | 0.6994 | 0.7516 | 0.7246 |
+ | event-attack/battle/war/militaryconflict | 0.7800 | 0.7332 | 0.7559 |
+ | event-disaster | 0.5767 | 0.5266 | 0.5505 |
+ | event-election | 0.5106 | 0.1319 | 0.2096 |
+ | event-other | 0.4931 | 0.4145 | 0.4504 |
+ | event-protest | 0.3711 | 0.4337 | 0.4000 |
+ | event-sportsevent | 0.6156 | 0.6156 | 0.6156 |
+ | location-GPE | 0.8175 | 0.8508 | 0.8338 |
+ | location-bodiesofwater | 0.7297 | 0.7622 | 0.7456 |
+ | location-island | 0.7314 | 0.6703 | 0.6995 |
+ | location-mountain | 0.7538 | 0.7283 | 0.7409 |
+ | location-other | 0.4370 | 0.3040 | 0.3585 |
+ | location-park | 0.7063 | 0.6878 | 0.6969 |
+ | location-road/railway/highway/transit | 0.7092 | 0.7259 | 0.7174 |
+ | organization-company | 0.6911 | 0.6943 | 0.6927 |
+ | organization-education | 0.7799 | 0.7973 | 0.7885 |
+ | organization-government/governmentagency | 0.5518 | 0.4474 | 0.4942 |
+ | organization-media/newspaper | 0.6268 | 0.6761 | 0.6505 |
+ | organization-other | 0.5804 | 0.5341 | 0.5563 |
+ | organization-politicalparty | 0.6627 | 0.7306 | 0.6949 |
+ | organization-religion | 0.5636 | 0.6265 | 0.5934 |
+ | organization-showorganization | 0.6023 | 0.6086 | 0.6054 |
+ | organization-sportsleague | 0.6594 | 0.6497 | 0.6545 |
+ | organization-sportsteam | 0.7341 | 0.7703 | 0.7518 |
+ | other-astronomything | 0.7806 | 0.8289 | 0.8040 |
+ | other-award | 0.7230 | 0.6703 | 0.6957 |
+ | other-biologything | 0.6733 | 0.6366 | 0.6544 |
+ | other-chemicalthing | 0.5962 | 0.5838 | 0.5899 |
+ | other-currency | 0.7135 | 0.7822 | 0.7463 |
+ | other-disease | 0.6260 | 0.7063 | 0.6637 |
+ | other-educationaldegree | 0.6000 | 0.6033 | 0.6016 |
+ | other-god | 0.7051 | 0.7118 | 0.7085 |
+ | other-language | 0.6849 | 0.7968 | 0.7366 |
+ | other-law | 0.6814 | 0.6843 | 0.6829 |
+ | other-livingthing | 0.5959 | 0.6443 | 0.6192 |
+ | other-medical | 0.5247 | 0.4811 | 0.5020 |
+ | person-actor | 0.8342 | 0.7960 | 0.8146 |
+ | person-artist/author | 0.7052 | 0.7482 | 0.7261 |
+ | person-athlete | 0.8396 | 0.8530 | 0.8462 |
+ | person-director | 0.7250 | 0.7329 | 0.7289 |
+ | person-other | 0.6866 | 0.6672 | 0.6767 |
+ | person-politician | 0.6819 | 0.6852 | 0.6835 |
+ | person-scholar | 0.5468 | 0.4953 | 0.5198 |
+ | person-soldier | 0.5360 | 0.5641 | 0.5497 |
+ | product-airplane | 0.6825 | 0.6730 | 0.6777 |
+ | product-car | 0.7205 | 0.7016 | 0.7109 |
+ | product-food | 0.6036 | 0.5394 | 0.5697 |
+ | product-game | 0.7740 | 0.6876 | 0.7282 |
+ | product-other | 0.5250 | 0.4117 | 0.4615 |
+ | product-ship | 0.6781 | 0.6763 | 0.6772 |
+ | product-software | 0.6701 | 0.6603 | 0.6652 |
+ | product-train | 0.5919 | 0.6051 | 0.5984 |
+ | product-weapon | 0.6507 | 0.5433 | 0.5921 |
+
+ ## Uses
+
+ ### Direct Use for Inference
+
+ ```python
+ from span_marker import SpanMarkerModel
+
+ # Download from the 🤗 Hub
+ model = SpanMarkerModel.from_pretrained("tomaarsen/span-marker-mbert-base-fewnerd-fine-super")
+ # Run inference
+ entities = model.predict("Most of the Steven Seagal movie \"Under Siege \"(co-starring Tommy Lee Jones) was filmed on the, which is docked on Mobile Bay at Battleship Memorial Park and open to the public.")
+ ```
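+
+ `predict` returns one dictionary per detected entity. A minimal sketch of inspecting the output, assuming the usual `span_marker` result keys (the printout is illustrative, not reproduced from a run):
+
+ ```python
+ for entity in entities:
+     # Each entity carries the matched text, its label, a confidence score,
+     # and character offsets ("char_start_index"/"char_end_index") into the input.
+     print(entity["span"], entity["label"], f"{entity['score']:.2f}")
+ ```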
+
+ ### Downstream Use
+ You can fine-tune this model on your own dataset.
+
+ <details><summary>Click to expand</summary>

+ ```python
+ from datasets import load_dataset
+ from span_marker import SpanMarkerModel, Trainer
+
+ # Download from the 🤗 Hub
+ model = SpanMarkerModel.from_pretrained("tomaarsen/span-marker-mbert-base-fewnerd-fine-super")
+
+ # Specify a Dataset with "tokens" and "ner_tags" columns
+ dataset = load_dataset("conll2003")  # For example CoNLL2003
+
+ # Initialize a Trainer using the pretrained model & dataset
+ trainer = Trainer(
+     model=model,
+     train_dataset=dataset["train"],
+     eval_dataset=dataset["validation"],
+ )
+ trainer.train()
+ trainer.save_model("tomaarsen/span-marker-mbert-base-fewnerd-fine-super-finetuned")
+ ```
+ </details>
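+
+ After training, the held-out split could be scored with the `evaluate` method inherited from the underlying 🤗 Transformers `Trainer`; a minimal sketch, assuming the `trainer` and `dataset` from the block above:
+
+ ```python
+ # Score the test split; metric keys are prefixed with "test_"
+ metrics = trainer.evaluate(dataset["test"], metric_key_prefix="test")
+ print(metrics)
+ ```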
+
+ <!--
+ ### Out-of-Scope Use
+
+ *List how the model may foreseeably be misused and address what users ought not to do with the model.*
+ -->
+
+ <!--
+ ## Bias, Risks and Limitations
+
+ *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
+ -->
+
+ <!--
+ ### Recommendations
+
+ *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
+ -->
+
+ ## Training Details
+
+ ### Training Set Metrics
+ | Training set | Min | Median | Max |
+ |:----------------------|:----|:--------|:----|
+ | Sentence length | 1 | 24.4945 | 267 |
+ | Entities per sentence | 0 | 2.5832 | 88 |
+
+ ### Training Hyperparameters
+ - learning_rate: 5e-05
+ - train_batch_size: 16
+ - eval_batch_size: 16
+ - seed: 42
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - lr_scheduler_warmup_ratio: 0.1
+ - num_epochs: 3
+
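+ For reference, a minimal sketch of how these hyperparameters map onto `transformers.TrainingArguments` for the `span_marker` `Trainer` (the `output_dir` is a hypothetical path):
+
+ ```python
+ from transformers import TrainingArguments
+
+ args = TrainingArguments(
+     output_dir="models/span-marker-mbert-base-fewnerd",  # hypothetical
+     learning_rate=5e-5,
+     per_device_train_batch_size=16,
+     per_device_eval_batch_size=16,
+     num_train_epochs=3,
+     lr_scheduler_type="linear",
+     warmup_ratio=0.1,
+     seed=42,
+ )
+ # Pass via `Trainer(model=model, args=args, ...)` as in the fine-tuning example above
+ ```
+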
+ ### Training Results
+ | Epoch | Step | Validation Loss | Validation Precision | Validation Recall | Validation F1 | Validation Accuracy |
+ |:------:|:-----:|:---------------:|:--------------------:|:-----------------:|:-------------:|:-------------------:|
+ | 0.2972 | 3000 | 0.0274 | 0.6488 | 0.6457 | 0.6473 | 0.9121 |
+ | 0.5944 | 6000 | 0.0252 | 0.6686 | 0.6545 | 0.6615 | 0.9160 |
+ | 0.8915 | 9000 | 0.0239 | 0.6918 | 0.6547 | 0.6727 | 0.9178 |
+ | 1.1887 | 12000 | 0.0235 | 0.6962 | 0.6727 | 0.6842 | 0.9210 |
+ | 1.4859 | 15000 | 0.0233 | 0.6872 | 0.6742 | 0.6806 | 0.9201 |
+ | 1.7831 | 18000 | 0.0226 | 0.6969 | 0.6891 | 0.6929 | 0.9236 |
+ | 2.0802 | 21000 | 0.0231 | 0.7030 | 0.6916 | 0.6973 | 0.9246 |
+ | 2.3774 | 24000 | 0.0227 | 0.7020 | 0.6936 | 0.6978 | 0.9248 |
+ | 2.6746 | 27000 | 0.0223 | 0.7079 | 0.6989 | 0.7034 | 0.9258 |
+ | 2.9718 | 30000 | 0.0222 | 0.7089 | 0.7009 | 0.7049 | 0.9263 |
+
+ ### Environmental Impact
+ Carbon emissions were measured using [CodeCarbon](https://github.com/mlco2/codecarbon).
+ - **Carbon Emitted**: 0.573 kg of CO2
+ - **Hours Used**: 3.867 hours
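+
+ A minimal sketch of the kind of CodeCarbon instrumentation that yields such an estimate, assuming the `trainer` from the fine-tuning example above:
+
+ ```python
+ from codecarbon import EmissionsTracker
+
+ tracker = EmissionsTracker()
+ tracker.start()
+ trainer.train()
+ emissions_kg = tracker.stop()  # estimated kg of CO2-equivalent
+ print(f"Emitted ~{emissions_kg:.3f} kg CO2eq")
+ ```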
+
+ ### Training Hardware
+ - **On Cloud**: No
+ - **GPU Model**: 1 x NVIDIA GeForce RTX 3090
+ - **CPU Model**: 13th Gen Intel(R) Core(TM) i7-13700K
+ - **RAM Size**: 31.78 GB
+
+ ### Framework Versions
+ - Python: 3.9.16
+ - SpanMarker: 1.4.1.dev
+ - Transformers: 4.30.0
+ - PyTorch: 2.0.1+cu118
+ - Datasets: 2.14.0
+ - Tokenizers: 0.13.2
+
+ ## Citation
+
+ ### BibTeX
+ ```
+ @software{Aarsen_SpanMarker,
+     author = {Aarsen, Tom},
+     license = {Apache-2.0},
+     title = {{SpanMarker for Named Entity Recognition}},
+     url = {https://github.com/tomaarsen/SpanMarkerNER}
+ }
+ ```
+
+ <!--
+ ## Glossary
+
+ *Clearly define terms in order to be accessible across audiences.*
+ -->
+
+ <!--
+ ## Model Card Authors
+
+ *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
+ -->
+
+ <!--
+ ## Model Card Contact
+
+ *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
+ -->
added_tokens.json ADDED
@@ -0,0 +1,4 @@
+ {
+   "<end>": 119548,
+   "<start>": 119547
+ }
config.json ADDED
@@ -0,0 +1,234 @@
+ {
+   "architectures": [
+     "SpanMarkerModel"
+   ],
+   "encoder": {
+     "_name_or_path": "bert-base-multilingual-cased",
+     "add_cross_attention": false,
+     "architectures": [
+       "BertForMaskedLM"
+     ],
+     "attention_probs_dropout_prob": 0.1,
+     "bad_words_ids": null,
+     "begin_suppress_tokens": null,
+     "bos_token_id": null,
+     "chunk_size_feed_forward": 0,
+     "classifier_dropout": null,
+     "cross_attention_hidden_size": null,
+     "decoder_start_token_id": null,
+     "directionality": "bidi",
+     "diversity_penalty": 0.0,
+     "do_sample": false,
+     "early_stopping": false,
+     "encoder_no_repeat_ngram_size": 0,
+     "eos_token_id": null,
+     "exponential_decay_length_penalty": null,
+     "finetuning_task": null,
+     "forced_bos_token_id": null,
+     "forced_eos_token_id": null,
+     "hidden_act": "gelu",
+     "hidden_dropout_prob": 0.1,
+     "hidden_size": 768,
+     "id2label": {
+       "0": "O",
+       "1": "art-broadcastprogram",
+       "2": "art-film",
+       "3": "art-music",
+       "4": "art-other",
+       "5": "art-painting",
+       "6": "art-writtenart",
+       "7": "building-airport",
+       "8": "building-hospital",
+       "9": "building-hotel",
+       "10": "building-library",
+       "11": "building-other",
+       "12": "building-restaurant",
+       "13": "building-sportsfacility",
+       "14": "building-theater",
+       "15": "event-attack/battle/war/militaryconflict",
+       "16": "event-disaster",
+       "17": "event-election",
+       "18": "event-other",
+       "19": "event-protest",
+       "20": "event-sportsevent",
+       "21": "location-GPE",
+       "22": "location-bodiesofwater",
+       "23": "location-island",
+       "24": "location-mountain",
+       "25": "location-other",
+       "26": "location-park",
+       "27": "location-road/railway/highway/transit",
+       "28": "organization-company",
+       "29": "organization-education",
+       "30": "organization-government/governmentagency",
+       "31": "organization-media/newspaper",
+       "32": "organization-other",
+       "33": "organization-politicalparty",
+       "34": "organization-religion",
+       "35": "organization-showorganization",
+       "36": "organization-sportsleague",
+       "37": "organization-sportsteam",
+       "38": "other-astronomything",
+       "39": "other-award",
+       "40": "other-biologything",
+       "41": "other-chemicalthing",
+       "42": "other-currency",
+       "43": "other-disease",
+       "44": "other-educationaldegree",
+       "45": "other-god",
+       "46": "other-language",
+       "47": "other-law",
+       "48": "other-livingthing",
+       "49": "other-medical",
+       "50": "person-actor",
+       "51": "person-artist/author",
+       "52": "person-athlete",
+       "53": "person-director",
+       "54": "person-other",
+       "55": "person-politician",
+       "56": "person-scholar",
+       "57": "person-soldier",
+       "58": "product-airplane",
+       "59": "product-car",
+       "60": "product-food",
+       "61": "product-game",
+       "62": "product-other",
+       "63": "product-ship",
+       "64": "product-software",
+       "65": "product-train",
+       "66": "product-weapon"
+     },
+     "initializer_range": 0.02,
+     "intermediate_size": 3072,
+     "is_decoder": false,
+     "is_encoder_decoder": false,
+     "label2id": {
+       "O": 0,
+       "art-broadcastprogram": 1,
+       "art-film": 2,
+       "art-music": 3,
+       "art-other": 4,
+       "art-painting": 5,
+       "art-writtenart": 6,
+       "building-airport": 7,
+       "building-hospital": 8,
+       "building-hotel": 9,
+       "building-library": 10,
+       "building-other": 11,
+       "building-restaurant": 12,
+       "building-sportsfacility": 13,
+       "building-theater": 14,
+       "event-attack/battle/war/militaryconflict": 15,
+       "event-disaster": 16,
+       "event-election": 17,
+       "event-other": 18,
+       "event-protest": 19,
+       "event-sportsevent": 20,
+       "location-GPE": 21,
+       "location-bodiesofwater": 22,
+       "location-island": 23,
+       "location-mountain": 24,
+       "location-other": 25,
+       "location-park": 26,
+       "location-road/railway/highway/transit": 27,
+       "organization-company": 28,
+       "organization-education": 29,
+       "organization-government/governmentagency": 30,
+       "organization-media/newspaper": 31,
+       "organization-other": 32,
+       "organization-politicalparty": 33,
+       "organization-religion": 34,
+       "organization-showorganization": 35,
+       "organization-sportsleague": 36,
+       "organization-sportsteam": 37,
+       "other-astronomything": 38,
+       "other-award": 39,
+       "other-biologything": 40,
+       "other-chemicalthing": 41,
+       "other-currency": 42,
+       "other-disease": 43,
+       "other-educationaldegree": 44,
+       "other-god": 45,
+       "other-language": 46,
+       "other-law": 47,
+       "other-livingthing": 48,
+       "other-medical": 49,
+       "person-actor": 50,
+       "person-artist/author": 51,
+       "person-athlete": 52,
+       "person-director": 53,
+       "person-other": 54,
+       "person-politician": 55,
+       "person-scholar": 56,
+       "person-soldier": 57,
+       "product-airplane": 58,
+       "product-car": 59,
+       "product-food": 60,
+       "product-game": 61,
+       "product-other": 62,
+       "product-ship": 63,
+       "product-software": 64,
+       "product-train": 65,
+       "product-weapon": 66
+     },
+     "layer_norm_eps": 1e-12,
+     "length_penalty": 1.0,
+     "max_length": 20,
+     "max_position_embeddings": 512,
+     "min_length": 0,
+     "model_type": "bert",
+     "no_repeat_ngram_size": 0,
+     "num_attention_heads": 12,
+     "num_beam_groups": 1,
+     "num_beams": 1,
+     "num_hidden_layers": 12,
+     "num_return_sequences": 1,
+     "output_attentions": false,
+     "output_hidden_states": false,
+     "output_scores": false,
+     "pad_token_id": 0,
+     "pooler_fc_size": 768,
+     "pooler_num_attention_heads": 12,
+     "pooler_num_fc_layers": 3,
+     "pooler_size_per_head": 128,
+     "pooler_type": "first_token_transform",
+     "position_embedding_type": "absolute",
+     "prefix": null,
+     "problem_type": null,
+     "pruned_heads": {},
+     "remove_invalid_values": false,
+     "repetition_penalty": 1.0,
+     "return_dict": true,
+     "return_dict_in_generate": false,
+     "sep_token_id": null,
+     "suppress_tokens": null,
+     "task_specific_params": null,
+     "temperature": 1.0,
+     "tf_legacy_loss": false,
+     "tie_encoder_decoder": false,
+     "tie_word_embeddings": true,
+     "tokenizer_class": null,
+     "top_k": 50,
+     "top_p": 1.0,
+     "torch_dtype": null,
+     "torchscript": false,
+     "transformers_version": "4.30.0",
+     "type_vocab_size": 2,
+     "typical_p": 1.0,
+     "use_bfloat16": false,
+     "use_cache": true,
+     "vocab_size": 119549
+   },
+   "entity_max_length": 8,
+   "marker_max_length": 128,
+   "max_next_context": null,
+   "max_prev_context": null,
+   "model_max_length": 256,
+   "model_max_length_default": 512,
+   "model_type": "span-marker",
+   "span_marker_version": "1.4.1.dev",
+   "torch_dtype": "float32",
+   "trained_with_document_context": false,
+   "transformers_version": "4.30.0",
+   "vocab_size": 119549
+ }
pytorch_model.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:cbc4693a38a00bcbb27618c509c4ce1d7aaa38fb2adc34133780b1d70e03344f
+ size 711905205
special_tokens_map.json ADDED
@@ -0,0 +1,7 @@
+ {
+   "cls_token": "[CLS]",
+   "mask_token": "[MASK]",
+   "pad_token": "[PAD]",
+   "sep_token": "[SEP]",
+   "unk_token": "[UNK]"
+ }
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,16 @@
+ {
+   "add_prefix_space": true,
+   "clean_up_tokenization_spaces": true,
+   "cls_token": "[CLS]",
+   "do_lower_case": false,
+   "entity_max_length": 8,
+   "marker_max_length": 128,
+   "mask_token": "[MASK]",
+   "model_max_length": 256,
+   "pad_token": "[PAD]",
+   "sep_token": "[SEP]",
+   "strip_accents": null,
+   "tokenize_chinese_chars": true,
+   "tokenizer_class": "BertTokenizer",
+   "unk_token": "[UNK]"
+ }
vocab.txt ADDED
The diff for this file is too large to render. See raw diff