initial model commit
- README.md +148 -0
- loss.tsv +151 -0
- pytorch_model.bin +3 -0
- training.log +0 -0
README.md
ADDED
@@ -0,0 +1,148 @@
---
tags:
- flair
- token-classification
- sequence-tagger-model
language:
- en
- de
- nl
- es
datasets:
- conll2003
inference: false
---

## 4-Language NER in Flair (English, German, Dutch and Spanish)

This is the standard 4-class NER model for the 4 CoNLL-03 languages that ships with [Flair](https://github.com/flairNLP/flair/). It also works reasonably well for related languages such as French.

F1-Score: **92.16** (CoNLL-03 English), **87.33** (CoNLL-03 German revised), **88.96** (CoNLL-03 Dutch), **86.65** (CoNLL-03 Spanish)

Predicts 4 tags:

| **tag** | **meaning** |
|---------|-------------|
| PER | person name |
| LOC | location name |
| ORG | organization name |
| MISC | other name |

Based on [Flair embeddings](https://www.aclweb.org/anthology/C18-1139/) and LSTM-CRF.

---
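The F-SCORE columns in the accompanying loss.tsv are the harmonic mean of the precision and recall columns next to them. As a quick illustration (plain Python, no Flair required):

```python
def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# final epoch of loss.tsv: TEST_PRECISION 0.8849, TEST_RECALL 0.8734
print(round(f1_score(0.8849, 0.8734), 4))  # 0.8791, matching TEST_F-SCORE
```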

### Demo: How to use in Flair

Requires: **[Flair](https://github.com/flairNLP/flair/)** (`pip install flair`)

```python
from flair.data import Sentence
from flair.models import SequenceTagger

# load tagger
tagger = SequenceTagger.load("flair/ner-multi")

# make example sentence in any of the four languages
sentence = Sentence("George Washington ging nach Washington")

# predict NER tags
tagger.predict(sentence)

# print sentence
print(sentence)

# print predicted NER spans
print('The following NER tags are found:')
# iterate over entities and print
for entity in sentence.get_spans('ner'):
    print(entity)
```

This yields the following output:
```
Span [1,2]: "George Washington" [− Labels: PER (0.9977)]
Span [5]: "Washington" [− Labels: LOC (0.9895)]
```

So, the entities "*George Washington*" (labeled as a **person**) and "*Washington*" (labeled as a **location**) are found in the sentence "*George Washington ging nach Washington*".

---
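The printed span lines above follow a regular pattern, so they can be turned back into structured `(text, label, score)` tuples without Flair. This is only an illustrative sketch; the regex is an assumption based on the output format shown above, not part of the Flair API:

```python
import re

# matches lines like: Span [1,2]: "George Washington" [− Labels: PER (0.9977)]
# the "." before "Labels" absorbs the minus-sign character in the output
SPAN_RE = re.compile(
    r'Span \[[\d,]+\]: "(?P<text>[^"]+)" \[. Labels: (?P<label>\w+) \((?P<score>[\d.]+)\)\]'
)

def parse_spans(output: str):
    """Extract (text, label, score) tuples from Flair's printed span lines."""
    return [(m['text'], m['label'], float(m['score']))
            for m in SPAN_RE.finditer(output)]

demo = '''Span [1,2]: "George Washington" [− Labels: PER (0.9977)]
Span [5]: "Washington" [− Labels: LOC (0.9895)]'''

print(parse_spans(demo))
# [('George Washington', 'PER', 0.9977), ('Washington', 'LOC', 0.9895)]
```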

### Training: Script to train this model

The following Flair script was used to train this model:

```python
from flair.data import Corpus, MultiCorpus
from flair.datasets import CONLL_03, CONLL_03_GERMAN, CONLL_03_DUTCH, CONLL_03_SPANISH
from flair.embeddings import WordEmbeddings, StackedEmbeddings, FlairEmbeddings

# 1. get the multi-language corpus
corpus: Corpus = MultiCorpus([
    CONLL_03(),          # English corpus
    CONLL_03_GERMAN(),   # German corpus
    CONLL_03_DUTCH(),    # Dutch corpus
    CONLL_03_SPANISH(),  # Spanish corpus
])

# 2. what tag do we want to predict?
tag_type = 'ner'

# 3. make the tag dictionary from the corpus
tag_dictionary = corpus.make_tag_dictionary(tag_type=tag_type)

# 4. initialize each embedding we use
embedding_types = [

    # English GloVe embeddings
    WordEmbeddings('glove'),

    # German FastText embeddings
    WordEmbeddings('de'),

    # multilingual contextual string embeddings, forward
    FlairEmbeddings('multi-forward'),

    # multilingual contextual string embeddings, backward
    FlairEmbeddings('multi-backward'),
]

# embedding stack consists of GloVe, FastText and Flair embeddings
embeddings = StackedEmbeddings(embeddings=embedding_types)

# 5. initialize sequence tagger
from flair.models import SequenceTagger

tagger = SequenceTagger(hidden_size=256,
                        embeddings=embeddings,
                        tag_dictionary=tag_dictionary,
                        tag_type=tag_type)

# 6. initialize trainer
from flair.trainers import ModelTrainer

trainer = ModelTrainer(tagger, corpus)

# 7. run training
trainer.train('resources/taggers/ner-multi',
              train_with_dev=True,
              max_epochs=150)
```

---
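As context for step 4 of the script: StackedEmbeddings simply concatenates, per token, the vectors produced by each embedding in the list, and the combined vector is fed to the LSTM-CRF. A toy sketch of that idea (plain Python, illustrative only — real embedding dimensions are much larger):

```python
def stack_embeddings(per_source_vectors):
    """Concatenate, token by token, the vectors from each embedding source.

    per_source_vectors: list of sources, each a list of per-token vectors.
    All sources must cover the same number of tokens.
    """
    n_tokens = len(per_source_vectors[0])
    stacked = []
    for i in range(n_tokens):
        vec = []
        for source in per_source_vectors:
            vec.extend(source[i])  # append this source's vector for token i
        stacked.append(vec)
    return stacked

glove = [[0.1, 0.2], [0.3, 0.4]]  # 2 tokens, toy dim 2
flair_fwd = [[1.0], [2.0]]        # 2 tokens, toy dim 1
print(stack_embeddings([glove, flair_fwd]))
# [[0.1, 0.2, 1.0], [0.3, 0.4, 2.0]]
```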

### Cite

Please cite the following paper when using this model.

```
@inproceedings{akbik2018coling,
  title     = {Contextual String Embeddings for Sequence Labeling},
  author    = {Akbik, Alan and Blythe, Duncan and Vollgraf, Roland},
  booktitle = {{COLING} 2018, 27th International Conference on Computational Linguistics},
  pages     = {1638--1649},
  year      = {2018}
}
```
loss.tsv
ADDED
@@ -0,0 +1,151 @@
EPOCH TIMESTAMP BAD_EPOCHS LEARNING_RATE TRAIN_LOSS TRAIN_PRECISION TRAIN_RECALL TRAIN_ACCURACY TRAIN_F-SCORE DEV_LOSS DEV_PRECISION DEV_RECALL DEV_ACCURACY DEV_F-SCORE TEST_LOSS TEST_PRECISION TEST_RECALL TEST_ACCURACY TEST_F-SCORE
0 10:59:00 0 0.1000 2.953475790447307 _ _ _ _ _ _ _ _ _ _ 0.7908 0.7848 0.7878 0.7878
1 11:03:26 0 0.1000 1.5613928847477874 _ _ _ _ _ _ _ _ _ _ 0.8328 0.7913 0.8115 0.8115
2 11:07:47 0 0.1000 1.3328234553044571 _ _ _ _ _ _ _ _ _ _ 0.8269 0.8103 0.8185 0.8185
3 11:12:10 0 0.1000 1.1894718631716301 _ _ _ _ _ _ _ _ _ _ 0.8479 0.8346 0.8412 0.8412
4 11:16:31 0 0.1000 1.1186298550029252 _ _ _ _ _ _ _ _ _ _ 0.8536 0.828 0.8406 0.8406
5 11:20:59 0 0.1000 1.0496702689508508 _ _ _ _ _ _ _ _ _ _ 0.8618 0.8418 0.8517 0.8517
6 11:25:22 0 0.1000 0.9990235921427482 _ _ _ _ _ _ _ _ _ _ 0.854 0.852 0.853 0.853
7 11:29:41 0 0.1000 0.961911206602871 _ _ _ _ _ _ _ _ _ _ 0.8673 0.8407 0.8538 0.8538
8 11:34:05 0 0.1000 0.9181876736129632 _ _ _ _ _ _ _ _ _ _ 0.8708 0.8539 0.8623 0.8623
9 11:38:30 0 0.1000 0.907997776430991 _ _ _ _ _ _ _ _ _ _ 0.8633 0.8567 0.86 0.86
10 11:42:57 0 0.1000 0.8613800380307894 _ _ _ _ _ _ _ _ _ _ 0.8679 0.8518 0.8598 0.8598
11 11:47:27 0 0.1000 0.849987226690182 _ _ _ _ _ _ _ _ _ _ 0.8631 0.8565 0.8598 0.8598
12 11:51:48 0 0.1000 0.830854225634283 _ _ _ _ _ _ _ _ _ _ 0.873 0.8513 0.862 0.862
13 11:56:08 0 0.1000 0.8218251601785131 _ _ _ _ _ _ _ _ _ _ 0.8759 0.8565 0.8661 0.8661
14 12:00:32 0 0.1000 0.8054646737349248 _ _ _ _ _ _ _ _ _ _ 0.8756 0.8589 0.8672 0.8672
15 12:04:53 0 0.1000 0.7794681299046988 _ _ _ _ _ _ _ _ _ _ 0.8746 0.8583 0.8663 0.8664
16 12:09:13 0 0.1000 0.7848604690242548 _ _ _ _ _ _ _ _ _ _ 0.8802 0.8594 0.8697 0.8697
17 12:13:34 1 0.1000 0.7637730026230019 _ _ _ _ _ _ _ _ _ _ 0.8732 0.8601 0.8666 0.8666
18 12:17:54 0 0.1000 0.7584837641142651 _ _ _ _ _ _ _ _ _ _ 0.8809 0.8626 0.8716 0.8717
19 12:22:14 0 0.1000 0.7435564011696313 _ _ _ _ _ _ _ _ _ _ 0.8783 0.8606 0.8694 0.8694
20 12:26:37 0 0.1000 0.7319077155977514 _ _ _ _ _ _ _ _ _ _ 0.8839 0.8621 0.8728 0.8729
21 12:30:57 0 0.1000 0.7107181243636796 _ _ _ _ _ _ _ _ _ _ 0.8647 0.8614 0.863 0.863
22 12:35:17 0 0.1000 0.7171132942111534 _ _ _ _ _ _ _ _ _ _ 0.8764 0.8651 0.8707 0.8707
23 12:39:40 1 0.1000 0.7009657600265137 _ _ _ _ _ _ _ _ _ _ 0.8738 0.8665 0.8701 0.8701
24 12:44:00 0 0.1000 0.6989209016579343 _ _ _ _ _ _ _ _ _ _ 0.8829 0.8594 0.871 0.871
25 12:48:21 0 0.1000 0.6951734939474944 _ _ _ _ _ _ _ _ _ _ 0.8827 0.8569 0.8696 0.8696
26 12:52:43 0 0.1000 0.6822735162280682 _ _ _ _ _ _ _ _ _ _ 0.8842 0.857 0.8704 0.8704
27 12:57:07 0 0.1000 0.6838434803809537 _ _ _ _ _ _ _ _ _ _ 0.8753 0.8681 0.8717 0.8717
28 13:01:31 1 0.1000 0.6716522983858122 _ _ _ _ _ _ _ _ _ _ 0.8788 0.861 0.8698 0.8698
29 13:05:47 0 0.1000 0.6634576296755379 _ _ _ _ _ _ _ _ _ _ 0.877 0.8626 0.8697 0.8697
30 13:10:06 0 0.1000 0.6633003305447108 _ _ _ _ _ _ _ _ _ _ 0.8754 0.8626 0.869 0.869
31 13:14:26 0 0.1000 0.6601696083931253 _ _ _ _ _ _ _ _ _ _ 0.8841 0.8606 0.8722 0.8722
32 13:18:44 0 0.1000 0.6539714311143616 _ _ _ _ _ _ _ _ _ _ 0.8754 0.8662 0.8707 0.8708
33 13:23:01 0 0.1000 0.6405916364668743 _ _ _ _ _ _ _ _ _ _ 0.8819 0.8654 0.8736 0.8736
34 13:27:21 0 0.1000 0.642544928475003 _ _ _ _ _ _ _ _ _ _ 0.882 0.864 0.8729 0.8729
35 13:31:45 1 0.1000 0.641501997323611 _ _ _ _ _ _ _ _ _ _ 0.8767 0.8663 0.8715 0.8715
36 13:36:09 2 0.1000 0.635029950176255 _ _ _ _ _ _ _ _ _ _ 0.877 0.8708 0.8739 0.8739
37 13:40:36 0 0.1000 0.6317766632252801 _ _ _ _ _ _ _ _ _ _ 0.8828 0.8678 0.8752 0.8752
38 13:44:55 0 0.1000 0.6303625095185952 _ _ _ _ _ _ _ _ _ _ 0.8761 0.8667 0.8714 0.8714
39 13:49:12 0 0.1000 0.6248182013889186 _ _ _ _ _ _ _ _ _ _ 0.8877 0.8609 0.8741 0.8741
40 13:53:34 0 0.1000 0.6206092986888428 _ _ _ _ _ _ _ _ _ _ 0.8792 0.8651 0.8721 0.8721
41 13:57:59 0 0.1000 0.6217844312872125 _ _ _ _ _ _ _ _ _ _ 0.8822 0.8692 0.8756 0.8757
42 14:02:20 1 0.1000 0.6284974648759417 _ _ _ _ _ _ _ _ _ _ 0.8747 0.8647 0.8696 0.8697
43 14:06:42 2 0.1000 0.613935504287663 _ _ _ _ _ _ _ _ _ _ 0.8793 0.8622 0.8707 0.8707
44 14:10:58 0 0.1000 0.6208729479739985 _ _ _ _ _ _ _ _ _ _ 0.8809 0.8699 0.8754 0.8754
45 14:15:26 1 0.1000 0.61270951145969 _ _ _ _ _ _ _ _ _ _ 0.8794 0.8614 0.8703 0.8703
46 14:19:48 0 0.1000 0.6128666626890906 _ _ _ _ _ _ _ _ _ _ 0.8793 0.8676 0.8734 0.8734
47 14:24:18 1 0.1000 0.6043824241760176 _ _ _ _ _ _ _ _ _ _ 0.8854 0.862 0.8736 0.8735
48 14:28:40 0 0.1000 0.5973783223691089 _ _ _ _ _ _ _ _ _ _ 0.8801 0.8669 0.8735 0.8735
49 14:32:56 0 0.1000 0.6026040092081145 _ _ _ _ _ _ _ _ _ _ 0.8817 0.8623 0.8719 0.8719
50 14:37:26 1 0.1000 0.5987746510701575 _ _ _ _ _ _ _ _ _ _ 0.8747 0.868 0.8713 0.8713
51 14:41:50 2 0.1000 0.5947843462103743 _ _ _ _ _ _ _ _ _ _ 0.8775 0.8675 0.8724 0.8725
52 14:46:16 0 0.1000 0.5919834629086921 _ _ _ _ _ _ _ _ _ _ 0.8813 0.8647 0.8729 0.8729
53 14:50:34 0 0.1000 0.5937702719695134 _ _ _ _ _ _ _ _ _ _ 0.8823 0.8702 0.8762 0.8762
54 14:54:55 1 0.1000 0.5917879479803156 _ _ _ _ _ _ _ _ _ _ 0.8805 0.8622 0.8713 0.8713
55 14:59:18 0 0.1000 0.5933259962781406 _ _ _ _ _ _ _ _ _ _ 0.8819 0.8626 0.8721 0.8721
56 15:03:37 1 0.1000 0.5928614687883759 _ _ _ _ _ _ _ _ _ _ 0.8779 0.8681 0.873 0.873
57 15:07:55 2 0.1000 0.5851544519694847 _ _ _ _ _ _ _ _ _ _ 0.882 0.8651 0.8735 0.8735
58 15:12:12 0 0.1000 0.5841915451073357 _ _ _ _ _ _ _ _ _ _ 0.8788 0.8674 0.8731 0.8731
59 15:16:31 0 0.1000 0.59194735543873 _ _ _ _ _ _ _ _ _ _ 0.8793 0.8678 0.8735 0.8735
60 15:20:51 1 0.1000 0.5790510222227323 _ _ _ _ _ _ _ _ _ _ 0.8737 0.8687 0.8712 0.8712
61 15:25:10 0 0.1000 0.5804060975380279 _ _ _ _ _ _ _ _ _ _ 0.8782 0.8652 0.8717 0.8717
62 15:29:39 1 0.1000 0.585388834950364 _ _ _ _ _ _ _ _ _ _ 0.8735 0.867 0.8702 0.8702
63 15:33:56 2 0.1000 0.5736780315191722 _ _ _ _ _ _ _ _ _ _ 0.8826 0.8628 0.8726 0.8726
64 15:38:19 0 0.1000 0.5671733145606986 _ _ _ _ _ _ _ _ _ _ 0.8769 0.8624 0.8696 0.8696
65 15:42:45 0 0.1000 0.5727116274737428 _ _ _ _ _ _ _ _ _ _ 0.8822 0.8609 0.8714 0.8714
66 15:47:18 1 0.1000 0.5807550209185521 _ _ _ _ _ _ _ _ _ _ 0.8807 0.8672 0.8739 0.8739
67 15:51:42 2 0.1000 0.5776320083779413 _ _ _ _ _ _ _ _ _ _ 0.8787 0.8712 0.875 0.8749
68 15:56:01 3 0.1000 0.5778845083474792 _ _ _ _ _ _ _ _ _ _ 0.8781 0.8673 0.8726 0.8727
69 16:00:29 0 0.0500 0.5233265276618764 _ _ _ _ _ _ _ _ _ _ 0.881 0.8682 0.8746 0.8746
70 16:04:55 0 0.0500 0.5091402300838657 _ _ _ _ _ _ _ _ _ _ 0.8804 0.8699 0.8751 0.8751
71 16:09:22 0 0.0500 0.48569875567842435 _ _ _ _ _ _ _ _ _ _ 0.8841 0.869 0.8765 0.8765
72 16:13:42 0 0.0500 0.47439900001701624 _ _ _ _ _ _ _ _ _ _ 0.8834 0.8712 0.8773 0.8773
73 16:17:59 0 0.0500 0.46478448706398684 _ _ _ _ _ _ _ _ _ _ 0.8816 0.8712 0.8764 0.8764
74 16:22:18 0 0.0500 0.4667157404082863 _ _ _ _ _ _ _ _ _ _ 0.8767 0.8718 0.8742 0.8742
75 16:26:39 1 0.0500 0.4609185567276217 _ _ _ _ _ _ _ _ _ _ 0.8858 0.8714 0.8785 0.8785
76 16:31:02 0 0.0500 0.4479405909302511 _ _ _ _ _ _ _ _ _ _ 0.8851 0.8672 0.8761 0.8761
77 16:35:21 0 0.0500 0.4526363062460368 _ _ _ _ _ _ _ _ _ _ 0.8813 0.87 0.8756 0.8756
78 16:39:40 1 0.0500 0.4462794515947487 _ _ _ _ _ _ _ _ _ _ 0.8783 0.8696 0.8739 0.8739
79 16:44:05 0 0.0500 0.43589636060524034 _ _ _ _ _ _ _ _ _ _ 0.8823 0.8707 0.8765 0.8765
80 16:48:24 0 0.0500 0.4365409203967733 _ _ _ _ _ _ _ _ _ _ 0.8838 0.8702 0.877 0.8769
81 16:52:43 1 0.0500 0.43502475776572713 _ _ _ _ _ _ _ _ _ _ 0.8792 0.8733 0.8762 0.8762
82 16:57:03 0 0.0500 0.4373334978176562 _ _ _ _ _ _ _ _ _ _ 0.882 0.8707 0.8763 0.8763
83 17:01:26 1 0.0500 0.4334466782543237 _ _ _ _ _ _ _ _ _ _ 0.8808 0.8683 0.8745 0.8745
84 17:05:54 0 0.0500 0.4254087798321586 _ _ _ _ _ _ _ _ _ _ 0.8839 0.87 0.8769 0.8769
85 17:10:19 0 0.0500 0.4255044488190453 _ _ _ _ _ _ _ _ _ _ 0.8822 0.8675 0.8748 0.8748
86 17:14:47 1 0.0500 0.4202859611302876 _ _ _ _ _ _ _ _ _ _ 0.8817 0.872 0.8768 0.8768
87 17:19:05 0 0.0500 0.41523468196793106 _ _ _ _ _ _ _ _ _ _ 0.8786 0.8695 0.874 0.874
88 17:23:25 0 0.0500 0.4162545773211675 _ _ _ _ _ _ _ _ _ _ 0.8803 0.8712 0.8757 0.8757
89 17:27:55 1 0.0500 0.4111110245408652 _ _ _ _ _ _ _ _ _ _ 0.8788 0.8717 0.8753 0.8752
90 17:32:31 0 0.0500 0.4167104086720782 _ _ _ _ _ _ _ _ _ _ 0.8783 0.8687 0.8735 0.8735
91 17:37:03 1 0.0500 0.41473309594586694 _ _ _ _ _ _ _ _ _ _ 0.88 0.8707 0.8753 0.8753
92 17:41:27 2 0.0500 0.41171511629929425 _ _ _ _ _ _ _ _ _ _ 0.8839 0.8683 0.8761 0.876
93 17:45:59 3 0.0500 0.4053407998584393 _ _ _ _ _ _ _ _ _ _ 0.8832 0.8707 0.8769 0.8769
94 17:50:21 0 0.0500 0.4051551429164539 _ _ _ _ _ _ _ _ _ _ 0.8825 0.871 0.8767 0.8767
95 17:54:42 0 0.0500 0.40289974819591906 _ _ _ _ _ _ _ _ _ _ 0.8832 0.8712 0.8771 0.8772
96 17:59:04 0 0.0500 0.3960453512888854 _ _ _ _ _ _ _ _ _ _ 0.881 0.8744 0.8777 0.8777
97 18:03:37 0 0.0500 0.4043394380894219 _ _ _ _ _ _ _ _ _ _ 0.8784 0.8728 0.8756 0.8756
98 18:07:59 1 0.0500 0.40074241869638155 _ _ _ _ _ _ _ _ _ _ 0.8811 0.8704 0.8757 0.8757
99 18:12:20 2 0.0500 0.4009374214579409 _ _ _ _ _ _ _ _ _ _ 0.8828 0.8684 0.8755 0.8755
100 18:16:44 3 0.0500 0.39415279716963364 _ _ _ _ _ _ _ _ _ _ 0.8837 0.8751 0.8794 0.8794
101 18:21:07 0 0.0500 0.39470267033507134 _ _ _ _ _ _ _ _ _ _ 0.8786 0.8718 0.8752 0.8752
102 18:25:27 1 0.0500 0.3907430959372549 _ _ _ _ _ _ _ _ _ _ 0.8818 0.8725 0.8771 0.8771
103 18:29:48 0 0.0500 0.3938351009347457 _ _ _ _ _ _ _ _ _ _ 0.8817 0.8707 0.8762 0.8762
104 18:34:16 1 0.0500 0.38270939616572686 _ _ _ _ _ _ _ _ _ _ 0.8821 0.8677 0.8749 0.8748
105 18:38:37 0 0.0500 0.3885159160635453 _ _ _ _ _ _ _ _ _ _ 0.883 0.8727 0.8778 0.8778
106 18:43:10 1 0.0500 0.3883291266625618 _ _ _ _ _ _ _ _ _ _ 0.886 0.8705 0.8782 0.8782
107 18:47:40 2 0.0500 0.38669439267632816 _ _ _ _ _ _ _ _ _ _ 0.8839 0.8701 0.8769 0.8769
108 18:52:10 3 0.0500 0.38662982775500127 _ _ _ _ _ _ _ _ _ _ 0.8825 0.8705 0.8765 0.8765
109 18:56:31 0 0.0250 0.3666808893379931 _ _ _ _ _ _ _ _ _ _ 0.8832 0.8721 0.8776 0.8776
110 19:00:55 0 0.0250 0.3527461102972647 _ _ _ _ _ _ _ _ _ _ 0.8798 0.8733 0.8766 0.8765
111 19:05:19 0 0.0250 0.3453178777930988 _ _ _ _ _ _ _ _ _ _ 0.8846 0.8704 0.8774 0.8774
112 19:09:39 0 0.0250 0.3501398392813719 _ _ _ _ _ _ _ _ _ _ 0.8822 0.873 0.8776 0.8776
113 19:14:09 1 0.0250 0.3436550526062968 _ _ _ _ _ _ _ _ _ _ 0.8858 0.8725 0.8791 0.8791
114 19:18:46 0 0.0250 0.3465682747716397 _ _ _ _ _ _ _ _ _ _ 0.8839 0.873 0.8784 0.8784
115 19:23:17 1 0.0250 0.3389248694715256 _ _ _ _ _ _ _ _ _ _ 0.8843 0.8736 0.8789 0.8789
116 19:27:41 0 0.0250 0.3388266236616183 _ _ _ _ _ _ _ _ _ _ 0.8823 0.8742 0.8782 0.8782
117 19:32:02 0 0.0250 0.3426530549034204 _ _ _ _ _ _ _ _ _ _ 0.8824 0.8717 0.877 0.877
118 19:36:36 1 0.0250 0.33444241207663017 _ _ _ _ _ _ _ _ _ _ 0.8824 0.8726 0.8775 0.8775
119 19:41:13 0 0.0250 0.3356248891977349 _ _ _ _ _ _ _ _ _ _ 0.8833 0.8704 0.8768 0.8768
120 19:45:40 1 0.0250 0.3277514071869546 _ _ _ _ _ _ _ _ _ _ 0.8822 0.8754 0.8788 0.8788
121 19:50:00 0 0.0250 0.3324392681232043 _ _ _ _ _ _ _ _ _ _ 0.8829 0.8739 0.8784 0.8784
122 19:54:18 1 0.0250 0.32520957710642356 _ _ _ _ _ _ _ _ _ _ 0.8853 0.874 0.8796 0.8796
123 19:58:38 0 0.0250 0.3207463735829413 _ _ _ _ _ _ _ _ _ _ 0.8826 0.8746 0.8786 0.8786
124 20:02:57 0 0.0250 0.32199384285757643 _ _ _ _ _ _ _ _ _ _ 0.8843 0.8738 0.879 0.879
125 20:07:22 1 0.0250 0.32214300781759714 _ _ _ _ _ _ _ _ _ _ 0.8851 0.8716 0.8783 0.8783
126 20:11:41 2 0.0250 0.3181642439872496 _ _ _ _ _ _ _ _ _ _ 0.8848 0.873 0.8788 0.8789
127 20:15:58 0 0.0250 0.31885623830727194 _ _ _ _ _ _ _ _ _ _ 0.8833 0.8725 0.8779 0.8779
128 20:20:16 1 0.0250 0.31512833852472777 _ _ _ _ _ _ _ _ _ _ 0.8824 0.8718 0.8771 0.8771
129 20:24:36 0 0.0250 0.3152116099719734 _ _ _ _ _ _ _ _ _ _ 0.8867 0.873 0.8798 0.8798
130 20:28:55 1 0.0250 0.32019182619040343 _ _ _ _ _ _ _ _ _ _ 0.8847 0.8711 0.8779 0.8778
131 20:33:19 2 0.0250 0.3163907725520554 _ _ _ _ _ _ _ _ _ _ 0.8819 0.8723 0.877 0.8771
132 20:37:36 3 0.0250 0.3078251098573024 _ _ _ _ _ _ _ _ _ _ 0.8842 0.8717 0.8779 0.8779
133 20:42:02 0 0.0250 0.3111053387984559 _ _ _ _ _ _ _ _ _ _ 0.8839 0.8725 0.8781 0.8782
134 20:46:23 1 0.0250 0.3092448969837757 _ _ _ _ _ _ _ _ _ _ 0.8845 0.8732 0.8788 0.8788
135 20:50:44 2 0.0250 0.3134185765273288 _ _ _ _ _ _ _ _ _ _ 0.8838 0.8734 0.8785 0.8786
136 20:55:02 3 0.0250 0.3033614675062414 _ _ _ _ _ _ _ _ _ _ 0.8824 0.8723 0.8773 0.8773
137 20:59:22 0 0.0250 0.3163979164883878 _ _ _ _ _ _ _ _ _ _ 0.882 0.8731 0.8776 0.8775
138 21:03:39 1 0.0250 0.30996306355280934 _ _ _ _ _ _ _ _ _ _ 0.8818 0.8724 0.8771 0.8771
139 21:07:56 2 0.0250 0.30494013050054886 _ _ _ _ _ _ _ _ _ _ 0.8838 0.8712 0.8775 0.8775
140 21:12:21 3 0.0250 0.3057546090411619 _ _ _ _ _ _ _ _ _ _ 0.8857 0.8717 0.8787 0.8786
141 21:16:41 0 0.0125 0.2974806412130884 _ _ _ _ _ _ _ _ _ _ 0.8852 0.8742 0.8796 0.8797
142 21:21:00 0 0.0125 0.2924100736045671 _ _ _ _ _ _ _ _ _ _ 0.8826 0.8733 0.8779 0.8779
143 21:25:17 0 0.0125 0.28917630017535056 _ _ _ _ _ _ _ _ _ _ 0.8841 0.8744 0.8792 0.8792
144 21:29:37 0 0.0125 0.2891165876694287 _ _ _ _ _ _ _ _ _ _ 0.8855 0.8749 0.8801 0.8802
145 21:33:56 0 0.0125 0.2874728485910039 _ _ _ _ _ _ _ _ _ _ 0.884 0.8732 0.8786 0.8786
146 21:38:13 0 0.0125 0.28690377793817484 _ _ _ _ _ _ _ _ _ _ 0.8847 0.8717 0.8782 0.8782
147 21:42:38 0 0.0125 0.2853494226248391 _ _ _ _ _ _ _ _ _ _ 0.8851 0.874 0.8795 0.8795
148 21:47:06 0 0.0125 0.282234717166538 _ _ _ _ _ _ _ _ _ _ 0.8853 0.8744 0.8798 0.8798
149 21:51:23 0 0.0125 0.278328151237858 _ _ _ _ _ _ _ _ _ _ 0.8849 0.8734 0.8791 0.8791
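The BAD_EPOCHS and LEARNING_RATE columns of loss.tsv show Flair's anneal-on-plateau schedule: after several consecutive epochs without a new best loss, the learning rate is multiplied by an annealing factor (0.1 → 0.05 around epoch 69, → 0.025 around epoch 109, → 0.0125 around epoch 141). A minimal sketch of that bookkeeping in plain Python — the exact trigger condition in Flair's trainer may differ slightly; `patience=3` and `anneal_factor=0.5` are the defaults assumed here:

```python
def anneal_on_plateau(losses, lr=0.1, patience=3, anneal_factor=0.5):
    """Track bad epochs and anneal the learning rate after `patience`
    consecutive epochs without a new best loss. Returns, per epoch,
    the (bad_epochs, learning_rate) pair as logged in loss.tsv."""
    best = float('inf')
    bad_epochs = 0
    schedule = []
    for loss in losses:
        if loss < best:
            best, bad_epochs = loss, 0      # new best: reset the counter
        else:
            bad_epochs += 1                 # no improvement this epoch
        if bad_epochs >= patience:
            lr *= anneal_factor             # anneal and start over
            bad_epochs = 0
        schedule.append((bad_epochs, lr))
    return schedule

print(anneal_on_plateau([1.0, 0.9, 0.95, 0.96, 0.97, 0.98, 0.5]))
```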
pytorch_model.bin
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:d81cca5b24f295c7a09b36c37353c55d528960fb6218ccb5d6b71c0bc98dfd0e
size 1512864032
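The three lines above are not the model weights themselves but a Git LFS pointer: simple `key value` pairs naming the spec version, the SHA-256 object id, and the byte size of the real file. As a small sketch, the pointer can be parsed like this (plain Python; `parse_lfs_pointer` is an illustrative helper, not part of any LFS tooling):

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse a Git LFS pointer file into its key/value fields."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(' ')  # each line is "key value"
        fields[key] = value
    return fields

pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:d81cca5b24f295c7a09b36c37353c55d528960fb6218ccb5d6b71c0bc98dfd0e
size 1512864032"""

info = parse_lfs_pointer(pointer)
print(f"{int(info['size']) / 1e9:.2f} GB")  # 1.51 GB
```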
training.log
ADDED
The diff for this file is too large to render.