sofiaoliveira committed
Commit 8af257e
Parent(s): 74a61aa
Update README.md
README.md
CHANGED
@@ -59,15 +59,9 @@ To use the full SRL model (transformers portion + a decoding layer), refer to th
 The models were trained only for 5 epochs.
 The English data was preprocessed to match the Portuguese data, so there are some differences in role attributions and some roles were removed from the data.
 
-
-## Training data
-
-Pretrained weights were left identical to the original model [`bert-base-multilingual-cased`](https://huggingface.co/bert-base-multilingual-cased).
-
-
 ## Training procedure
 
-The
+The model was trained on the CoNLL-2012 dataset, preprocessed to match the Portuguese PropBank.Br data. It was tested on the PropBank.Br dataset as well as on a smaller opinion dataset, "Buscapé". For more information, please see the accompanying article (see BibTeX entry and citation info below) and the [project's GitHub](https://github.com/asofiaoliveira/srl_bert_pt).
 
 ## Eval results
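The hunk's context line describes the full SRL model as a transformers portion plus a decoding layer. As an illustration only (this is not the project's actual decoder, and the role labels here are hypothetical), a minimal sketch of turning per-token BIO role tags into labeled argument spans might look like:

```python
def bio_to_spans(tags):
    """Collapse a BIO tag sequence into (label, start, end) spans, end exclusive."""
    spans = []
    start, label = None, None
    for i, tag in enumerate(tags):
        if tag.startswith("B-"):
            # A B- tag closes any open span and starts a new one.
            if label is not None:
                spans.append((label, start, i))
            start, label = i, tag[2:]
        elif tag.startswith("I-") and label == tag[2:]:
            # Continuation of the current span.
            continue
        else:
            # "O" or an inconsistent I- tag closes the open span.
            if label is not None:
                spans.append((label, start, i))
            start, label = None, None
    if label is not None:
        spans.append((label, start, len(tags)))
    return spans

# Hypothetical PropBank-style tags for a six-token sentence.
tags = ["B-ARG0", "I-ARG0", "O", "B-V", "B-ARG1", "I-ARG1"]
print(bio_to_spans(tags))  # [('ARG0', 0, 2), ('V', 3, 4), ('ARG1', 4, 6)]
```

A real pipeline would run this over the decoding layer's argmax (or Viterbi) output per predicate; the span boundaries then index into the original tokens.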