|
--- |
|
license: apache-2.0 |
|
datasets: |
|
- mbruton/spanish_srl |
|
- CoNLL-2012 |
|
- PropBank.Br |
|
language: |
|
- es |
|
- en |
|
- pt |
|
metrics: |
|
- seqeval |
|
library_name: transformers |
|
pipeline_tag: token-classification |
|
--- |
|
|
|
# Model Card for SpaBERT for Semantic Role Labeling (cased) |
|
|
|
This model is fine-tuned from a version of [multilingual BERT](https://huggingface.co/bert-base-multilingual-cased) that was pre-trained on the SRL task for English and Portuguese, and is one of 24 models introduced as part of [this project](https://github.com/mbruton0426/GalicianSRL). |
|
|
|
## Model Details |
|
|
|
### Model Description |
|
|
|
SpaBERT for Semantic Role Labeling (SRL) is a transformers model, leveraging multilingual BERT's extensive pretraining on 104 languages to achieve better SRL predictions for Spanish. This model is additionally pre-trained on the SRL task for English and Portuguese. It was fine-tuned on Spanish with the following objectives: |
|
|
|
- Identify up to 16 verbal roots within a sentence. |
|
- Identify available arguments and thematic roles for each verbal root. |
|
|
|
Labels are formatted as `r#:tag`, where `r#` links the token to a specific verbal root of index `#`, and `tag` identifies the token as the verbal root (root) or as an individual argument (arg0/arg1/arg2/arg3/argM) together with its thematic role (adv/agt/atr/ben/cau/cot/des/efi/ein/exp/ext/fin/ins/loc/mnr/ori/pat/src/tem/tmp). |
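As a minimal sketch of how a predicted tag can be decomposed, the helper below splits labels of the shape shown in the results table (e.g. `0:arg1:pat`, `2:root`). The function name `parse_label` is illustrative, not part of this model's API:

```python
def parse_label(label: str):
    """Split a tag like '0:arg1:pat' into (root_index, argument, role).

    Root tags such as '2:root' mark the verbal root itself, so no
    thematic role applies and None is returned in its place.
    """
    parts = label.split(":")
    root_index = int(parts[0])
    if parts[1] == "root":
        return root_index, "root", None
    return root_index, parts[1], parts[2]
```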
|
|
|
- **Developed by:** [Micaella Bruton](mailto:micaellabruton@gmail.com) |
|
- **Model type:** Transformers |
|
- **Language(s) (NLP):** Spanish (es), English (en), Portuguese (pt) |
|
- **License:** Apache 2.0 |
|
- **Finetuned from model:** [English & Portuguese pre-trained multilingual BERT](https://huggingface.co/liaad/srl-enpt_mbert-base) |
|
|
|
### Model Sources |
|
|
|
- **Repository:** [GalicianSRL](https://github.com/mbruton0426/GalicianSRL) |
|
- **Paper:** To be updated |
|
|
|
## Uses |
|
|
|
This model is intended to be used to develop and improve natural language processing tools for Spanish. |
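A minimal inference sketch using the transformers token-classification pipeline. The repository id below is a placeholder (substitute this model's actual Hub id), and the example sentence is illustrative:

```python
from transformers import pipeline

# Placeholder repository id -- replace with this model's actual Hub id.
srl = pipeline("token-classification", model="<this-model-hub-id>")

# Each returned entry carries the token, its r#:tag label, and a confidence score.
for token in srl("El gato comió el pescado."):
    print(token["word"], token["entity"], round(token["score"], 3))
```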
|
|
|
## Bias, Risks, and Limitations |
|
|
|
The Spanish training set lacked highly complex sentences; as a result, the model performs much better on sentences of mid to low complexity. |
|
|
|
## Training Details |
|
|
|
### Training Data |
|
|
|
This model was pre-trained on the [OntoNotes 5.0 English SRL corpus](http://catalog.ldc.upenn.edu/LDC2013T19) and [PropBank.Br Portuguese SRL corpus](http://www.nilc.icmc.usp.br/portlex/index.php/en/projects/propbankbringl). |
|
This model was fine-tuned on the "train" portion of the [SpanishSRL Dataset](https://huggingface.co/datasets/mbruton/spanish_srl) produced as part of this same project. |
|
|
|
#### Training Hyperparameters |
|
|
|
- **Learning Rate:** 2e-5 |
|
- **Batch Size:** 16 |
|
- **Weight Decay:** 0.01 |
|
- **Early Stopping:** 10 epochs |
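The hyperparameters above can be expressed as a transformers `TrainingArguments` configuration. This is a sketch, not the exact training script: the early-stopping value is read here as a patience of 10 epochs, and the output directory and evaluation/save strategies are assumptions:

```python
from transformers import TrainingArguments, EarlyStoppingCallback

# Reported hyperparameters; remaining arguments are illustrative assumptions.
args = TrainingArguments(
    output_dir="spabert-srl",          # assumed name
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    weight_decay=0.01,
    evaluation_strategy="epoch",
    save_strategy="epoch",
    load_best_model_at_end=True,       # required for early stopping
)
early_stopping = EarlyStoppingCallback(early_stopping_patience=10)
```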
|
|
|
## Evaluation |
|
|
|
#### Testing Data |
|
|
|
This model was tested on the "test" portion of the [SpanishSRL Dataset](https://huggingface.co/datasets/mbruton/spanish_srl) produced as part of this same project. |
|
|
|
#### Metrics |
|
|
|
[seqeval](https://huggingface.co/spaces/evaluate-metric/seqeval) is a Python framework for sequence labeling evaluation. It can evaluate the performance of chunking tasks such as named-entity recognition, part-of-speech tagging, and semantic role labeling. |
|
It provides scores both overall and per label type. |
|
|
|
Overall: |
|
- `accuracy`: the average [accuracy](https://huggingface.co/metrics/accuracy), on a scale between 0.0 and 1.0. |
|
- `precision`: the average [precision](https://huggingface.co/metrics/precision), on a scale between 0.0 and 1.0. |
|
- `recall`: the average [recall](https://huggingface.co/metrics/recall), on a scale between 0.0 and 1.0. |
|
- `f1`: the average [F1 score](https://huggingface.co/metrics/f1), which is the harmonic mean of the precision and recall. It also has a scale of 0.0 to 1.0. |
|
|
|
Per label type: |
|
- `precision`: the average [precision](https://huggingface.co/metrics/precision), on a scale between 0.0 and 1.0. |
|
- `recall`: the average [recall](https://huggingface.co/metrics/recall), on a scale between 0.0 and 1.0. |
|
- `f1`: the average [F1 score](https://huggingface.co/metrics/f1), on a scale between 0.0 and 1.0. |
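To make the overall scores concrete, the pure-Python sketch below computes micro-averaged precision, recall, and F1 from gold and predicted tag sequences. `micro_prf` is a hypothetical helper counting at the token level; seqeval itself scores at the span level, which is stricter:

```python
def micro_prf(gold, pred, outside="O"):
    """Micro-averaged precision/recall/F1 over non-O tags (token-level sketch)."""
    # True positive: predicted tag matches a non-O gold tag.
    tp = sum(g == p != outside for g, p in zip(gold, pred))
    # False positive: a non-O prediction that disagrees with gold.
    fp = sum(p != outside and g != p for g, p in zip(gold, pred))
    # False negative: a non-O gold tag that was missed or mislabeled.
    fn = sum(g != outside and g != p for g, p in zip(gold, pred))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    # F1 is the harmonic mean of precision and recall.
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1
```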
|
|
|
### Results |
|
|
|
| Label | Precision | Recall | F1-score | Support | |
|
| :----------: | :-------: | :----: | :------: | :-----: | |
|
| 0:arg0:agt | 0.93 | 0.90 | 0.92 | 867 | |
|
| 0:arg0:cau | 0.64 | 0.68 | 0.66 | 57 | |
|
| 0:arg0:src | 0.00 | 0.00 | 0.00 | 1 | |
|
| 0:arg1:ext | 0.00 | 0.00 | 0.00 | 3 | |
|
| 0:arg1:pat | 0.87 | 0.88 | 0.87 | 536 | |
|
| 0:arg1:tem | 0.88 | 0.88 | 0.88 | 589 | |
|
| 0:arg2:atr | 0.86 | 0.91 | 0.88 | 278 | |
|
| 0:arg2:ben | 0.75 | 0.86 | 0.80 | 78 | |
|
| 0:arg2:efi | 0.71 | 0.71 | 0.71 | 7 | |
|
| 0:arg2:exp | 0.00 | 0.00 | 0.00 | 6 | |
|
| 0:arg2:ext | 0.44 | 0.53 | 0.48 | 15 | |
|
| 0:arg2:loc | 0.59 | 0.56 | 0.58 | 57 | |
|
| 0:arg3:ben | 1.00 | 0.20 | 0.33 | 5 | |
|
| 0:arg3:ein | 0.00 | 0.00 | 0.00 | 1 | |
|
| 0:arg3:fin | 0.50 | 0.50 | 0.50 | 2 | |
|
| 0:arg3:ori | 0.55 | 0.60 | 0.57 | 10 | |
|
| 0:arg4:des | 0.52 | 0.81 | 0.63 | 16 | |
|
| 0:arg4:efi | 0.25 | 0.20 | 0.22 | 5 | |
|
| 0:argM:adv | 0.67 | 0.53 | 0.59 | 268 | |
|
| 0:argM:atr | 0.57 | 0.50 | 0.53 | 24 | |
|
| 0:argM:cau | 0.64 | 0.44 | 0.52 | 41 | |
|
| 0:argM:ext | 0.00 | 0.00 | 0.00 | 5 | |
|
| 0:argM:fin | 0.77 | 0.78 | 0.77 | 46 | |
|
| 0:argM:loc | 0.72 | 0.77 | 0.75 | 186 | |
|
| 0:argM:mnr | 0.62 | 0.62 | 0.62 | 66 | |
|
| 0:argM:tmp | 0.84 | 0.86 | 0.85 | 411 | |
|
| 0:root | 0.99 | 0.99 | 0.99 | 1662 | |
|
| 1:arg0:agt | 0.93 | 0.90 | 0.91 | 564 | |
|
| 1:arg0:cau | 0.81 | 0.77 | 0.79 | 44 | |
|
| 1:arg1:ext | 0.00 | 0.00 | 0.00 | 2 | |
|
| 1:arg1:pat | 0.85 | 0.90 | 0.87 | 482 | |
|
| 1:arg1:tem | 0.89 | 0.88 | 0.88 | 390 | |
|
| 1:arg2:atr | 0.83 | 0.89 | 0.86 | 197 | |
|
| 1:arg2:ben | 0.71 | 0.83 | 0.76 | 66 | |
|
| 1:arg2:efi | 0.67 | 0.33 | 0.44 | 6 | |
|
| 1:arg2:ext | 0.57 | 0.57 | 0.57 | 7 | |
|
| 1:arg2:ins | 0.00 | 0.00 | 0.00 | 1 | |
|
| 1:arg2:loc | 0.48 | 0.48 | 0.48 | 44 | |
|
| 1:arg3:ben | 0.00 | 0.00 | 0.00 | 2 | |
|
| 1:arg3:ein | 0.00 | 0.00 | 0.00 | 3 | |
|
| 1:arg3:fin | 1.00 | 1.00 | 1.00 | 2 | |
|
| 1:arg3:ori | 0.12 | 0.50 | 0.20 | 2 | |
|
| 1:arg4:des | 0.50 | 0.90 | 0.64 | 10 | |
|
| 1:arg4:efi | 0.00 | 0.00 | 0.00 | 2 | |
|
| 1:argM:adv | 0.67 | 0.49 | 0.57 | 220 | |
|
| 1:argM:atr | 0.65 | 0.58 | 0.61 | 19 | |
|
| 1:argM:cau | 0.58 | 0.74 | 0.65 | 35 | |
|
| 1:argM:ext | 0.33 | 0.14 | 0.20 | 7 | |
|
| 1:argM:fin | 0.54 | 0.74 | 0.62 | 38 | |
|
| 1:argM:loc | 0.66 | 0.77 | 0.71 | 156 | |
|
| 1:argM:mnr | 0.60 | 0.48 | 0.53 | 44 | |
|
| 1:argM:tmp | 0.78 | 0.83 | 0.80 | 247 | |
|
| 1:root | 0.96 | 0.96 | 0.96 | 1323 | |
|
| 2:arg0:agt | 0.86 | 0.88 | 0.87 | 336 | |
|
| 2:arg0:cau | 0.78 | 0.71 | 0.75 | 35 | |
|
| 2:arg0:exp | 0.00 | 0.00 | 0.00 | 1 | |
|
| 2:arg0:src | 0.00 | 0.00 | 0.00 | 1 | |
|
| 2:arg1:pat | 0.82 | 0.85 | 0.83 | 333 | |
|
| 2:arg1:tem | 0.85 | 0.84 | 0.84 | 291 | |
|
| 2:arg2:atr | 0.83 | 0.85 | 0.84 | 124 | |
|
| 2:arg2:ben | 0.69 | 0.79 | 0.74 | 43 | |
|
| 2:arg2:efi | 0.67 | 0.44 | 0.53 | 9 | |
|
| 2:arg2:ext | 0.25 | 0.20 | 0.22 | 5 | |
|
| 2:arg2:ins | 0.00 | 0.00 | 0.00 | 1 | |
|
| 2:arg2:loc | 0.42 | 0.63 | 0.51 | 27 | |
|
| 2:arg3:ben | 0.00 | 0.00 | 0.00 | 4 | |
|
| 2:arg3:ein | 0.00 | 0.00 | 0.00 | 1 | |
|
| 2:arg3:ori | 0.43 | 1.00 | 0.60 | 3 | |
|
| 2:arg4:des | 0.60 | 0.75 | 0.67 | 16 | |
|
| 2:arg4:efi | 0.00 | 0.00 | 0.00 | 6 | |
|
| 2:argM:adv | 0.52 | 0.46 | 0.49 | 176 | |
|
| 2:argM:atr | 0.58 | 0.47 | 0.52 | 15 | |
|
| 2:argM:cau | 0.50 | 0.59 | 0.54 | 17 | |
|
| 2:argM:ext | 0.00 | 0.00 | 0.00 | 4 | |
|
| 2:argM:fin | 0.74 | 0.69 | 0.71 | 36 | |
|
| 2:argM:ins | 0.00 | 0.00 | 0.00 | 1 | |
|
| 2:argM:loc | 0.67 | 0.70 | 0.68 | 117 | |
|
| 2:argM:mnr | 0.44 | 0.31 | 0.37 | 35 | |
|
| 2:argM:tmp | 0.74 | 0.77 | 0.76 | 161 | |
|
| 2:root | 0.93 | 0.93 | 0.93 | 913 | |
|
| 3:arg0:agt | 0.86 | 0.81 | 0.84 | 227 | |
|
| 3:arg0:cau | 0.69 | 0.64 | 0.67 | 14 | |
|
| 3:arg1:pat | 0.81 | 0.83 | 0.82 | 199 | |
|
| 3:arg1:tem | 0.71 | 0.81 | 0.76 | 160 | |
|
| 3:arg2:atr | 0.73 | 0.81 | 0.77 | 79 | |
|
| 3:arg2:ben | 0.75 | 0.78 | 0.76 | 27 | |
|
| 3:arg2:efi | 0.00 | 0.00 | 0.00 | 1 | |
|
| 3:arg2:ext | 0.00 | 0.00 | 0.00 | 3 | |
|
| 3:arg2:loc | 0.45 | 0.43 | 0.44 | 21 | |
|
| 3:arg3:ben | 0.00 | 0.00 | 0.00 | 3 | |
|
| 3:arg3:ein | 0.00 | 0.00 | 0.00 | 2 | |
|
| 3:arg3:ori | 0.00 | 0.00 | 0.00 | 3 | |
|
| 3:arg4:des | 0.40 | 0.86 | 0.55 | 7 | |
|
| 3:arg4:efi | 0.00 | 0.00 | 0.00 | 5 | |
|
| 3:argM:adv | 0.54 | 0.44 | 0.49 | 98 | |
|
| 3:argM:atr | 0.00 | 0.00 | 0.00 | 7 | |
|
| 3:argM:cau | 0.60 | 0.46 | 0.52 | 13 | |
|
| 3:argM:ext | 0.00 | 0.00 | 0.00 | 1 | |
|
| 3:argM:fin | 0.42 | 0.67 | 0.51 | 15 | |
|
| 3:argM:loc | 0.57 | 0.57 | 0.57 | 69 | |
|
| 3:argM:mnr | 0.23 | 0.27 | 0.25 | 11 | |
|
| 3:argM:tmp | 0.80 | 0.72 | 0.75 | 92 | |
|
| 3:root | 0.90 | 0.90 | 0.90 | 569 | |
|
| 4:arg0:agt | 0.77 | 0.82 | 0.80 | 119 | |
|
| 4:arg0:cau | 0.60 | 0.50 | 0.55 | 6 | |
|
| 4:arg1:pat | 0.70 | 0.80 | 0.75 | 87 | |
|
| 4:arg1:tem | 0.79 | 0.64 | 0.71 | 109 | |
|
| 4:arg2:atr | 0.70 | 0.79 | 0.74 | 53 | |
|
| 4:arg2:ben | 0.64 | 0.64 | 0.64 | 11 | |
|
| 4:arg2:ext | 0.00 | 0.00 | 0.00 | 1 | |
|
| 4:arg2:loc | 0.86 | 0.55 | 0.67 | 11 | |
|
| 4:arg3:ein | 0.00 | 0.00 | 0.00 | 1 | |
|
| 4:arg3:ori | 0.00 | 0.00 | 0.00 | 1 | |
|
| 4:arg4:des | 0.83 | 0.50 | 0.62 | 10 | |
|
| 4:arg4:efi | 0.00 | 0.00 | 0.00 | 1 | |
|
| 4:argM:adv | 0.47 | 0.48 | 0.48 | 50 | |
|
| 4:argM:atr | 0.00 | 0.00 | 0.00 | 4 | |
|
| 4:argM:cau | 0.00 | 0.00 | 0.00 | 3 | |
|
| 4:argM:ext | 0.00 | 0.00 | 0.00 | 1 | |
|
| 4:argM:fin | 0.36 | 0.36 | 0.36 | 11 | |
|
| 4:argM:loc | 0.54 | 0.88 | 0.67 | 24 | |
|
| 4:argM:mnr | 1.00 | 0.25 | 0.40 | 16 | |
|
| 4:argM:tmp | 0.70 | 0.63 | 0.67 | 52 | |
|
| 4:root | 0.83 | 0.84 | 0.83 | 322 | |
|
| 5:arg0:agt | 0.71 | 0.78 | 0.74 | 72 | |
|
| 5:arg0:cau | 1.00 | 0.20 | 0.33 | 5 | |
|
| 5:arg1:pat | 0.63 | 0.79 | 0.70 | 71 | |
|
| 5:arg1:tem | 0.69 | 0.49 | 0.57 | 41 | |
|
| 5:arg2:atr | 0.38 | 0.48 | 0.43 | 21 | |
|
| 5:arg2:ben | 0.33 | 0.67 | 0.44 | 6 | |
|
| 5:arg2:efi | 0.00 | 0.00 | 0.00 | 1 | |
|
| 5:arg2:ext | 0.00 | 0.00 | 0.00 | 1 | |
|
| 5:arg2:loc | 0.00 | 0.00 | 0.00 | 1 | |
|
| 5:arg3:ein | 0.00 | 0.00 | 0.00 | 1 | |
|
| 5:arg4:des | 0.50 | 1.00 | 0.67 | 1 | |
|
| 5:arg4:efi | 0.00 | 0.00 | 0.00 | 1 | |
|
| 5:argM:adv | 0.39 | 0.46 | 0.42 | 26 | |
|
| 5:argM:cau | 1.00 | 0.33 | 0.50 | 3 | |
|
| 5:argM:fin | 0.33 | 0.40 | 0.36 | 5 | |
|
| 5:argM:loc | 0.73 | 0.52 | 0.61 | 21 | |
|
| 5:argM:mnr | 0.00 | 0.00 | 0.00 | 7 | |
|
| 5:argM:tmp | 0.58 | 0.70 | 0.64 | 30 | |
|
| 5:root | 0.74 | 0.75 | 0.74 | 173 | |
|
| 6:arg0:agt | 0.62 | 0.53 | 0.57 | 34 | |
|
| 6:arg0:cau | 0.00 | 0.00 | 0.00 | 1 | |
|
| 6:arg1:loc | 0.00 | 0.00 | 0.00 | 1 | |
|
| 6:arg1:pat | 0.47 | 0.50 | 0.48 | 28 | |
|
| 6:arg1:tem | 0.56 | 0.56 | 0.56 | 16 | |
|
| 6:arg2:atr | 0.17 | 0.23 | 0.19 | 13 | |
|
| 6:arg2:ben | 0.00 | 0.00 | 0.00 | 5 | |
|
| 6:arg2:loc | 0.00 | 0.00 | 0.00 | 1 | |
|
| 6:arg3:ben | 0.00 | 0.00 | 0.00 | 1 | |
|
| 6:argM:adv | 0.15 | 0.40 | 0.22 | 10 | |
|
| 6:argM:atr | 0.00 | 0.00 | 0.00 | 2 | |
|
| 6:argM:cau | 0.00 | 0.00 | 0.00 | 1 | |
|
| 6:argM:fin | 0.00 | 0.00 | 0.00 | 2 | |
|
| 6:argM:loc | 0.29 | 0.71 | 0.42 | 7 | |
|
| 6:argM:mnr | 0.00 | 0.00 | 0.00 | 5 | |
|
| 6:argM:tmp | 0.15 | 0.29 | 0.20 | 7 | |
|
| 6:root | 0.68 | 0.62 | 0.65 | 82 | |
|
| 7:arg0:agt | 0.26 | 0.53 | 0.35 | 17 | |
|
| 7:arg1:pat | 0.25 | 0.29 | 0.27 | 17 | |
|
| 7:arg1:tem | 0.36 | 0.53 | 0.43 | 15 | |
|
| 7:arg2:atr | 0.17 | 0.13 | 0.15 | 15 | |
|
| 7:arg2:ben | 0.00 | 0.00 | 0.00 | 7 | |
|
| 7:arg2:loc | 0.00 | 0.00 | 0.00 | 1 | |
|
| 7:arg3:ori | 0.00 | 0.00 | 0.00 | 1 | |
|
| 7:arg4:des | 0.00 | 0.00 | 0.00 | 1 | |
|
| 7:argM:adv | 0.00 | 0.00 | 0.00 | 5 | |
|
| 7:argM:atr | 0.00 | 0.00 | 0.00 | 1 | |
|
| 7:argM:fin | 0.00 | 0.00 | 0.00 | 1 | |
|
| 7:argM:loc | 0.00 | 0.00 | 0.00 | 3 | |
|
| 7:argM:tmp | 0.00 | 0.00 | 0.00 | 6 | |
|
| 7:root | 0.64 | 0.64 | 0.64 | 45 | |
|
| 8:arg0:agt | 0.00 | 0.00 | 0.00 | 8 | |
|
| 8:arg0:cau | 0.00 | 0.00 | 0.00 | 1 | |
|
| 8:arg1:pat | 0.00 | 0.00 | 0.00 | 4 | |
|
| 8:arg1:tem | 0.00 | 0.00 | 0.00 | 9 | |
|
| 8:arg2:atr | 0.00 | 0.00 | 0.00 | 4 | |
|
| 8:arg2:ext | 0.00 | 0.00 | 0.00 | 1 | |
|
| 8:arg2:loc | 0.00 | 0.00 | 0.00 | 2 | |
|
| 8:arg3:ori | 0.00 | 0.00 | 0.00 | 1 | |
|
| 8:argM:adv | 0.00 | 0.00 | 0.00 | 8 | |
|
| 8:argM:ext | 0.00 | 0.00 | 0.00 | 1 | |
|
| 8:argM:fin | 0.00 | 0.00 | 0.00 | 1 | |
|
| 8:argM:loc | 0.00 | 0.00 | 0.00 | 4 | |
|
| 8:argM:mnr | 0.00 | 0.00 | 0.00 | 1 | |
|
| 8:argM:tmp | 0.00 | 0.00 | 0.00 | 1 | |
|
| 8:root | 0.38 | 0.68 | 0.49 | 25 | |
|
| 9:arg0:agt | 0.00 | 0.00 | 0.00 | 6 | |
|
| 9:arg0:cau | 0.00 | 0.00 | 0.00 | 1 | |
|
| 9:arg1:pat | 0.00 | 0.00 | 0.00 | 4 | |
|
| 9:arg1:tem | 0.00 | 0.00 | 0.00 | 5 | |
|
| 9:arg2:atr | 0.00 | 0.00 | 0.00 | 3 | |
|
| 9:arg2:ben | 0.00 | 0.00 | 0.00 | 1 | |
|
| 9:argM:adv | 0.00 | 0.00 | 0.00 | 6 | |
|
| 9:argM:cau | 0.00 | 0.00 | 0.00 | 1 | |
|
| 9:argM:fin | 0.00 | 0.00 | 0.00 | 2 | |
|
| 9:argM:loc | 0.00 | 0.00 | 0.00 | 2 | |
|
| 9:argM:tmp | 0.00 | 0.00 | 0.00 | 1 | |
|
| 9:root | 0.00 | 0.00 | 0.00 | 17 | |
|
| 10:arg0:agt | 0.00 | 0.00 | 0.00 | 3 | |
|
| 10:arg1:pat | 0.00 | 0.00 | 0.00 | 5 | |
|
| 10:arg1:tem | 0.00 | 0.00 | 0.00 | 3 | |
|
| 10:arg2:atr | 0.00 | 0.00 | 0.00 | 1 | |
|
| 10:arg2:ben | 0.00 | 0.00 | 0.00 | 2 | |
|
| 10:argM:adv | 0.00 | 0.00 | 0.00 | 3 | |
|
| 10:argM:fin | 0.00 | 0.00 | 0.00 | 1 | |
|
| 10:argM:tmp | 0.00 | 0.00 | 0.00 | 1 | |
|
| 10:root | 0.00 | 0.00 | 0.00 | 12 | |
|
| 11:arg0:agt | 0.00 | 0.00 | 0.00 | 1 | |
|
| 11:arg0:cau | 0.00 | 0.00 | 0.00 | 1 | |
|
| 11:arg1:pat | 0.00 | 0.00 | 0.00 | 2 | |
|
| 11:arg1:tem | 0.00 | 0.00 | 0.00 | 4 | |
|
| 11:arg2:atr | 0.00 | 0.00 | 0.00 | 3 | |
|
| 11:arg2:ben | 0.00 | 0.00 | 0.00 | 1 | |
|
| 11:argM:adv | 0.00 | 0.00 | 0.00 | 4 | |
|
| 11:argM:loc | 0.00 | 0.00 | 0.00 | 1 | |
|
| 11:argM:tmp | 0.00 | 0.00 | 0.00 | 1 | |
|
| 11:root | 0.00 | 0.00 | 0.00 | 9 | |
|
| 12:arg0:agt | 0.00 | 0.00 | 0.00 | 3 | |
|
| 12:arg1:pat | 0.00 | 0.00 | 0.00 | 1 | |
|
| 12:arg1:tem | 0.00 | 0.00 | 0.00 | 2 | |
|
| 12:arg2:atr | 0.00 | 0.00 | 0.00 | 2 | |
|
| 12:argM:adv | 0.00 | 0.00 | 0.00 | 1 | |
|
| 12:argM:cau | 0.00 | 0.00 | 0.00 | 1 | |
|
| 12:argM:tmp | 0.00 | 0.00 | 0.00 | 3 | |
|
| 12:root | 0.00 | 0.00 | 0.00 | 7 | |
|
| 13:arg0:cau | 0.00 | 0.00 | 0.00 | 1 | |
|
| 13:arg1:tem | 0.00 | 0.00 | 0.00 | 1 | |
|
| 13:arg2:atr | 0.00 | 0.00 | 0.00 | 1 | |
|
| 13:argM:adv | 0.00 | 0.00 | 0.00 | 1 | |
|
| 13:argM:atr | 0.00 | 0.00 | 0.00 | 1 | |
|
| 13:argM:loc | 0.00 | 0.00 | 0.00 | 1 | |
|
| 13:root | 0.00 | 0.00 | 0.00 | 4 | |
|
| 14:arg1:pat | 0.00 | 0.00 | 0.00 | 1 | |
|
| 14:arg2:ben | 0.00 | 0.00 | 0.00 | 1 | |
|
| 14:argM:mnr | 0.00 | 0.00 | 0.00 | 1 | |
|
| 14:root | 0.00 | 0.00 | 0.00 | 2 | |
|
| micro avg | 0.82 | 0.82 | 0.82 | 15436 | |
|
| macro avg | 0.31 | 0.31 | 0.30 | 15436 | |
|
| weighted avg | 0.81 | 0.82 | 0.81 | 15436 | |
|
| tot root avg | 0.47 | 0.49 | 0.48 | 344 | |
|
| tot arg0:agt avg | 0.46 | 0.47 | 0.46 | 2257 | |
|
| tot arg0:cau avg | 0.41 | 0.32 | 0.34 | 166 | |
|
| tot arg0:exp avg | 0.00 | 0.00 | 0.00 | 1 | |
|
| tot arg0:src avg | 0.00 | 0.00 | 0.00 | 2 | |
|
| tot arg0 | 0.39 | 0.36 | 0.36 | 2426 | |
|
| tot arg1:ext avg | 0.00 | 0.00 | 0.00 | 5 | |
|
| tot arg1:loc avg | 0.00 | 0.00 | 0.00 | 1 | |
|
| tot arg1:pat avg | 0.39 | 0.42 | 0.40 | 1770 | |
|
| tot arg1:tem avg | 0.41 | 0.40 | 0.40 | 1635 | |
|
| tot arg1 | 0.36 | 0.37 | 0.36 | 3411 | |
|
| tot arg2:atr avg | 0.33 | 0.36 | 0.35 | 794 | |
|
| tot arg2:ben avg | 0.33 | 0.42 | 0.36 | 255 | |
|
| tot arg2:efi avg | 0.41 | 0.30 | 0.34 | 24 | |
|
| tot arg2:exp avg | 0.00 | 0.00 | 0.00 | 6 | |
|
| tot arg2:ext avg | 0.18 | 0.19 | 0.18 | 33 | |
|
| tot arg2:ins avg | 0.00 | 0.00 | 0.00 | 2 | |
|
| tot arg2:loc avg | 0.31 | 0.29 | 0.30 | 165 | |
|
| tot arg2 | 0.30 | 0.31 | 0.30 | 1279 | |
|
| tot arg3:ben avg | 0.20 | 0.04 | 0.07 | 15 | |
|
| tot arg3:ein avg | 0.00 | 0.00 | 0.00 | 9 | |
|
| tot arg3:fin avg | 0.75 | 0.75 | 0.75 | 4 | |
|
| tot arg3:ori avg | 0.16 | 0.30 | 0.20 | 21 | |
|
| tot arg3 | 0.18 | 0.19 | 0.16 | 49 | |
|
| tot arg4:des avg | 0.48 | 0.69 | 0.54 | 61 | |
|
| tot arg4:efi avg | 0.04 | 0.03 | 0.04 | 20 | |
|
| tot arg4 | 0.28 | 0.39 | 0.31 | 81 | |
|
| tot argM:adv avg | 0.24 | 0.23 | 0.23 | 876 | |
|
| tot argM:atr avg | 0.23 | 0.19 | 0.21 | 73 | |
|
| tot argM:cau avg | 0.37 | 0.28 | 0.30 | 115 | |
|
| tot argM:ext avg | 0.06 | 0.02 | 0.03 | 19 | |
|
| tot argM:fin avg | 0.29 | 0.33 | 0.30 | 158 | |
|
| tot argM:ins avg | 0.00 | 0.00 | 0.00 | 1 | |
|
| tot argM:loc avg | 0.35 | 0.41 | 0.37 | 591 | |
|
| tot argM:mnr avg | 0.32 | 0.21 | 0.24 | 186 | |
|
| tot argM:tmp avg | 0.35 | 0.37 | 0.36 | 1013 | |
|
| tot argM | 0.29 | 0.27 | 0.27 | 3032 | |
|
| tot r0 avg | 0.57 | 0.54 | 0.54 | 5242 | |
|
| tot r1 avg | 0.54 | 0.56 | 0.54 | 3913 | |
|
| tot r2 avg | 0.46 | 0.48 | 0.46 | 2711 | |
|
| tot r3 avg | 0.41 | 0.43 | 0.42 | 1626 | |
|
| tot r4 avg | 0.47 | 0.41 | 0.42 | 892 | |
|
| tot r5 avg | 0.42 | 0.40 | 0.38 | 487 | |
|
| tot r6 avg | 0.18 | 0.23 | 0.19 | 216 | |
|
| tot r7 avg | 0.12 | 0.15 | 0.13 | 135 | |
|
| tot r8 avg | 0.03 | 0.05 | 0.03 | 71 | |
|
| tot r9 avg | 0.00 | 0.00 | 0.00 | 49 | |
|
| tot r10 avg | 0.00 | 0.00 | 0.00 | 31 | |
|
| tot r11 avg | 0.00 | 0.00 | 0.00 | 27 | |
|
| tot r12 avg | 0.00 | 0.00 | 0.00 | 20 | |
|
| tot r13 avg | 0.00 | 0.00 | 0.00 | 10 | |
|
| tot r14 avg | 0.00 | 0.00 | 0.00 | 5 | |
|
|
|
## Citation |
|
|
|
**BibTeX:** |
|
|
|
``` |
|
@mastersthesis{bruton-galician-srl-23, |
|
author = {Bruton, Micaella}, |
|
title = {BERTie Bott's Every Flavor Labels: A Tasty Guide to Developing a Semantic Role Labeling Model for Galician}, |
|
school = {Uppsala University}, |
|
year = {2023}, |
|
type = {Master's thesis}, |
|
} |
|
``` |