---
license: apache-2.0
datasets:
- mbruton/spanish_srl
- CoNLL-2012
- PropBank.Br
language:
- es
- en
- pt
metrics:
- seqeval
library_name: transformers
pipeline_tag: token-classification
---
# Model Card for SpaXLM-R-enpt for Semantic Role Labeling
This model is a fine-tuned version of [XLM RoBERTa Base](https://huggingface.co/xlm-roberta-base) that was first pre-trained on the SRL task for English and Portuguese, and is one of 24 models introduced as part of [this project](https://github.com/mbruton0426/GalicianSRL).
## Model Details
### Model Description
SpaXLM-R-enpt for Semantic Role Labeling (SRL) is a transformers model, leveraging XLM-R's extensive pretraining on 100 languages to achieve better SRL predictions for Spanish. This model is additionally pre-trained on the SRL task for English and Portuguese. It was fine-tuned on Spanish with the following objectives:
- Identify up to 16 verbal roots within a sentence.
- Identify available arguments and thematic roles for each verbal root.
Labels are formatted as `r#:tag`, where `r#` links the token to a specific verbal root of index `#`, and `tag` identifies the token as either the verbal root (`root`) or an individual argument (`arg0`/`arg1`/`arg2`/`arg3`/`argM`) together with its thematic role (`adv`/`agt`/`atr`/`ben`/`cau`/`cot`/`des`/`efi`/`ein`/`exp`/`ext`/`fin`/`ins`/`loc`/`mnr`/`ori`/`pat`/`src`/`tem`/`tmp`).
- **Developed by:** [Micaella Bruton](mailto:micaellabruton@gmail.com)
- **Model type:** Transformers
- **Language(s) (NLP):** Spanish (es), English (en), Portuguese (pt)
- **License:** Apache 2.0
- **Finetuned from model:** [English & Portuguese pre-trained XLM RoBERTa Base](https://huggingface.co/liaad/srl-enpt_xlmr-base)
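As a quick illustration of the label scheme described above (not part of the original training code), a small helper can split a tag like `1:arg0:agt` into its root index, argument, and thematic role. The helper name is hypothetical:

```python
# Hypothetical helper illustrating the r#:tag label scheme described above.
def parse_srl_label(label: str):
    """Split a label like '1:arg0:agt' into (root_index, argument, role).

    Root labels carry no argument or role, e.g. '0:root' -> (0, 'root', None).
    """
    parts = label.split(":")
    root_index = int(parts[0])
    if parts[1] == "root":
        return (root_index, "root", None)
    # Argument labels carry a thematic role, e.g. 'arg0:agt' or 'argM:tmp'.
    argument, role = parts[1], parts[2]
    return (root_index, argument, role)

print(parse_srl_label("0:root"))      # (0, 'root', None)
print(parse_srl_label("1:arg0:agt"))  # (1, 'arg0', 'agt')
```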
### Model Sources
- **Repository:** [GalicianSRL](https://github.com/mbruton0426/GalicianSRL)
- **Paper:** To be updated
## Uses
This model is intended to be used to develop and improve natural language processing tools for Spanish.
## Bias, Risks, and Limitations
The Spanish training set lacked highly complex sentences; as a result, the model performs much better on sentences of mid- to low complexity.
## Training Details
### Training Data
This model was pre-trained on the [OntoNotes 5.0 English SRL corpus](http://catalog.ldc.upenn.edu/LDC2013T19) and [PropBank.Br Portuguese SRL corpus](http://www.nilc.icmc.usp.br/portlex/index.php/en/projects/propbankbringl).
This model was fine-tuned on the "train" portion of the [SpanishSRL Dataset](https://huggingface.co/datasets/mbruton/spanish_srl) produced as part of this same project.
#### Training Hyperparameters
- **Learning Rate:** 2e-5
- **Batch Size:** 16
- **Weight Decay:** 0.01
- **Early Stopping:** 10 epochs
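The exact training script lives in the linked repository; a minimal sketch of how these hyperparameters map onto 🤗 `TrainingArguments` is shown below. The `output_dir` and evaluation settings are assumptions, and "10 epochs" is read here as an early-stopping patience:

```python
# Sketch only: the hyperparameters from this card mapped onto transformers'
# TrainingArguments. output_dir and the evaluation strategy are assumptions.
from transformers import TrainingArguments, EarlyStoppingCallback

args = TrainingArguments(
    output_dir="spa_enpt_xlmr_srl",   # assumed name
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    weight_decay=0.01,
    evaluation_strategy="epoch",      # required for early stopping
    save_strategy="epoch",
    load_best_model_at_end=True,
)
# "Early Stopping: 10 epochs" interpreted as a patience of 10 evaluations.
early_stopping = EarlyStoppingCallback(early_stopping_patience=10)
```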
## Evaluation
#### Testing Data
This model was tested on the "test" portion of the [SpanishSRL Dataset](https://huggingface.co/datasets/mbruton/spanish_srl) produced as part of this same project.
#### Metrics
[seqeval](https://huggingface.co/spaces/evaluate-metric/seqeval) is a Python framework for sequence labeling evaluation. It can evaluate the performance of chunking tasks such as named-entity recognition, part-of-speech tagging, and semantic role labeling.
It provides scores both overall and per label type.
Overall:
- `accuracy`: the average [accuracy](https://huggingface.co/metrics/accuracy), on a scale between 0.0 and 1.0.
- `precision`: the average [precision](https://huggingface.co/metrics/precision), on a scale between 0.0 and 1.0.
- `recall`: the average [recall](https://huggingface.co/metrics/recall), on a scale between 0.0 and 1.0.
- `f1`: the average [F1 score](https://huggingface.co/metrics/f1), which is the harmonic mean of the precision and recall. It also has a scale of 0.0 to 1.0.
Per label type:
- `precision`: the average [precision](https://huggingface.co/metrics/precision), on a scale between 0.0 and 1.0.
- `recall`: the average [recall](https://huggingface.co/metrics/recall), on a scale between 0.0 and 1.0.
- `f1`: the average [F1 score](https://huggingface.co/metrics/f1), on a scale between 0.0 and 1.0.
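For intuition, the overall scores reduce to the usual definitions over true/false positives and false negatives. The sketch below is illustrative only; it is not the seqeval implementation, which scores at the span level rather than per token:

```python
# Illustrative precision/recall/F1 from TP/FP/FN counts.
# seqeval itself evaluates whole labeled spans; this only shows the formulas.
def prf(tp: int, fp: int, fn: int):
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

p, r, f = prf(tp=90, fp=10, fn=30)
print(round(p, 2), round(r, 2), round(f, 2))  # 0.9 0.75 0.82
```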
### Results
| Label | Precision | Recall | f1-score | Support |
| :----------: | :-------: | :----: | :------: | :-----: |
| 0:arg0:agt | 0.93 | 0.93 | 0.93 | 867 |
| 0:arg0:cau | 0.72 | 0.63 | 0.67 | 57 |
| 0:arg0:src | 0.00 | 0.00 | 0.00 | 1 |
| 0:arg1:ext | 0.00 | 0.00 | 0.00 | 3 |
| 0:arg1:pat | 0.87 | 0.91 | 0.89 | 536 |
| 0:arg1:tem | 0.89 | 0.88 | 0.89 | 589 |
| 0:arg2:atr | 0.85 | 0.88 | 0.87 | 278 |
| 0:arg2:ben | 0.82 | 0.85 | 0.84 | 78 |
| 0:arg2:efi | 0.75 | 0.43 | 0.55 | 7 |
| 0:arg2:exp | 0.62 | 0.83 | 0.71 | 6 |
| 0:arg2:ext | 0.58 | 0.47 | 0.52 | 15 |
| 0:arg2:loc | 0.56 | 0.53 | 0.54 | 57 |
| 0:arg3:ben | 0.00 | 0.00 | 0.00 | 5 |
| 0:arg3:ein | 0.00 | 0.00 | 0.00 | 1 |
| 0:arg3:fin | 1.00 | 0.50 | 0.67 | 2 |
| 0:arg3:ori | 0.44 | 0.40 | 0.42 | 10 |
| 0:arg4:des | 0.50 | 0.88 | 0.64 | 16 |
| 0:arg4:efi | 1.00 | 0.40 | 0.57 | 5 |
| 0:argM:adv | 0.63 | 0.49 | 0.55 | 268 |
| 0:argM:atr | 0.57 | 0.54 | 0.55 | 24 |
| 0:argM:cau | 0.68 | 0.56 | 0.61 | 41 |
| 0:argM:ext | 0.00 | 0.00 | 0.00 | 5 |
| 0:argM:fin | 0.81 | 0.74 | 0.77 | 46 |
| 0:argM:loc | 0.70 | 0.81 | 0.75 | 186 |
| 0:argM:mnr | 0.57 | 0.61 | 0.59 | 66 |
| 0:argM:tmp | 0.86 | 0.86 | 0.86 | 411 |
| 0:root | 0.99 | 0.99 | 0.99 | 1662 |
| 1:arg0:agt | 0.91 | 0.92 | 0.92 | 564 |
| 1:arg0:cau | 0.79 | 0.75 | 0.77 | 44 |
| 1:arg1:ext | 0.00 | 0.00 | 0.00 | 2 |
| 1:arg1:pat | 0.86 | 0.90 | 0.88 | 482 |
| 1:arg1:tem | 0.92 | 0.87 | 0.89 | 390 |
| 1:arg2:atr | 0.87 | 0.90 | 0.89 | 197 |
| 1:arg2:ben | 0.78 | 0.86 | 0.82 | 66 |
| 1:arg2:efi | 1.00 | 0.33 | 0.50 | 6 |
| 1:arg2:ext | 0.50 | 0.43 | 0.46 | 7 |
| 1:arg2:ins | 0.00 | 0.00 | 0.00 | 1 |
| 1:arg2:loc | 0.61 | 0.57 | 0.59 | 44 |
| 1:arg3:ben | 0.00 | 0.00 | 0.00 | 2 |
| 1:arg3:ein | 0.00 | 0.00 | 0.00 | 3 |
| 1:arg3:fin | 0.00 | 0.00 | 0.00 | 2 |
| 1:arg3:ori | 0.12 | 0.50 | 0.20 | 2 |
| 1:arg4:des | 0.48 | 1.00 | 0.65 | 10 |
| 1:arg4:efi | 0.00 | 0.00 | 0.00 | 2 |
| 1:argM:adv | 0.65 | 0.55 | 0.60 | 220 |
| 1:argM:atr | 0.71 | 0.79 | 0.75 | 19 |
| 1:argM:cau | 0.54 | 0.63 | 0.58 | 35 |
| 1:argM:ext | 0.00 | 0.00 | 0.00 | 7 |
| 1:argM:fin | 0.53 | 0.66 | 0.59 | 38 |
| 1:argM:loc | 0.67 | 0.79 | 0.73 | 156 |
| 1:argM:mnr | 0.47 | 0.43 | 0.45 | 44 |
| 1:argM:tmp | 0.79 | 0.82 | 0.80 | 247 |
| 1:root | 0.96 | 0.97 | 0.96 | 1323 |
| 2:arg0:agt | 0.87 | 0.90 | 0.88 | 336 |
| 2:arg0:cau | 0.81 | 0.71 | 0.76 | 35 |
| 2:arg0:exp | 0.00 | 0.00 | 0.00 | 1 |
| 2:arg0:src | 0.00 | 0.00 | 0.00 | 1 |
| 2:arg1:pat | 0.84 | 0.83 | 0.84 | 333 |
| 2:arg1:tem | 0.84 | 0.84 | 0.84 | 291 |
| 2:arg2:atr | 0.87 | 0.87 | 0.87 | 124 |
| 2:arg2:ben | 0.67 | 0.70 | 0.68 | 43 |
| 2:arg2:efi | 0.83 | 0.56 | 0.67 | 9 |
| 2:arg2:ext | 0.60 | 0.60 | 0.60 | 5 |
| 2:arg2:ins | 0.00 | 0.00 | 0.00 | 1 |
| 2:arg2:loc | 0.38 | 0.67 | 0.49 | 27 |
| 2:arg3:ben | 0.00 | 0.00 | 0.00 | 4 |
| 2:arg3:ein | 0.00 | 0.00 | 0.00 | 1 |
| 2:arg3:ori | 0.40 | 0.67 | 0.50 | 3 |
| 2:arg4:des | 0.58 | 0.88 | 0.70 | 16 |
| 2:arg4:efi | 0.00 | 0.00 | 0.00 | 6 |
| 2:argM:adv | 0.57 | 0.55 | 0.56 | 176 |
| 2:argM:atr | 0.71 | 0.33 | 0.45 | 15 |
| 2:argM:cau | 0.42 | 0.76 | 0.54 | 17 |
| 2:argM:ext | 0.00 | 0.00 | 0.00 | 4 |
| 2:argM:fin | 0.76 | 0.69 | 0.72 | 36 |
| 2:argM:ins | 0.00 | 0.00 | 0.00 | 1 |
| 2:argM:loc | 0.69 | 0.74 | 0.71 | 117 |
| 2:argM:mnr | 0.40 | 0.23 | 0.29 | 35 |
| 2:argM:tmp | 0.80 | 0.73 | 0.76 | 161 |
| 2:root | 0.94 | 0.93 | 0.94 | 913 |
| 3:arg0:agt | 0.86 | 0.82 | 0.84 | 227 |
| 3:arg0:cau | 0.69 | 0.64 | 0.67 | 14 |
| 3:arg1:pat | 0.82 | 0.86 | 0.84 | 199 |
| 3:arg1:tem | 0.79 | 0.79 | 0.79 | 160 |
| 3:arg2:atr | 0.75 | 0.80 | 0.77 | 79 |
| 3:arg2:ben | 0.86 | 0.89 | 0.87 | 27 |
| 3:arg2:efi | 0.00 | 0.00 | 0.00 | 1 |
| 3:arg2:ext | 0.00 | 0.00 | 0.00 | 3 |
| 3:arg2:loc | 0.42 | 0.52 | 0.47 | 21 |
| 3:arg3:ben | 0.00 | 0.00 | 0.00 | 3 |
| 3:arg3:ein | 0.00 | 0.00 | 0.00 | 2 |
| 3:arg3:ori | 0.00 | 0.00 | 0.00 | 3 |
| 3:arg4:des | 0.39 | 1.00 | 0.56 | 7 |
| 3:arg4:efi | 0.00 | 0.00 | 0.00 | 5 |
| 3:argM:adv | 0.53 | 0.47 | 0.50 | 98 |
| 3:argM:atr | 0.00 | 0.00 | 0.00 | 7 |
| 3:argM:cau | 0.44 | 0.54 | 0.48 | 13 |
| 3:argM:ext | 0.00 | 0.00 | 0.00 | 1 |
| 3:argM:fin | 0.48 | 0.80 | 0.60 | 15 |
| 3:argM:loc | 0.61 | 0.68 | 0.64 | 69 |
| 3:argM:mnr | 0.42 | 0.45 | 0.43 | 11 |
| 3:argM:tmp | 0.88 | 0.73 | 0.80 | 92 |
| 3:root | 0.90 | 0.91 | 0.90 | 569 |
| 4:arg0:agt | 0.76 | 0.87 | 0.81 | 119 |
| 4:arg0:cau | 1.00 | 0.67 | 0.80 | 6 |
| 4:arg1:pat | 0.78 | 0.79 | 0.79 | 87 |
| 4:arg1:tem | 0.80 | 0.76 | 0.78 | 109 |
| 4:arg2:atr | 0.77 | 0.87 | 0.81 | 53 |
| 4:arg2:ben | 0.64 | 0.64 | 0.64 | 11 |
| 4:arg2:ext | 0.00 | 0.00 | 0.00 | 1 |
| 4:arg2:loc | 0.73 | 0.73 | 0.73 | 11 |
| 4:arg3:ein | 0.00 | 0.00 | 0.00 | 1 |
| 4:arg3:ori | 0.00 | 0.00 | 0.00 | 1 |
| 4:arg4:des | 0.60 | 0.60 | 0.60 | 10 |
| 4:arg4:efi | 0.00 | 0.00 | 0.00 | 1 |
| 4:argM:adv | 0.42 | 0.56 | 0.48 | 50 |
| 4:argM:atr | 0.00 | 0.00 | 0.00 | 4 |
| 4:argM:cau | 0.11 | 0.33 | 0.17 | 3 |
| 4:argM:ext | 0.00 | 0.00 | 0.00 | 1 |
| 4:argM:fin | 0.46 | 0.55 | 0.50 | 11 |
| 4:argM:loc | 0.50 | 0.79 | 0.61 | 24 |
| 4:argM:mnr | 0.00 | 0.00 | 0.00 | 16 |
| 4:argM:tmp | 0.74 | 0.75 | 0.74 | 52 |
| 4:root | 0.88 | 0.87 | 0.87 | 322 |
| 5:arg0:agt | 0.77 | 0.81 | 0.79 | 72 |
| 5:arg0:cau | 0.00 | 0.00 | 0.00 | 5 |
| 5:arg1:pat | 0.71 | 0.79 | 0.75 | 71 |
| 5:arg1:tem | 0.69 | 0.59 | 0.63 | 41 |
| 5:arg2:atr | 0.61 | 0.52 | 0.56 | 21 |
| 5:arg2:ben | 0.43 | 0.50 | 0.46 | 6 |
| 5:arg2:efi | 0.00 | 0.00 | 0.00 | 1 |
| 5:arg2:ext | 0.00 | 0.00 | 0.00 | 1 |
| 5:arg2:loc | 0.00 | 0.00 | 0.00 | 1 |
| 5:arg3:ein | 0.00 | 0.00 | 0.00 | 1 |
| 5:arg4:des | 0.00 | 0.00 | 0.00 | 1 |
| 5:arg4:efi | 0.00 | 0.00 | 0.00 | 1 |
| 5:argM:adv | 0.46 | 0.46 | 0.46 | 26 |
| 5:argM:cau | 0.00 | 0.00 | 0.00 | 3 |
| 5:argM:fin | 0.50 | 0.60 | 0.55 | 5 |
| 5:argM:loc | 0.69 | 0.52 | 0.59 | 21 |
| 5:argM:mnr | 0.00 | 0.00 | 0.00 | 7 |
| 5:argM:tmp | 0.65 | 0.67 | 0.66 | 30 |
| 5:root | 0.77 | 0.80 | 0.79 | 173 |
| 6:arg0:agt | 0.70 | 0.47 | 0.56 | 34 |
| 6:arg0:cau | 0.00 | 0.00 | 0.00 | 1 |
| 6:arg1:loc | 0.00 | 0.00 | 0.00 | 1 |
| 6:arg1:pat | 0.54 | 0.71 | 0.62 | 28 |
| 6:arg1:tem | 0.50 | 0.44 | 0.47 | 16 |
| 6:arg2:atr | 0.24 | 0.38 | 0.29 | 13 |
| 6:arg2:ben | 0.45 | 1.00 | 0.62 | 5 |
| 6:arg2:loc | 0.00 | 0.00 | 0.00 | 1 |
| 6:arg3:ben | 0.00 | 0.00 | 0.00 | 1 |
| 6:argM:adv | 0.21 | 0.60 | 0.31 | 10 |
| 6:argM:atr | 0.00 | 0.00 | 0.00 | 2 |
| 6:argM:cau | 0.00 | 0.00 | 0.00 | 1 |
| 6:argM:fin | 0.00 | 0.00 | 0.00 | 2 |
| 6:argM:loc | 0.26 | 0.86 | 0.40 | 7 |
| 6:argM:mnr | 0.00 | 0.00 | 0.00 | 5 |
| 6:argM:tmp | 0.27 | 0.57 | 0.36 | 7 |
| 6:root | 0.65 | 0.62 | 0.63 | 82 |
| 7:arg0:agt | 0.41 | 0.94 | 0.57 | 17 |
| 7:arg1:pat | 0.57 | 0.71 | 0.63 | 17 |
| 7:arg1:tem | 0.41 | 0.60 | 0.49 | 15 |
| 7:arg2:atr | 0.31 | 0.27 | 0.29 | 15 |
| 7:arg2:ben | 0.00 | 0.00 | 0.00 | 7 |
| 7:arg2:loc | 0.00 | 0.00 | 0.00 | 1 |
| 7:arg3:ori | 0.00 | 0.00 | 0.00 | 1 |
| 7:arg4:des | 0.00 | 0.00 | 0.00 | 1 |
| 7:argM:adv | 0.00 | 0.00 | 0.00 | 5 |
| 7:argM:atr | 0.00 | 0.00 | 0.00 | 1 |
| 7:argM:fin | 0.00 | 0.00 | 0.00 | 1 |
| 7:argM:loc | 0.00 | 0.00 | 0.00 | 3 |
| 7:argM:tmp | 0.00 | 0.00 | 0.00 | 6 |
| 7:root | 0.64 | 0.76 | 0.69 | 45 |
| 8:arg0:agt | 0.07 | 0.12 | 0.09 | 8 |
| 8:arg0:cau | 0.00 | 0.00 | 0.00 | 1 |
| 8:arg1:pat | 0.00 | 0.00 | 0.00 | 4 |
| 8:arg1:tem | 0.12 | 0.11 | 0.12 | 9 |
| 8:arg2:atr | 0.00 | 0.00 | 0.00 | 4 |
| 8:arg2:ext | 0.00 | 0.00 | 0.00 | 1 |
| 8:arg2:loc | 0.00 | 0.00 | 0.00 | 2 |
| 8:arg3:ori | 0.00 | 0.00 | 0.00 | 1 |
| 8:argM:adv | 0.12 | 0.25 | 0.16 | 8 |
| 8:argM:ext | 0.00 | 0.00 | 0.00 | 1 |
| 8:argM:fin | 0.00 | 0.00 | 0.00 | 1 |
| 8:argM:loc | 0.00 | 0.00 | 0.00 | 4 |
| 8:argM:mnr | 0.00 | 0.00 | 0.00 | 1 |
| 8:argM:tmp | 0.25 | 1.00 | 0.40 | 1 |
| 8:root | 0.39 | 0.80 | 0.53 | 25 |
| 9:arg0:agt | 0.00 | 0.00 | 0.00 | 6 |
| 9:arg0:cau | 0.00 | 0.00 | 0.00 | 1 |
| 9:arg1:pat | 0.00 | 0.00 | 0.00 | 4 |
| 9:arg1:tem | 0.00 | 0.00 | 0.00 | 5 |
| 9:arg2:atr | 0.00 | 0.00 | 0.00 | 3 |
| 9:arg2:ben | 0.00 | 0.00 | 0.00 | 1 |
| 9:argM:adv | 0.00 | 0.00 | 0.00 | 6 |
| 9:argM:cau | 0.00 | 0.00 | 0.00 | 1 |
| 9:argM:fin | 0.00 | 0.00 | 0.00 | 2 |
| 9:argM:loc | 0.00 | 0.00 | 0.00 | 2 |
| 9:argM:tmp | 0.00 | 0.00 | 0.00 | 1 |
| 9:root | 0.06 | 0.12 | 0.08 | 17 |
| 10:arg0:agt | 0.00 | 0.00 | 0.00 | 3 |
| 10:arg1:pat | 0.00 | 0.00 | 0.00 | 5 |
| 10:arg1:tem | 0.00 | 0.00 | 0.00 | 3 |
| 10:arg2:atr | 0.00 | 0.00 | 0.00 | 1 |
| 10:arg2:ben | 0.00 | 0.00 | 0.00 | 2 |
| 10:argM:adv | 0.00 | 0.00 | 0.00 | 3 |
| 10:argM:fin | 0.00 | 0.00 | 0.00 | 1 |
| 10:argM:tmp | 0.00 | 0.00 | 0.00 | 1 |
| 10:root | 0.00 | 0.00 | 0.00 | 12 |
| 11:arg0:agt | 0.00 | 0.00 | 0.00 | 1 |
| 11:arg0:cau | 0.00 | 0.00 | 0.00 | 1 |
| 11:arg1:pat | 0.00 | 0.00 | 0.00 | 2 |
| 11:arg1:tem | 0.00 | 0.00 | 0.00 | 4 |
| 11:arg2:atr | 0.00 | 0.00 | 0.00 | 3 |
| 11:arg2:ben | 0.00 | 0.00 | 0.00 | 1 |
| 11:argM:adv | 0.00 | 0.00 | 0.00 | 4 |
| 11:argM:loc | 0.00 | 0.00 | 0.00 | 1 |
| 11:argM:tmp | 0.00 | 0.00 | 0.00 | 1 |
| 11:root | 0.00 | 0.00 | 0.00 | 9 |
| 12:arg0:agt | 0.00 | 0.00 | 0.00 | 3 |
| 12:arg1:pat | 0.00 | 0.00 | 0.00 | 1 |
| 12:arg1:tem | 0.00 | 0.00 | 0.00 | 2 |
| 12:arg2:atr | 0.00 | 0.00 | 0.00 | 2 |
| 12:argM:adv | 0.00 | 0.00 | 0.00 | 1 |
| 12:argM:cau | 0.00 | 0.00 | 0.00 | 1 |
| 12:argM:tmp | 0.00 | 0.00 | 0.00 | 3 |
| 12:root | 0.00 | 0.00 | 0.00 | 7 |
| 13:arg0:cau | 0.00 | 0.00 | 0.00 | 1 |
| 13:arg1:tem | 0.00 | 0.00 | 0.00 | 1 |
| 13:arg2:atr | 0.00 | 0.00 | 0.00 | 1 |
| 13:argM:adv | 0.00 | 0.00 | 0.00 | 1 |
| 13:argM:atr | 0.00 | 0.00 | 0.00 | 1 |
| 13:argM:loc | 0.00 | 0.00 | 0.00 | 1 |
| 13:root | 0.00 | 0.00 | 0.00 | 4 |
| 14:arg1:pat | 0.00 | 0.00 | 0.00 | 1 |
| 14:arg2:ben | 0.00 | 0.00 | 0.00 | 1 |
| 14:argM:mnr | 0.00 | 0.00 | 0.00 | 1 |
| 14:root | 0.00 | 0.00 | 0.00 | 2 |
| micro avg | 0.83 | 0.83 | 0.83 | 15436 |
| macro avg | 0.31 | 0.34 | 0.31 | 15436 |
| weighted avg | 0.82 | 0.83 | 0.83 | 15436 |
| tot root avg | 0.48 | 0.52 | 0.49 | 344 |
| tot arg0:agt avg | 0.48 | 0.52 | 0.49 | 2257 |
| tot arg0:cau avg | 0.36 | 0.31 | 0.33 | 166 |
| tot arg0:exp avg | 0.00 | 0.00 | 0.00 | 1 |
| tot arg0:src avg | 0.00 | 0.00 | 0.00 | 2 |
| tot arg0 | 0.38 | 0.38 | 0.37 | 2426 |
| tot arg1:ext avg | 0.00 | 0.00 | 0.00 | 5 |
| tot arg1:loc avg | 0.00 | 0.00 | 0.00 | 1 |
| tot arg1:pat avg | 0.43 | 0.46 | 0.45 | 1770 |
| tot arg1:tem avg | 0.43 | 0.42 | 0.42 | 1635 |
| tot arg1 | 0.39 | 0.40 | 0.39 | 3411 |
| tot arg2:atr avg | 0.38 | 0.39 | 0.38 | 794 |
| tot arg2:ben avg | 0.39 | 0.50 | 0.42 | 255 |
| tot arg2:efi avg | 0.52 | 0.26 | 0.34 | 24 |
| tot arg2:exp avg | 0.62 | 0.83 | 0.71 | 6 |
| tot arg2:ext avg | 0.24 | 0.21 | 0.23 | 33 |
| tot arg2:ins avg | 0.00 | 0.00 | 0.00 | 2 |
| tot arg2:loc avg | 0.30 | 0.34 | 0.31 | 165 |
| tot arg2 | 0.35 | 0.36 | 0.35 | 1279 |
| tot arg3:ben avg | 0.00 | 0.00 | 0.00 | 15 |
| tot arg3:ein avg | 0.00 | 0.00 | 0.00 | 9 |
| tot arg3:fin avg | 0.50 | 0.25 | 0.34 | 4 |
| tot arg3:ori avg | 0.14 | 0.22 | 0.16 | 21 |
| tot arg3 | 0.10 | 0.10 | 0.09 | 49 |
| tot arg4:des avg | 0.36 | 0.62 | 0.45 | 61 |
| tot arg4:efi avg | 0.17 | 0.07 | 0.10 | 20 |
| tot arg4 | 0.27 | 0.37 | 0.29 | 81 |
| tot argM:adv avg | 0.26 | 0.28 | 0.26 | 876 |
| tot argM:atr avg | 0.25 | 0.21 | 0.22 | 73 |
| tot argM:cau avg | 0.24 | 0.31 | 0.26 | 115 |
| tot argM:ext avg | 0.00 | 0.00 | 0.00 | 19 |
| tot argM:fin avg | 0.32 | 0.37 | 0.34 | 158 |
| tot argM:ins avg | 0.00 | 0.00 | 0.00 | 1 |
| tot argM:loc avg | 0.34 | 0.43 | 0.37 | 591 |
| tot argM:mnr avg | 0.21 | 0.19 | 0.20 | 186 |
| tot argM:tmp avg | 0.40 | 0.47 | 0.41 | 1013 |
| tot argM | 0.27 | 0.31 | 0.28 | 3032 |
| tot r0 avg | 0.61 | 0.56 | 0.57 | 5242 |
| tot r1 avg | 0.51 | 0.53 | 0.50 | 3913 |
| tot r2 avg | 0.48 | 0.49 | 0.47 | 2711 |
| tot r3 avg | 0.43 | 0.47 | 0.44 | 1626 |
| tot r4 avg | 0.44 | 0.47 | 0.44 | 892 |
| tot r5 avg | 0.33 | 0.33 | 0.33 | 487 |
| tot r6 avg | 0.22 | 0.33 | 0.25 | 216 |
| tot r7 avg | 0.17 | 0.23 | 0.19 | 135 |
| tot r8 avg | 0.06 | 0.15 | 0.09 | 71 |
| tot r9 avg | 0.01 | 0.01 | 0.01 | 49 |
| tot r10 avg | 0.00 | 0.00 | 0.00 | 31 |
| tot r11 avg | 0.00 | 0.00 | 0.00 | 27 |
| tot r12 avg | 0.00 | 0.00 | 0.00 | 20 |
| tot r13 avg | 0.00 | 0.00 | 0.00 | 10 |
| tot r14 avg | 0.00 | 0.00 | 0.00 | 5 |
## Citation
**BibTeX:**
```bibtex
@mastersthesis{bruton-galician-srl-23,
author = {Bruton, Micaella},
title = {BERTie Bott's Every Flavor Labels: A Tasty Guide to Developing a Semantic Role Labeling Model for Galician},
school = {Uppsala University},
year = {2023},
type = {Master's thesis},
}
```