asier-gutierrez committed
Commit: 2aceba4
1 Parent(s): edd990b
Update README.md
README.md CHANGED

@@ -21,7 +21,7 @@ RoBERTa-large-bne is a transformer-based masked language model for the Spanish l
 Original pre-trained model can be found here: https://huggingface.co/BSC-TeMU/roberta-large-bne
 
 ## Dataset
-The dataset used is the
+The dataset used is the [SQAC corpus](https://huggingface.co/datasets/BSC-TeMU/SQAC).
 
 ## Evaluation and results
 F1 Score: 0.7993 (average of 5 runs).
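
For context, the README shown in this diff points to the original pre-trained checkpoint at https://huggingface.co/BSC-TeMU/roberta-large-bne. Below is a minimal sketch of querying that masked language model with the Hugging Face `transformers` fill-mask pipeline; the example sentence and the pipeline-based usage are illustrative assumptions, not part of the commit itself.

```python
# Hedged sketch: load the original pre-trained checkpoint referenced in the
# README and query it as a masked language model. The prompt is illustrative.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="BSC-TeMU/roberta-large-bne")

# Use the tokenizer's own mask token so the prompt matches the model's vocabulary.
prompt = f"Madrid es la capital de {fill_mask.tokenizer.mask_token}."

for prediction in fill_mask(prompt):
    print(prediction["token_str"], round(prediction["score"], 4))
```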