Feature | Description |
---|---|
Name | es_roberta_base_bne_leetspeak_ner |
Version | 0.0.0 |
spaCy | >=3.2.1,<3.3.0 |
Default Pipeline | transformer, ner |
Components | transformer, ner |
Vectors | 0 keys, 0 unique vectors (0 dimensions) |
Sources | PlanTL-GOB-ES/roberta-base-bne: a transformer-based masked language model for Spanish, pre-trained on a total of 570 GB of clean and deduplicated text compiled from web crawls performed by the National Library of Spain (Biblioteca Nacional de España). LeetSpeak-NER: the app where this model is in production for countering information disorders. |
License | Apache 2.0 |
Author | Álvaro Huertas García at AI+DA |
Label Scheme (4 labels for 1 component)

Component | Labels |
---|---|
ner | INV_CAMO, LEETSPEAK, MIX, PUNCT_CAMO |
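A minimal usage sketch, assuming spaCy and the pipeline package have been installed (the example input text is illustrative and not from this card; the guard simply skips loading when the packages are absent):

```python
import importlib.util

# Pipeline name and entity labels come from the tables above.
MODEL = "es_roberta_base_bne_leetspeak_ner"
LABELS = ("INV_CAMO", "LEETSPEAK", "MIX", "PUNCT_CAMO")

# Only attempt to load the pipeline when both spaCy and the model
# package are importable in the current environment.
if importlib.util.find_spec("spacy") and importlib.util.find_spec(MODEL):
    import spacy

    nlp = spacy.load(MODEL)
    doc = nlp("Ejemplo con t3xt0 c4mufl4d0")  # illustrative input
    for ent in doc.ents:
        # Every predicted entity should carry one of the four labels.
        assert ent.label_ in LABELS
        print(ent.text, ent.label_)
else:
    print("Pipeline not installed; label scheme:", ", ".join(LABELS))
```

The model package itself is distributed as an installable spaCy pipeline, so `spacy.load` takes the package name rather than a path.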
Accuracy

Type | Score |
---|---|
ENTS_F | 91.82 |
ENTS_P | 89.79 |
ENTS_R | 93.94 |
TRANSFORMER_LOSS | 166484.92 |
NER_LOSS | 318457.35 |
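As a quick consistency check, ENTS_F should be the harmonic mean of ENTS_P and ENTS_R, which spaCy reports for entity-level evaluation:

```python
# Precision and recall from the accuracy table above (percentages).
ents_p = 89.79
ents_r = 93.94

# Entity F-score is the harmonic mean of precision and recall.
ents_f = 2 * ents_p * ents_r / (ents_p + ents_r)
print(round(ents_f, 2))  # 91.82, matching the reported ENTS_F
```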
Evaluation results

- NER Precision (self-reported): 0.898
- NER Recall (self-reported): 0.939
- NER F Score (self-reported): 0.918