| Feature | Description |
| --- | --- |
| **Name** | `es_roberta_base_bne_leetspeak_ner` |
| **Version** | `0.0.0` |
| **spaCy** | `>=3.2.1,<3.3.0` |
| **Default Pipeline** | `transformer`, `ner` |
| **Components** | `transformer`, `ner` |
| **Vectors** | 0 keys, 0 unique vectors (0 dimensions) |
| **Sources** | PlanTL-GOB-ES/roberta-base-bne: a transformer-based masked language model for Spanish, pre-trained on a total of 570 GB of clean and deduplicated text compiled from web crawls performed by the National Library of Spain (Biblioteca Nacional de España) <br /> LeetSpeak-NER: the app where this model runs in production for countering information disorders |
| **License** | Apache 2.0 |
| **Author** | Álvaro Huertas García at AI+DA |
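
The pipeline above (`transformer` followed by `ner`) can be used like any packaged spaCy model. Below is a minimal usage sketch, assuming the `es_roberta_base_bne_leetspeak_ner` package has been installed into the environment; `extract_entities` and the sample text are illustrative, and the model's entity label set is not listed in this table.

```python
import spacy

# Package name taken from the model-card table above
MODEL = "es_roberta_base_bne_leetspeak_ner"

def extract_entities(text: str, model: str = MODEL):
    """Run the transformer+ner pipeline and return (span text, label) pairs."""
    nlp = spacy.load(model)  # requires the model package to be pip-installed
    doc = nlp(text)
    return [(ent.text, ent.label_) for ent in doc.ents]

if __name__ == "__main__":
    # Hypothetical input; actual labels depend on the model's training data
    for span, label in extract_entities("hola, este es un texto de ejemplo"):
        print(span, label)
```

Since the package bundles its own `transformer` component, no separate vectors are needed, which is consistent with the `0 keys, 0 unique vectors` row above.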