| Feature | Description |
| --- | --- |
| **Name** | `es_roberta_base_bne_leetspeak_ner` |
| **Version** | `0.0.0` |
| **spaCy** | `>=3.2.1,<3.3.0` |
| **Default Pipeline** | `transformer`, `ner` |
| **Components** | `transformer`, `ner` |
| **Vectors** | 0 keys, 0 unique vectors (0 dimensions) |
| **Sources** | PlanTL-GOB-ES/roberta-base-bne model, a transformer-based masked language model for Spanish pre-trained on a total of 570 GB of clean, deduplicated text compiled from web crawls performed by the National Library of Spain (Biblioteca Nacional de España); LeetSpeak-NER app, where this model is in production for countering information disorders |
| **License** | Apache 2.0 |
| **Author** | Álvaro Huertas García at AI+DA |

### Label Scheme

4 labels for 1 component:

| Component | Labels |
| --- | --- |
| `ner` | `INV_CAMO`, `LEETSPEAK`, `MIX`, `PUNCT_CAMO` |
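Assuming the pipeline has been installed as a Python package (so that `spacy.load` can resolve it by name), a minimal usage sketch looks like this; the input text below is a hypothetical example of camouflaged content:

```python
import spacy

# Assumes the packaged pipeline is installed locally; otherwise
# spacy.load will raise an OSError.
nlp = spacy.load("es_roberta_base_bne_leetspeak_ner")

# Hypothetical leetspeak-style camouflaged input.
doc = nlp("v4cuna es un 3j3mpl0 de t3xt0 camuflado")

# Each detected entity span carries one of the four labels:
# INV_CAMO, LEETSPEAK, MIX, PUNCT_CAMO.
for ent in doc.ents:
    print(ent.text, ent.start_char, ent.end_char, ent.label_)
```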

### Accuracy

| Type | Score |
| --- | --- |
| `ENTS_F` | 91.82 |
| `ENTS_P` | 89.79 |
| `ENTS_R` | 93.94 |
| `TRANSFORMER_LOSS` | 166484.92 |
| `NER_LOSS` | 318457.35 |
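As a quick sanity check on the table above, `ENTS_F` is the harmonic mean (F1) of the precision `ENTS_P` and recall `ENTS_R`, which a few lines of Python reproduce:

```python
def f1(precision: float, recall: float) -> float:
    # Harmonic mean of precision and recall.
    return 2 * precision * recall / (precision + recall)

# Reported ENTS_P and ENTS_R from the accuracy table.
print(round(f1(89.79, 93.94), 2))  # → 91.82, matching ENTS_F
```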