DistilBERT base uncased, fine-tuned for NER on the conll03 English dataset. Note that this model is not case-sensitive: "english" is treated the same as "English". For the case-sensitive version, please use elastic/distilbert-base-cased-finetuned-conll03-english.
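A minimal usage sketch with the transformers token-classification pipeline; the repo id `51la5/distilbert-base-NER` is taken from this page, and the sample sentence is illustrative.

```python
from transformers import pipeline

# aggregation_strategy="simple" merges B-/I- word pieces into whole entities
ner = pipeline(
    "token-classification",
    model="51la5/distilbert-base-NER",
    aggregation_strategy="simple",
)

for entity in ner("My name is Clara and I live in Berkeley."):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```

Each result dict carries the entity group (PER, ORG, LOC, MISC), the matched text span, and a confidence score.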
Versions
- Transformers version: 4.3.1
- Datasets version: 1.3.0
Training
```shell
$ run_ner.py \
  --model_name_or_path distilbert-base-uncased \
  --label_all_tokens True \
  --return_entity_level_metrics True \
  --dataset_name conll2003 \
  --output_dir /tmp/distilbert-base-uncased-finetuned-conll03-english \
  --do_train \
  --do_eval
```
After training, we update the labels to match the NER-specific labels of the conll2003 dataset.
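A sketch of that label update, assuming the standard conll2003 NER tag order from the datasets library; the checkpoint path and exact config handling are illustrative, not the authors' script.

```python
# Standard NER tag order of the Hugging Face conll2003 dataset (assumption).
LABELS = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG",
          "B-LOC", "I-LOC", "B-MISC", "I-MISC"]

id2label = {i: label for i, label in enumerate(LABELS)}
label2id = {label: i for i, label in enumerate(LABELS)}

# Applied to the fine-tuned checkpoint (path as in the training command above):
# from transformers import AutoConfig
# config = AutoConfig.from_pretrained("/tmp/distilbert-base-uncased-finetuned-conll03-english")
# config.id2label, config.label2id = id2label, label2id
# config.save_pretrained("/tmp/distilbert-base-uncased-finetuned-conll03-english")
```

Without this step the config would expose generic `LABEL_0` … `LABEL_8` names instead of the NER tags.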
Evaluation results
- Accuracy on conll2003 validation set (self-reported): 0.985
- Precision on conll2003 validation set (self-reported): 0.988
- Recall on conll2003 validation set (self-reported): 0.990
- F1 on conll2003 validation set (self-reported): 0.989
- Loss on conll2003 validation set (self-reported): 0.067
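As a quick sanity check, the reported F1 is consistent with the reported precision and recall, since F1 is their harmonic mean:

```python
precision, recall = 0.988, 0.990  # self-reported values above

# F1 = harmonic mean of precision and recall
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 3))  # → 0.989
```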