
distilbert-finetuned-ner

This model is a fine-tuned version of distilbert-base-cased on the CoNLL-2003 dataset and is ready to use for named entity recognition (NER).

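The model predicts the standard CoNLL-2003 entity types: persons (PER), organizations (ORG), locations (LOC), and miscellaneous names (MISC). As a quick check, you can inspect the label mapping stored in the checkpoint's configuration (a minimal sketch; the exact contents and ordering of the mapping shown in the comment are an assumption):

from transformers import AutoConfig

# Load only the configuration to inspect the label mapping
config = AutoConfig.from_pretrained("rasyosef/distilbert-finetuned-ner")
print(config.id2label)
# Expected to contain IOB2 tags such as 'O', 'B-PER', 'I-PER', 'B-ORG', 'I-ORG', ...
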
It achieves the following results on the evaluation set (train loss included for reference):

  • Train Loss: 0.031400
  • Validation Loss: 0.070702
  • Precision: 0.912730
  • Recall: 0.936385
  • F1: 0.924406
  • Accuracy: 0.983061

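Precision, recall, and F1 here are entity-level scores of the kind computed by libraries such as seqeval over IOB2 tag sequences. The card does not include the evaluation code, so the following is only a minimal sketch of how such metrics are typically computed; the label sequences are made up for illustration:

from seqeval.metrics import precision_score, recall_score, f1_score

# Hypothetical gold and predicted label sequences in IOB2 format
y_true = [["B-PER", "I-PER", "O", "B-ORG", "I-ORG", "O", "B-LOC"]]
y_pred = [["B-PER", "I-PER", "O", "B-ORG", "O", "O", "B-LOC"]]

print(precision_score(y_true, y_pred))  # fraction of predicted entities that exactly match a gold entity
print(recall_score(y_true, y_pred))     # fraction of gold entities that were recovered
print(f1_score(y_true, y_pred))         # harmonic mean of precision and recall
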
How to use

You can use this model directly with a pipeline for token classification:

from transformers import pipeline

checkpoint = "rasyosef/distilbert-finetuned-ner"
# aggregation_strategy="simple" merges sub-word tokens into whole entity spans
token_classifier = pipeline("token-classification", model=checkpoint, aggregation_strategy="simple")
token_classifier("My name is Tony Stark and I work at Stark Industries in Los Angeles.")

Output:

[{'entity_group': 'PER',
  'score': 0.99873567,
  'word': 'Tony Stark',
  'start': 11,
  'end': 21},
 {'entity_group': 'ORG',
  'score': 0.998356,
  'word': 'Stark Industries',
  'start': 36,
  'end': 52},
 {'entity_group': 'LOC',
  'score': 0.9982861,
  'word': 'Los Angeles',
  'start': 56,
  'end': 67}]
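
If you prefer not to use the pipeline helper, you can load the tokenizer and model directly. The sketch below is a minimal alternative: it takes a per-token argmax over the logits and does not merge sub-word pieces into entity spans the way aggregation_strategy="simple" does.

import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

checkpoint = "rasyosef/distilbert-finetuned-ner"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForTokenClassification.from_pretrained(checkpoint)

text = "My name is Tony Stark and I work at Stark Industries in Los Angeles."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Map each token's highest-scoring class id back to its label name
predicted_ids = logits.argmax(dim=-1)[0]
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
for token, pred_id in zip(tokens, predicted_ids):
    print(token, model.config.id2label[pred_id.item()])
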
Model size: 65.2M parameters (F32, stored in Safetensors format)
Training dataset: CoNLL-2003