---
language:
- en
thumbnail: url to a thumbnail used in social sharing
tags:
- token classification
license: cc
datasets:
- conll2003
model-index:
- name: sarahmiller137/distilbert-base-uncased-ft-conll2003
  results:
  - task:
      type: token-classification
      name: Token Classification
    dataset:
      name: conll2003
      type: conll2003
      config: conll2003
      split: test
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.9750189904012154
      verified: true
    - name: Precision
      type: precision
      value: 0.9802152215150602
      verified: true
    - name: Recall
      type: recall
      value: 0.9803021169462076
      verified: true
    - name: F1
      type: f1
      value: 0.9802586673049137
      verified: true
    - name: Loss
      type: loss
      value: 0.10723897069692612
      verified: true
---

## Model information
distilbert-base-uncased model fine-tuned for token classification on the conll2003 dataset from the 🤗 datasets library.

## Intended uses & limitations
This model is intended to be used for named entity recognition tasks. The model identifies entities of persons, locations, organisations, and miscellaneous, predicting labels according to the CoNLL-2003 annotation scheme.
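
The label set the model predicts can be read directly from its configuration. A minimal sketch, assuming the fine-tuned checkpoint stores the standard CoNLL-2003 IOB2 tags (O plus B-/I- variants of PER, ORG, LOC, and MISC) in `id2label`:

```python
from transformers import AutoConfig

# Inspect the labels the fine-tuned model predicts (expected to be the
# standard CoNLL-2003 IOB2 tags, e.g. O, B-PER, I-PER, B-ORG, ...).
config = AutoConfig.from_pretrained("sarahmiller137/distilbert-base-uncased-ft-conll2003")
print(config.id2label)
```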

Note that the dataset and model may not be fully representative of or suitable for all needs. It is recommended that the dataset paper and the base model card are reviewed before using the model:
- [CoNLL-2003](https://aclanthology.org/W03-0419)
- [distilbert](https://huggingface.co/distilbert-base-uncased)


## How to use 
Load the tokenizer and model from the 🤗 Transformers library using the following checkpoint:
```python
from transformers import AutoTokenizer, AutoModelForTokenClassification

checkpoint = "sarahmiller137/distilbert-base-uncased-ft-conll2003"

# Load the tokenizer and the fine-tuned token-classification model
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForTokenClassification.from_pretrained(checkpoint)
```
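
For a quick end-to-end check, the checkpoint can also be wrapped in a `pipeline`; a minimal sketch, where the input sentence is illustrative only and not taken from the dataset:

```python
from transformers import pipeline

# Build a NER pipeline from the checkpoint; aggregation_strategy="simple"
# groups word pieces back into whole entity spans.
ner = pipeline(
    "token-classification",
    model="sarahmiller137/distilbert-base-uncased-ft-conll2003",
    aggregation_strategy="simple",
)

# Illustrative example sentence
print(ner("Sarah works for an organisation based in Glasgow."))
```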