Tirendaz committed
Commit 874c5b1
Parent: 271d883

Update README.md

Files changed (1): README.md (+16, -10)
README.md CHANGED
@@ -1,8 +1,6 @@
 ---
 license: mit
 base_model: xlm-roberta-base
-tags:
-- generated_from_trainer
 datasets:
 - xtreme
 metrics:
@@ -35,6 +33,8 @@ model-index:
 - name: Accuracy
   type: accuracy
   value: 0.9194332683336213
+language:
+- en
 ---
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
@@ -50,19 +50,25 @@ It achieves the following results on the evaluation set:
 - F1: 0.8057
 - Accuracy: 0.9194
 
-## Model description
-
-More information needed
-
-## Intended uses & limitations
-
-More information needed
-
-## Training and evaluation data
-
-More information needed
-
-## Training procedure
+## Intended uses & limitations
+
+#### How to use
+
+You can use this model with the Transformers *pipeline* for NER.
+
+```python
+from transformers import AutoTokenizer, AutoModelForTokenClassification
+from transformers import pipeline
+
+tokenizer = AutoTokenizer.from_pretrained("dslim/bert-base-NER")
+model = AutoModelForTokenClassification.from_pretrained("dslim/bert-base-NER")
+
+nlp = pipeline("ner", model=model, tokenizer=tokenizer)
+example = "My name is Wolfgang and I live in Berlin"
+
+ner_results = nlp(example)
+print(ner_results)
+```
 
 ### Training hyperparameters
 
@@ -91,4 +97,4 @@ The following hyperparameters were used during training:
 - Transformers 4.33.0
 - Pytorch 2.0.0
 - Datasets 2.1.0
-- Tokenizers 0.13.3
+- Tokenizers 0.13.3
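
The snippet added in this commit runs the token-classification pipeline with its default settings, so predictions come back per sub-word token. Below is a minimal sketch of the same call with entity grouping enabled; `aggregation_strategy` is a standard option of the Transformers token-classification pipeline rather than anything this commit specifies, and the checkpoint id is simply the one used in the snippet above (swap in the fine-tuned xlm-roberta-base checkpoint's own repo id if it differs).

```python
from transformers import AutoModelForTokenClassification, AutoTokenizer, pipeline

# Checkpoint id copied from the snippet added in this commit; replace it with
# the fine-tuned model's own repo id if that is what you want to run.
checkpoint = "dslim/bert-base-NER"

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForTokenClassification.from_pretrained(checkpoint)

# aggregation_strategy="simple" merges sub-word predictions into whole entities,
# so "Wolfgang" is returned as a single PER span instead of several token hits.
ner = pipeline("ner", model=model, tokenizer=tokenizer, aggregation_strategy="simple")

print(ner("My name is Wolfgang and I live in Berlin"))
```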