Michael Beukman committed on
Commit 2006dfb
1 Parent(s): d0611eb

Slightly improved model card

Files changed (1)
  1. README.md +17 -1
README.md CHANGED
@@ -68,8 +68,24 @@ In general, this model performed worse on the 'date' category compared to others
  Here are some performance details on this specific model, compared to others we trained.
  All of these metrics were calculated on the test set, and the seed was chosen that gave the best overall F1 score. The first three result columns are averaged over all categories, and the latter 4 provide performance broken down by category.
 
 
- | Model Name | Staring point | Evaluation Language | F1 | Precision | Recall | F1 (DATE) | F1 (LOC) | F1 (ORG) | F1 (PER) |
+ These models can predict the following label for a token ([source](https://huggingface.co/Davlan/xlm-roberta-large-masakhaner)):
+
+ Abbreviation|Description
+ -|-
+ O|Outside of a named entity
+ B-DATE |Beginning of a DATE entity right after another DATE entity
+ I-DATE |DATE entity
+ B-PER |Beginning of a person’s name right after another person’s name
+ I-PER |Person’s name
+ B-ORG |Beginning of an organisation right after another organisation
+ I-ORG |Organisation
+ B-LOC |Beginning of a location right after another location
+ I-LOC |Location
+
+
+
+ | Model Name | Staring point | Evaluation / Fine-tune Language | F1 | Precision | Recall | F1 (DATE) | F1 (LOC) | F1 (ORG) | F1 (PER) |
  | -------------------------------------------------- | -------------------- | -------------------- | -------------- | -------------- | -------------- | -------------- | -------------- | -------------- | -------------- |
  | [xlm-roberta-base-finetuned-ner-igbo](https://huggingface.co/mbeukman/xlm-roberta-base-finetuned-ner-igbo) (This model) | [base](https://huggingface.co/xlm-roberta-base) | ibo | 86.06 | 85.20 | 86.94 | 76.00 | 86.00 | 90.00 | 87.00 |
  | [xlm-roberta-base-finetuned-igbo-finetuned-ner-igbo](https://huggingface.co/mbeukman/xlm-roberta-base-finetuned-igbo-finetuned-ner-igbo) | [ibo](https://huggingface.co/Davlan/xlm-roberta-base-finetuned-igbo) | ibo | 88.39 | 87.08 | 89.74 | 74.00 | 91.00 | 90.00 | 91.00 |
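As a small sanity check on the overall columns in the table above (not a check stated in the card or this commit), the overall F1 values are consistent with the harmonic mean of the reported overall precision and recall:

```python
# Sanity check: overall F1 should equal the harmonic mean of precision and recall.
def f1_score(precision: float, recall: float) -> float:
    return 2 * precision * recall / (precision + recall)

print(round(f1_score(85.20, 86.94), 2))  # 86.06 -> matches this model's overall F1
print(round(f1_score(87.08, 89.74), 2))  # 88.39 -> matches the igbo-starting-point model
```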
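As a rough illustration of how the B-/I- labels listed in the new abbreviation table surface at inference time, here is a minimal usage sketch (not part of this commit) using the `transformers` token-classification pipeline; the example sentence is a placeholder rather than curated Igbo text.

```python
from transformers import pipeline

# Minimal sketch, assuming `transformers` is installed and the model is publicly available.
model_name = "mbeukman/xlm-roberta-base-finetuned-ner-igbo"
ner = pipeline("token-classification", model=model_name)

# Placeholder sentence (assumption); substitute real Igbo text in practice.
sentence = "Chinua Achebe si Ogidi."
for prediction in ner(sentence):
    # Each token-level prediction carries one of the labels described above,
    # e.g. B-PER / I-PER for a person's name, B-LOC / I-LOC for a location.
    print(prediction["word"], prediction["entity"], round(prediction["score"], 3))
```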