go-inoue committed
Commit
54e2905
1 Parent(s): 89a6ec2

Update README.md

Files changed (1)
  1. README.md +6 -5
README.md CHANGED
@@ -7,7 +7,10 @@ widget:
 ---
 # CAMeLBERT MSA NER Model
 ## Model description
-**CAMeLBERT MSA NER Model** is a Named Entity Recognition (NER) model that was built by fine-tuning the [CAMeLBERT Modern Standard Arabic (MSA)](https://huggingface.co/CAMeL-Lab/bert-base-arabic-camelbert-msa/) model. For the fine-tuning, we used the [ANERcorp](https://camel.abudhabi.nyu.edu/anercorp/) dataset. Our fine-tuning procedure and the hyperparameters we used can be found in our paper *"[The Interplay of Variant, Size, and Task Type in Arabic Pre-trained Language Models](https://arxiv.org/abs/2103.06678)."* Our fine-tuning code can be found [here](https://github.com/CAMeL-Lab/CAMeLBERT).
+**CAMeLBERT MSA NER Model** is a Named Entity Recognition (NER) model that was built by fine-tuning the [CAMeLBERT Modern Standard Arabic (MSA)](https://huggingface.co/CAMeL-Lab/bert-base-arabic-camelbert-msa/) model.
+For the fine-tuning, we used the [ANERcorp](https://camel.abudhabi.nyu.edu/anercorp/) dataset.
+Our fine-tuning procedure and the hyperparameters we used can be found in our paper *"[The Interplay of Variant, Size, and Task Type in Arabic Pre-trained Language Models](https://arxiv.org/abs/2103.06678)."*
+Our fine-tuning code can be found [here](https://github.com/CAMeL-Lab/CAMeLBERT).
 
 ## Intended uses
 You can use the CAMeLBERT MSA NER model directly as part of our [CAMeL Tools](https://github.com/CAMeL-Lab/camel_tools) NER component (*recommended*) or as part of the transformers pipeline.
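Between this hunk and the next, the diff elides the body of the CAMeL Tools example, leaving only its output line visible below. A minimal sketch of what that snippet plausibly looks like, reconstructed from the public `camel_tools` API, is given here; the Arabic sentence is an assumed stand-in chosen to be consistent with the ten BIO labels shown in the next hunk, not necessarily the README's exact example.

```python
# Hedged sketch of the elided CAMeL Tools NER usage (assumed, not the commit's exact code).
from camel_tools.ner import NERecognizer
from camel_tools.tokenizers.word import simple_word_tokenize

# Load the default pretrained NER model bundled with CAMeL Tools.
ner = NERecognizer.pretrained()

# Tokenize an example sentence; this sentence is a hypothetical stand-in.
sentence = simple_word_tokenize('إمارة أبوظبي هي إحدى إمارات دولة الإمارات العربية المتحدة السبع')

# predict_sentence() takes a list of tokens and returns one BIO label per token.
labels = ner.predict_sentence(sentence)
print(labels)
```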
@@ -23,7 +26,6 @@ To use the model with the [CAMeL Tools](https://github.com/CAMeL-Lab/camel_tools
 >>> ['O', 'B-LOC', 'O', 'O', 'O', 'O', 'B-LOC', 'I-LOC', 'I-LOC', 'O']
 ```
 
-
 You can also use the NER model directly with a transformers pipeline:
 ```python
 >>> from transformers import pipeline
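The pipeline construction and call are likewise elided between this hunk and the next, which resumes at the tail of the printed output. A hedged sketch of the missing step follows; the model ID is assumed from this repository's name, and the input sentence is again a hypothetical stand-in.

```python
# Hedged sketch of the elided pipeline usage; the model ID below is assumed
# from the repository name, not shown in the diff itself.
from transformers import pipeline

ner = pipeline('ner', model='CAMeL-Lab/bert-base-arabic-camelbert-msa-ner')

# Returns a list of dicts with 'entity', 'score', 'word', 'start', and 'end'
# keys; the 'start': 50, 'end': 57 fields in the next hunk are character
# offsets from output of this form.
print(ner('إمارة أبوظبي هي إحدى إمارات دولة الإمارات العربية المتحدة السبع'))
```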
@@ -54,9 +56,8 @@ You can also use the NER model directly with a transformers pipeline:
 'start': 50,
 'end': 57}]
 ```
-*Note*: to download our models, you would need `transformers>=3.5.0`. Otherwise, you could download the models manually.
-
-
+*Note*: to download our models, you need `transformers>=3.5.0`.
+Otherwise, you can download the models manually.
 
 ## Citation
 ```bibtex