balhafni committed on
Commit f7f7b76
1 Parent(s): e1dc8ff

Updating README

Files changed (1): README.md (+13, -1)
README.md CHANGED
@@ -10,7 +10,7 @@ widget:
  **CAMeLBERT MSA NER Model** is a Named Entity Recognition (NER) model that was built by fine-tuning the [CAMeLBERT Modern Standard Arabic (MSA)](https://huggingface.co/CAMeL-Lab/bert-base-arabic-camelbert-msa/) model. For the fine-tuning, we used the [ANERcorp](https://camel.abudhabi.nyu.edu/anercorp/) dataset. Our fine-tuning procedure and the hyperparameters we used can be found in our paper *"[The Interplay of Variant, Size, and Task Type in Arabic Pre-trained Language Models](https://arxiv.org/abs/2103.06678)."* Our fine-tuning code can be found [here](https://github.com/CAMeL-Lab/CAMeLBERT).
 
  ## Intended uses
- You can use the CAMeLBERT MSA NER Model directly as part of the transformers pipeline.
+ You can use the CAMeLBERT MSA NER Model directly as part of the transformers pipeline or as part of our [CAMeL Tools](https://github.com/CAMeL-Lab/camel_tools) NER component.
 
  #### How to use
  You can use this model directly with a pipeline to do NER:
@@ -43,8 +43,20 @@ You can use this model directly with a pipeline to do NER:
  'start': 50,
  'end': 57}]
  ```
+
+ Here is how to use this model with our CAMeL Tools toolkit:
+ ```python
+ >>> from camel_tools.ner import NERecognizer
+ >>> ner = NERecognizer('CAMeL-Lab/bert-base-arabic-camelbert-msa-ner')
+ >>> sentence = 'إمارة أبوظبي هي إحدى إمارات دولة الإمارات العربية المتحدة السبع'.split()
+ >>> ner.predict_sentence(sentence)
+ >>> ['O', 'B-LOC', 'O', 'O', 'O', 'O', 'B-LOC', 'I-LOC', 'I-LOC', 'O']
+ ```
+
  *Note*: to download our models, you would need `transformers>=3.5.0`. Otherwise, you could download the models
 
+
+
  ## Citation
  ```bibtex
  @inproceedings{inoue-etal-2021-interplay,
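The `NERecognizer.predict_sentence` call added in this commit returns one BIO label per whitespace token. As a hedged sketch of how such output can be consumed (plain Python, no model download; the tokens and labels are copied verbatim from the example in the diff, and `bio_to_entities` is a hypothetical helper, not part of CAMeL Tools):

```python
# Tokens and per-token BIO labels, copied from the NERecognizer example in the diff.
tokens = 'إمارة أبوظبي هي إحدى إمارات دولة الإمارات العربية المتحدة السبع'.split()
labels = ['O', 'B-LOC', 'O', 'O', 'O', 'O', 'B-LOC', 'I-LOC', 'I-LOC', 'O']

def bio_to_entities(tokens, labels):
    """Collect (label, text) entity spans from parallel token/BIO-label lists.

    A 'B-' label opens a new entity; an 'I-' label extends the most recent one.
    This is an illustrative helper, not a CAMeL Tools API.
    """
    entities = []
    for tok, lab in zip(tokens, labels):
        if lab.startswith('B-'):
            entities.append([lab[2:], [tok]])
        elif lab.startswith('I-') and entities:
            entities[-1][1].append(tok)
    return [(label, ' '.join(words)) for label, words in entities]

print(bio_to_entities(tokens, labels))
# → [('LOC', 'أبوظبي'), ('LOC', 'الإمارات العربية المتحدة')]
```

The two recovered LOC entities ("Abu Dhabi" and "the United Arab Emirates") match the B-/I- groupings visible in the example's label list.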