Go Inoue committed on
Commit 7e054be
1 Parent(s): 917be09
Files changed (1)
  1. README.md +5 -5
README.md CHANGED
@@ -36,7 +36,7 @@ We release our fine-tuninig code [here](https://github.com/CAMeL-Lab/CAMeLBERT).
 You can use this model directly with a pipeline for masked language modeling:
 ```python
 >>> from transformers import pipeline
->>> unmasker = pipeline('fill-mask', model='bert-base-camelbert-ca')
+>>> unmasker = pipeline('fill-mask', model='CAMeL-Lab/bert-base-camelbert-ca')
 >>> unmasker("الهدف من الحياة هو [MASK] .")
 [{'sequence': '[CLS] الهدف من الحياة هو الحياة. [SEP]',
   'score': 0.11048116534948349,
@@ -63,8 +63,8 @@ You can use this model directly with a pipeline for masked language modeling:
 Here is how to use this model to get the features of a given text in PyTorch:
 ```python
 from transformers import AutoTokenizer, AutoModel
-tokenizer = AutoTokenizer.from_pretrained('bert-base-camelbert-ca')
-model = AutoModel.from_pretrained('bert-base-camelbert-ca')
+tokenizer = AutoTokenizer.from_pretrained('CAMeL-Lab/bert-base-camelbert-ca')
+model = AutoModel.from_pretrained('CAMeL-Lab/bert-base-camelbert-ca')
 text = "مرحبا يا عالم."
 encoded_input = tokenizer(text, return_tensors='pt')
 output = model(**encoded_input)
@@ -73,8 +73,8 @@ output = model(**encoded_input)
 and in TensorFlow:
 ```python
 from transformers import AutoTokenizer, TFAutoModel
-tokenizer = AutoTokenizer.from_pretrained('bert-base-camelbert-ca')
-model = TFAutoModel.from_pretrained('bert-base-camelbert-ca')
+tokenizer = AutoTokenizer.from_pretrained('CAMeL-Lab/bert-base-camelbert-ca')
+model = TFAutoModel.from_pretrained('CAMeL-Lab/bert-base-camelbert-ca')
 text = "مرحبا يا عالم."
 encoded_input = tokenizer(text, return_tensors='tf')
 output = model(encoded_input)
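
Put together, the updated snippets from this commit amount to the minimal sketch below. It simply mirrors the README code with the new `CAMeL-Lab/bert-base-camelbert-ca` identifier, assuming `transformers` with a PyTorch backend is installed; printed scores and shapes depend on the installed versions.

```python
# Minimal sketch combining the updated README examples (assumes transformers + torch).
from transformers import pipeline, AutoTokenizer, AutoModel

# Masked language modeling with the updated model identifier.
unmasker = pipeline('fill-mask', model='CAMeL-Lab/bert-base-camelbert-ca')
predictions = unmasker("الهدف من الحياة هو [MASK] .")
for p in predictions:
    # Each prediction is a dict with 'sequence', 'score', 'token', and 'token_str'.
    print(p['token_str'], p['score'])

# Feature extraction in PyTorch with the same identifier.
tokenizer = AutoTokenizer.from_pretrained('CAMeL-Lab/bert-base-camelbert-ca')
model = AutoModel.from_pretrained('CAMeL-Lab/bert-base-camelbert-ca')
encoded_input = tokenizer("مرحبا يا عالم.", return_tensors='pt')
output = model(**encoded_input)
print(output.last_hidden_state.shape)
```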