davda54 committed · verified
Commit a7f1bf6 · 1 parent: adff286

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -38,7 +38,7 @@ This model uses a new tokenizer, specially trained on the target languages. Ther
 
  ## NorMistral-11b is also a bidirectional masked language model
 
- Having been pretrained on a mixed causal-masked objective, this model knows how to process texts bidirectionally. You can thus finetune this model like any other BERT-like model (or any other prefix language model). The model can also be used directly for masked language modeling:
+ Having been pretrained on a mixed causal-masked objective, this model knows how to process texts bidirectionally. You can thus finetune this model like any other BERT (or any other prefix language model). The model can also be used directly for masked language modeling:
 
  ```python
  from transformers import AutoTokenizer, AutoModelForCausalLM
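
For context, a minimal sketch of the masked-language-modeling usage that the changed paragraph describes. The hub id, the presence of a mask token in the tokenizer, and reading the prediction from the logits at the mask position itself are all assumptions made here for illustration; they are not taken from this commit, and the model card's own example remains authoritative.

```python
# Minimal sketch, not the model card's own example. Assumptions:
#  - "norallm/normistral-11b-warm" is the hub id (hypothetical here),
#  - the tokenizer defines a mask special token,
#  - the logits at the mask position score the masked token itself.
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "norallm/normistral-11b-warm"  # assumed hub id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

text = f"Nidarosdomen ligger i {tokenizer.mask_token}."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: [1, seq_len, vocab_size]

# Pick the most likely token at the (first) mask position.
mask_pos = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero()[0].item()
print(tokenizer.decode(logits[0, mask_pos].argmax(-1)))
```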