Update README.md
README.md CHANGED
@@ -38,7 +38,7 @@ This model uses a new tokenizer, specially trained on the target languages. Ther
 
 ## NorMistral-11b is also a bidirectional masked language model
 
-Having been pretrained on a mixed causal-masked objective, this model knows how to process texts bidirectionally. You can thus finetune this model like any other BERT
+Having been pretrained on a mixed causal-masked objective, this model knows how to process texts bidirectionally. You can thus finetune this model like any other BERT (or any other prefix language model). The model can also be used directly for masked language modeling:
 
 ```python
 from transformers import AutoTokenizer, AutoModelForCausalLM