princeton-nlp committed on
Commit 890fc0c · 1 Parent(s): c442919
Create README.md

README.md ADDED
---
inference: false
---
This is a model checkpoint for ["Should You Mask 15% in Masked Language Modeling?"](https://arxiv.org/abs/2202.08005) [(code)](https://github.com/princeton-nlp/DinkyTrain.git). We use pre-layer normalization, which is not supported by the HuggingFace Transformers library. To use our model, clone our [GitHub repo](https://github.com/princeton-nlp/DinkyTrain.git) and import the RoBERTa classes from `huggingface/modeling_roberta_prelayernorm.py`. For example:
```python
from huggingface.modeling_roberta_prelayernorm import RobertaForMaskedLM, RobertaForSequenceClassification
```
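
As a rough usage sketch (assuming you run it from the root of the cloned DinkyTrain repo so that the `huggingface` package is importable, that the pre-layer-norm classes expose the standard HuggingFace `from_pretrained` interface, and that the placeholder model ID below is replaced with this checkpoint's ID on the Hub), you can load and query the model like this:

```python
import torch
from transformers import AutoTokenizer

# Pre-layer-norm RoBERTa classes from the cloned DinkyTrain repo
# (run from the repo root so `huggingface/` is on the import path).
from huggingface.modeling_roberta_prelayernorm import RobertaForMaskedLM

model_id = "princeton-nlp/<this-checkpoint>"  # placeholder: substitute this repo's Hub ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = RobertaForMaskedLM.from_pretrained(model_id)
model.eval()

# Predict the masked token in a short example sentence.
inputs = tokenizer("The capital of France is <mask>.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
print(tokenizer.decode(logits[0, mask_pos].argmax(dim=-1)))
```

`RobertaForSequenceClassification` can be loaded the same way when fine-tuning on classification tasks.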