princeton-nlp committed
Commit b906e1b
1 Parent(s): 96bce0c

Create README.md

Files changed (1)
  1. README.md +8 -0
README.md ADDED
@@ -0,0 +1,8 @@
+ ---
+ inference: false
+ ---
+ This is a model checkpoint for ["Should You Mask 15% in Masked Language Modeling"](https://arxiv.org/abs/2202.08005) [(code)](https://github.com/princeton-nlp/DinkyTrain.git). We use pre-layer normalization, which the HuggingFace Transformers RoBERTa implementation does not support. To use our model, clone our [GitHub repo](https://github.com/princeton-nlp/DinkyTrain.git) and import the RoBERTa classes from `huggingface/modeling_roberta_prelayernorm.py`. For example:
+
+ ```python
+ from huggingface.modeling_roberta_prelayernorm import RobertaForMaskedLM, RobertaForSequenceClassification
+ ```
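
Below is a minimal usage sketch (not part of the commit above). It assumes the DinkyTrain code is importable and that the checkpoint loads through the standard `from_pretrained()` interface; `path/to/checkpoint` is a placeholder, not a path taken from this repo.

```python
# Minimal usage sketch, assuming the DinkyTrain repo is on the Python path and
# the downloaded checkpoint follows the usual HuggingFace from_pretrained() layout.
# "path/to/checkpoint" is a placeholder, not an identifier from this commit.
import torch
from transformers import AutoTokenizer
from huggingface.modeling_roberta_prelayernorm import RobertaForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("path/to/checkpoint")
model = RobertaForMaskedLM.from_pretrained("path/to/checkpoint")
model.eval()

# Score the masked position in a simple cloze sentence.
inputs = tokenizer("Paris is the <mask> of France.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # (batch, seq_len, vocab_size)

mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero()[0, 1]
predicted_id = logits[0, mask_pos].argmax().item()
print(tokenizer.decode([predicted_id]))
```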