
This is a model checkpoint for the paper "Should You Mask 15% in Masked Language Modeling?".

The original checkpoint is available at princeton-nlp/efficient_mlm_m0.40. Unfortunately, that checkpoint depends on code that is not part of the official transformers library. Additionally, the checkpoint contains unused weights due to a bug.

This checkpoint fixes the unused-weights issue and uses the RobertaPreLayerNorm model classes from the transformers library, so no custom code is required.
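A minimal loading sketch, assuming a transformers version that includes the RobertaPreLayerNorm classes (4.25 or later); the example fill-mask sentence is illustrative only:

```python
# Sketch: load this checkpoint with the stock RobertaPreLayerNorm classes
# from Hugging Face transformers (no custom model code needed).
from transformers import AutoTokenizer, RobertaPreLayerNormForMaskedLM

model_id = "andreasmadsen/efficient_mlm_m0.40"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = RobertaPreLayerNormForMaskedLM.from_pretrained(model_id)

# Score candidates for the masked position (illustrative sentence).
inputs = tokenizer("The capital of France is <mask>.", return_tensors="pt")
outputs = model(**inputs)
# outputs.logits has shape (batch, sequence_length, vocab_size)
```

Because the weights were converted to the standard architecture, `AutoModelForMaskedLM.from_pretrained(model_id)` should resolve to the same class via the checkpoint's config.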

Model tree for andreasmadsen/efficient_mlm_m0.40
