---
language: en
tags:
- fill-mask
datasets:
- wikipedia
- bookcorpus
---

# 80% 1x4 Block Sparse BERT-Large (uncased) Prune OFA

This model was created using the Prune OFA method described in Prune Once for All: Sparse Pre-Trained Language Models, presented at the ENLSP NeurIPS Workshop 2021.

For further details on the model and its results, see our paper and our implementation available here.
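The "1x4 block sparse" in the model name means weights are pruned in contiguous groups of four along each row, rather than individually. A minimal numpy sketch of magnitude-based 1x4 block pruning at 80% sparsity (an illustration of the pattern, not the actual Prune OFA training procedure, which prunes gradually during pre-training):

```python
import numpy as np

def prune_1x4_blocks(weights: np.ndarray, sparsity: float = 0.8) -> np.ndarray:
    """Zero out the lowest-magnitude 1x4 blocks so `sparsity` of blocks are zero."""
    rows, cols = weights.shape
    assert cols % 4 == 0, "columns must be divisible by the block size"
    # View each row as consecutive blocks of 4 weights.
    blocks = weights.reshape(rows, cols // 4, 4)
    # Score each block by the sum of absolute weights it contains.
    scores = np.abs(blocks).sum(axis=-1)
    # Keep only the top (1 - sparsity) fraction of blocks by score.
    k = int(sparsity * scores.size)
    threshold = np.sort(scores, axis=None)[k]
    mask = (scores >= threshold)[..., None]  # broadcast over the 4 weights
    return (blocks * mask).reshape(rows, cols)

rng = np.random.default_rng(0)
w = rng.standard_normal((16, 64))
pruned = prune_1x4_blocks(w, sparsity=0.8)
# Fraction of zeroed 1x4 blocks, close to the requested 80%:
frac = (np.abs(pruned.reshape(16, 16, 4)).sum(axis=-1) == 0).mean()
print(frac)
```

This block structure is what lets sparse inference kernels skip whole groups of four multiply-accumulates at a time, unlike unstructured sparsity.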