Pre-trained language models (PLMs) have achieved great success in natural language processing. Most PLMs follow BERT's default architecture hyper-parameter settings (e.g., the hidden dimension is a quarter of the intermediate dimension in the feed-forward sub-networks). In this paper, we adopt one-shot Neural Architecture Search (NAS) to automatically search the architecture hyper-parameters of efficient pre-trained language models that are at least 6x faster than BERT-base.

AutoTinyBERT provides a model zoo that can meet different latency requirements.
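
Since the zoo contains standard Transformer encoders with searched hyper-parameters, one natural way to try a variant is through the Hugging Face `transformers` API. The sketch below is an assumption rather than documented usage for this repository: the repo id is a placeholder, and it presumes the checkpoint is exported in a `transformers`-compatible format.

```python
# Minimal sketch: load one AutoTinyBERT variant from the zoo, assuming a
# transformers-compatible checkpoint. The repo id is a PLACEHOLDER; pick the
# variant that matches your latency budget.
from transformers import AutoTokenizer, AutoModel

repo_id = "<org>/<autotinybert-variant>"  # hypothetical, substitute a real repo id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModel.from_pretrained(repo_id)

inputs = tokenizer("AutoTinyBERT trades capacity for latency.", return_tensors="pt")
outputs = model(**inputs)
# The last dimension reflects the searched hidden size of the chosen variant.
print(outputs.last_hidden_state.shape)
```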