ZWJYYC committed
Commit 37bd9e3
Parent: d018a55

Update README.md

Files changed (1):
  README.md +3 -0
README.md CHANGED
@@ -1,3 +1,6 @@
 ---
 license: other
 ---
+Pre-trained language models (PLMs) have achieved great success in natural language processing. Most PLMs follow BERT's default architecture hyper-parameter settings (e.g., the hidden dimension is a quarter of the intermediate dimension in the feed-forward sub-networks). In this paper, we adopt one-shot Neural Architecture Search (NAS) to automatically search architecture hyper-parameters for efficient pre-trained language models (at least 6x faster than BERT-base).
+
+AutoTinyBERT provides a model zoo that can meet different latency requirements.
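For context, the "default setting" the new abstract refers to is visible in the stock `BertConfig` that ships with Hugging Face `transformers`, where the feed-forward intermediate dimension is four times the hidden dimension. A minimal sketch (not part of this commit):

```python
# BERT-base defaults in transformers: the hidden dimension is a quarter
# of the feed-forward intermediate dimension, as the abstract notes.
from transformers import BertConfig

cfg = BertConfig()  # BERT-base defaults
print(cfg.hidden_size)        # 768
print(cfg.intermediate_size)  # 3072 == 4 * 768
```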
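The model-zoo line suggests the usual `transformers` loading path for picking a checkpoint that fits a latency budget. A minimal sketch, assuming the zoo checkpoints are BERT-compatible and using a hypothetical repo id (the actual zoo entries are not listed in this commit):

```python
# Hypothetical loading of one AutoTinyBERT zoo checkpoint; the repo id
# below is an assumption for illustration, not confirmed by this commit.
from transformers import AutoModel, AutoTokenizer

repo_id = "huawei-noah/AutoTinyBERT-S1"  # hypothetical zoo entry
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModel.from_pretrained(repo_id)

inputs = tokenizer("AutoTinyBERT trades capacity for latency.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, seq_len, hidden_dim)
```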