---
language:
- zh
tags:
- bert
license: "apache-2.0"
---
# Please use BERT-related functions to load this model!
## Chinese small pre-trained model MiniRBT
To further accelerate Chinese natural language processing, we have launched MiniRBT, a small Chinese pre-trained model built with our self-developed knowledge distillation tool TextBrewer, combining Whole Word Masking with knowledge distillation.
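As a minimal sketch of loading the model with BERT-related classes from the `transformers` library (the Hub ID `hfl/minirbt-h256` is an assumption for illustration; substitute this repository's actual ID or a local checkpoint path):

```python
# Minimal sketch: loading MiniRBT via BERT classes in transformers.
# NOTE: "hfl/minirbt-h256" is an assumed repository ID for illustration;
# replace it with this model's actual Hub ID or a local path.
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("hfl/minirbt-h256")
model = BertModel.from_pretrained("hfl/minirbt-h256")

inputs = tokenizer("哈工大讯飞联合实验室", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, seq_len, hidden_size)
```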
This repository is developed based on: https://github.com/iflytek/ta-minilm-demo
You may also be interested in:
- Chinese LERT: https://github.com/ymcui/LERT
- Chinese PERT: https://github.com/ymcui/PERT
- Chinese MacBERT: https://github.com/ymcui/MacBERT
- Chinese ELECTRA: https://github.com/ymcui/Chinese-ELECTRA
- Chinese XLNet: https://github.com/ymcui/Chinese-XLNet
- Knowledge Distillation Toolkit - TextBrewer: https://github.com/airaria/TextBrewer
More resources by HFL: https://github.com/iflytek/HFL-Anthology