---
language: 
- zh
tags:
- bert
license: "apache-2.0"
---

# Please use 'Bert'-related functions (e.g. `BertTokenizer`, `BertModel`) to load this model!
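
A minimal loading sketch with Hugging Face Transformers follows; the repo id `hfl/minirbt-h256` is an assumption here, so substitute this model card's actual id:

```python
from transformers import BertModel, BertTokenizer

# Assumed repo id; replace with this model card's actual id.
repo_id = "hfl/minirbt-h256"
tokenizer = BertTokenizer.from_pretrained(repo_id)
model = BertModel.from_pretrained(repo_id)

inputs = tokenizer("哈工大讯飞联合实验室", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, seq_len, hidden_size)
```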

## Chinese small pre-trained model MiniRBT
To further promote research and development in Chinese information processing, we release MiniRBT, a small Chinese pre-trained model built with our self-developed knowledge distillation toolkit TextBrewer, combining Whole Word Masking with knowledge distillation.
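
As a rough illustration of TextBrewer-style distillation, here is a minimal, self-contained sketch; the teacher/student configurations, layer mapping, and toy data are illustrative assumptions, not MiniRBT's actual training recipe:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset
from transformers import BertConfig, BertModel
from textbrewer import GeneralDistiller, TrainingConfig, DistillationConfig

# Toy, randomly initialized teacher and student; a real setup would load a
# trained teacher (e.g. a RoBERTa-wwm checkpoint) instead.
teacher = BertModel(BertConfig(output_hidden_states=True))
student = BertModel(BertConfig(hidden_size=256, num_hidden_layers=6,
                               num_attention_heads=4, intermediate_size=1024,
                               output_hidden_states=True))
teacher.eval()

def adaptor(batch, model_outputs):
    # Expose the features TextBrewer should match between teacher and student.
    return {"hidden": model_outputs.hidden_states}

# Toy unlabeled batch; in practice this is the large pre-training corpus.
input_ids = torch.randint(0, 21128, (8, 32))
dataloader = DataLoader(TensorDataset(input_ids), batch_size=4,
                        collate_fn=lambda b: {"input_ids": torch.stack([x[0] for x in b])})

distiller = GeneralDistiller(
    train_config=TrainingConfig(device="cpu"),
    distill_config=DistillationConfig(
        temperature=8,
        # Hypothetical mapping: match student layer 6 to teacher layer 12,
        # projecting 256-dim student states up to the teacher's 768 dims.
        intermediate_matches=[{"layer_T": 12, "layer_S": 6, "feature": "hidden",
                               "loss": "hidden_mse", "weight": 1,
                               "proj": ["linear", 256, 768]}]),
    model_T=teacher, model_S=student, adaptor_T=adaptor, adaptor_S=adaptor)

optimizer = torch.optim.AdamW(student.parameters(), lr=1e-4)
with distiller:
    distiller.train(optimizer, dataloader, num_epochs=1)
```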

This repository is developed based on: https://github.com/iflytek/MiniRBT

You may also be interested in:
- Chinese LERT: https://github.com/ymcui/LERT
- Chinese PERT: https://github.com/ymcui/PERT
- Chinese MacBERT: https://github.com/ymcui/MacBERT
- Chinese ELECTRA: https://github.com/ymcui/Chinese-ELECTRA
- Chinese XLNet: https://github.com/ymcui/Chinese-XLNet
- Knowledge Distillation Toolkit - TextBrewer: https://github.com/airaria/TextBrewer

More resources by HFL: https://github.com/iflytek/HFL-Anthology