---
language:
  - zh
tags:
  - bert
license: apache-2.0
---

Please use the BERT-related classes (rather than RoBERTa-related ones) to load this model.
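A minimal loading sketch with the Hugging Face `transformers` library, assuming the model ID `hfl/minirbt-h256` (adjust to your local path or the exact repository ID if it differs):

```python
from transformers import BertTokenizer, BertModel

# Assumed model ID based on this repository's name.
model_id = "hfl/minirbt-h256"

tokenizer = BertTokenizer.from_pretrained(model_id)
model = BertModel.from_pretrained(model_id)

# Encode a short Chinese sentence and run a forward pass.
inputs = tokenizer("哈工大讯飞联合实验室", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch_size, seq_len, hidden_size=256)
```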

## Chinese Small Pre-trained Model: MiniRBT

To further promote research and development in Chinese information processing, we have released MiniRBT, a small Chinese pre-trained model built with our self-developed knowledge distillation tool TextBrewer, combining Whole Word Masking with knowledge distillation.

This repository is developed based on: https://github.com/iflytek/MiniRBT

You may also be interested in:

More resources by HFL: https://github.com/iflytek/HFL-Anthology