---
language:
  - zh
tags:
  - bert
license: apache-2.0
---

Please use the BERT-related classes (e.g. `BertTokenizer`, `BertModel`) to load this model.

## MiniRBT: a small Chinese pre-trained model

To further accelerate Chinese natural language processing, we released MiniRBT, a small Chinese pre-trained model built with our self-developed knowledge distillation toolkit TextBrewer, combining Whole Word Masking with knowledge distillation.

This repository is developed based on: https://github.com/iflytek/ta-minilm-demo

You may also be interested in:

More resources by HFL: https://github.com/iflytek/HFL-Anthology