---
language:
- zh
license: "apache-2.0"
---
### LERT
LERT is a linguistically-motivated pre-trained language model.
For further information, please see: https://github.com/ymcui/LERT/blob/main/README_EN.md
- **LERT: A Linguistically-motivated Pre-trained Language Model**
- *Yiming Cui, Wanxiang Che, Shijin Wang, Ting Liu*
- Paper link: https://arxiv.org/abs/2211.05344
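
### Usage
A minimal usage sketch, assuming the checkpoint is published under the repo id `hfl/chinese-lert-small` (inferred from this model card's title) and follows the standard BERT architecture, so it can be loaded with the regular `BertTokenizer`/`BertModel` classes from the `transformers` library:

```python
# Sketch only: repo id "hfl/chinese-lert-small" is assumed from the model card title.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("hfl/chinese-lert-small")
model = BertModel.from_pretrained("hfl/chinese-lert-small")

# Encode a short Chinese sentence and run a forward pass without gradients.
inputs = tokenizer("哈尔滨工业大学", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Last-layer hidden states: (batch_size, sequence_length, hidden_size)
print(outputs.last_hidden_state.shape)
```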