---
language:
- zh
license: "apache-2.0"
---
### LERT
LERT is a linguistically-motivated pre-trained language model.
For further details, see: https://github.com/ymcui/LERT/blob/main/README_EN.md
- **LERT: A Linguistically-motivated Pre-trained Language Model**
- *Yiming Cui, Wanxiang Che, Shijin Wang, Ting Liu*
- Paper link: https://arxiv.org/abs/2211.05344
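
Below is a minimal usage sketch for loading a LERT checkpoint with Hugging Face Transformers. The repository ID `hfl/chinese-lert-base` is an assumption based on the LERT project's released checkpoints; replace it with this card's actual model ID if it differs.

```python
# Minimal sketch: loading a LERT checkpoint with Hugging Face Transformers.
# NOTE: "hfl/chinese-lert-base" is an assumed repository ID, not necessarily
# the repository this card belongs to.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("hfl/chinese-lert-base")
model = AutoModel.from_pretrained("hfl/chinese-lert-base")

# Encode a Chinese sentence and obtain contextual embeddings.
inputs = tokenizer("哈工大讯飞联合实验室发布了LERT模型。", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# outputs.last_hidden_state has shape (batch_size, sequence_length, hidden_size)
print(outputs.last_hidden_state.shape)
```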