---
language: zh
license: other
---

## bert-chinese-homie-large

This is a Chinese BERT model pre-trained on a large-scale corpus. It is suitable for fine-tuning on specific downstream tasks, or as parameter initialization for further pre-training, which can improve performance. Because of heavy training tweaks ("alchemy," in Chinese ML slang), it is not suited to direct fill-mask use unless you first perform a small amount of continued pre-training.
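
Since the card recommends fine-tuning rather than direct fill-mask use, a minimal sketch of loading this checkpoint for a downstream task might look like the following. The repository ID and the label count are placeholders (assumptions), not values confirmed by this card; substitute the model's actual Hub ID and your task's label count.

```python
# Minimal sketch: load this checkpoint for downstream fine-tuning.
# NOTE: "bert-chinese-homie-large" is a placeholder repo ID (assumption);
# replace it with the model's actual Hugging Face Hub ID.
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "bert-chinese-homie-large"  # placeholder (assumption)
tokenizer = AutoTokenizer.from_pretrained(model_id)
# num_labels is task-specific; 2 is an arbitrary example value.
model = AutoModelForSequenceClassification.from_pretrained(model_id, num_labels=2)

inputs = tokenizer("这是一个中文句子。", return_tensors="pt")
outputs = model(**inputs)
print(outputs.logits.shape)  # torch.Size([1, 2]) before any fine-tuning
```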

I don't know exactly what "homie" means, but people call me that, and I can feel the meaning. I find this an interesting natural language phenomenon.

## Bibtex entry and citation info
Please cite this work if you find it helpful.
```bibtex
@article{zhu2023metaaid,
  title={MetaAID 2.0: An Extensible Framework for Developing Metaverse Applications via Human-controllable Pre-trained Models},
  author={Zhu, Hongyin},
  journal={arXiv preprint arXiv:2302.13173},
  year={2023}
}
```
