---
license: apache-2.0
---

# Tibetan BERT Model

## Citation

Please cite our [paper](https://dl.acm.org/doi/10.1145/3548608.3559255) if you use this model:

```
@inproceedings{10.1145/3548608.3559255,
  author = {Zhang, Jiangyan and Kazhuo, Deji and Gadeng, Luosang and Trashi, Nyima and Qun, Nuo},
  title = {Research and Application of Tibetan Pre-Training Language Model Based on BERT},
  year = {2022},
  isbn = {9781450397179},
  publisher = {Association for Computing Machinery},
  address = {New York, NY, USA},
  url = {https://doi.org/10.1145/3548608.3559255},
  doi = {10.1145/3548608.3559255},
  abstract = {In recent years, pre-training language models have been widely used in the field of natural language processing, but the research on Tibetan pre-training language models is still in the exploratory stage. To promote the further development of Tibetan natural language processing and effectively solve the problem of the scarcity of Tibetan annotation data sets, the article studies the Tibetan pre-training language model based on BERT. First, given the characteristics of the Tibetan language, we constructed a data set for the BERT pre-training language model and downstream text classification tasks. Secondly, construct a small-scale Tibetan BERT pre-training language model to train it. Finally, the performance of the model was verified through the downstream task of Tibetan text classification, and an accuracy rate of 86\% was achieved on the task of text classification. Experiments show that the model we built has a significant effect on the task of Tibetan text classification.},
  booktitle = {Proceedings of the 2022 2nd International Conference on Control and Intelligent Robotics},
  pages = {519--524},
  numpages = {6},
  location = {Nanjing, China},
  series = {ICCIR '22}
}
```
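## Usage

A minimal usage sketch with the Hugging Face `transformers` library. The repo ID `metaphors/TibetanBERT` below is a placeholder assumption (substitute this repository's actual model ID), and the checkpoint is assumed to expose a standard BERT masked-language-modeling head:

```python
# Minimal sketch: load the Tibetan BERT checkpoint and run fill-mask inference.
# Assumptions: the repo ID is a placeholder, and the checkpoint includes a
# BERT masked-language-modeling head.
from transformers import AutoModelForMaskedLM, AutoTokenizer, pipeline

model_id = "metaphors/TibetanBERT"  # placeholder; replace with the actual repo ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Predict the most likely tokens for a masked position in an input sentence.
fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)
text = f"Your Tibetan sentence with one {tokenizer.mask_token} token in it."
for prediction in fill_mask(text):
    print(prediction["token_str"], round(prediction["score"], 4))
```

For the downstream text-classification setting described in the paper, the same checkpoint could instead be loaded with `AutoModelForSequenceClassification` and fine-tuned on labeled Tibetan data.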