Simbolo committed on
Commit 547092f
1 Parent(s): 362d3db

Update README.md

Files changed (1)
README.md +2 -2
README.md CHANGED
@@ -42,7 +42,7 @@ Releasing the Model: Eithandaraung, Ye Yint Htut, Thet Chit Su, Naing Phyo Aung,
 
 ### Acknowledgment
 We extend our gratitude to the creators of the [mGPT-XL](https://huggingface.co/ai-forever/mGPT) models for their invaluable contribution to this project, which has significantly impacted the field of Burmese NLP.
-We want to thank everyone who has worked on related projects, especially [Minsithu](https://huggingface.co/jojo-ai-mst/MyanmarGPT) and [Dr. Wai Yan Nyein Naing](https://huggingface.co/WYNN747/Burmese-GPT), who initiated the work on GPT-2 models for Burmese.
+We want to thank everyone who has worked on related projects, especially [Minsithu](https://huggingface.co/jojo-ai-mst/MyanmarGPT) and [WaiYanNyeinNaing](https://huggingface.co/WYNN747/Burmese-GPT), who initiated the work on GPT-2 models for Burmese.
 We would also like to thank Simbolo:Servico, a branch of Simbolo under the company Intello Tech, for providing financial support.
 
 ### Limitations and bias
@@ -52,7 +52,7 @@ We have yet to investigate the potential bias inherent in this model thoroughly.
 1. Jiang, Shengyi & Huang, Xiuwen & Cai, Xiaonan & Lin, Nankai. (2021). Pre-trained Models and Evaluation Data for the Myanmar Language. 10.1007/978-3-030-92310-5_52.
 2. Lin, N., Fu, Y., Chen, C., Yang, Z., & Jiang, S. (2021). LaoPLM: Pre-trained Language Models for Lao. ArXiv. https://arxiv.org/abs/2110.05896
 3. MinSithu, MyanmarGPT, https://huggingface.co/jojo-ai-mst/MyanmarGPT, 1.1-SweptWood
-4. Dr. Wai Yan Nyein Naing, WYNN747/Burmese-GPT, https://huggingface.co/WYNN747/Burmese-GPT
+4. Wai Yan Nyein Naing, WYNN747/Burmese-GPT, https://huggingface.co/WYNN747/Burmese-GPT
 5. Sai Htaung Kham, saihtaungkham/BurmeseRoBERTaCLM
 6. Shliazhko, O., Fenogenova, A., Tikhonova, M., Mikhailov, V., Kozlova, A., & Shavrina, T. (2022). mGPT: Few-Shot Learners Go Multilingual. ArXiv. https://arxiv.org/abs/2204.07580
 