Simbolo committed
Commit 92691f4
1 parent: 2d55dc4

Update README.md

Files changed (1): README.md (+7 -2)
README.md CHANGED
@@ -36,13 +36,18 @@ Main Contributor: Sa Phyo Thu Htet (https://github.com/SaPhyoThuHtet)
 Wikipedia Data Crawling: Kaung Kaung Ko Ko, Phuu Pwint Thinzar Kyaing
 Releasing the Model: Eithandaraung, Ye Yint Htut, Thet Chit Su, Naing Phyo Aung
 
+### Acknowledgment
+We extend our gratitude to the creators of the [mGPT-XL](https://huggingface.co/ai-forever/mGPT) model for their invaluable contribution to this project, which has significantly advanced the field of Burmese NLP.
+We also want to thank everyone who has worked on related projects, especially [MinSithu](https://huggingface.co/jojo-ai-mst/MyanmarGPT) and [Dr. Wai Yan Nyein Naing](https://huggingface.co/WYNN747/Burmese-GPT), who initiated the work on GPT-2 models for Burmese.
+
 
 ### Limitations and bias
-We have yet to thoroughly investigate the potential bias inherent in this model. Regarding transparency, it's important to note that the model is primarily trained on data from the Unicode Burmese(Myanmar) language.
+We have not yet thoroughly investigated the potential bias inherent in this model. For transparency, note that the model is trained primarily on Unicode-encoded Burmese (Myanmar) text.
 
 ### References
 1. Jiang, Shengyi & Huang, Xiuwen & Cai, Xiaonan & Lin, Nankai. (2021). Pre-trained Models and Evaluation Data for the Myanmar Language. 10.1007/978-3-030-92310-5_52.
 2. Lin, N., Fu, Y., Chen, C., Yang, Z., & Jiang, S. (2021). LaoPLM: Pre-trained Language Models for Lao. ArXiv. /abs/2110.05896
 3. MinSithu, MyanmarGPT, https://huggingface.co/jojo-ai-mst/MyanmarGPT, 1.1-SweptWood
 4. Dr. Wai Yan Nyein Naing, WYNN747/Burmese-GPT, https://huggingface.co/WYNN747/Burmese-GPT
-5. Sai Htaung Kham,saihtaungkham/BurmeseRoBERTaCLM
+5. Sai Htaung Kham, saihtaungkham/BurmeseRoBERTaCLM
+6. Shliazhko, O., Fenogenova, A., Tikhonova, M., Mikhailov, V., Kozlova, A., & Shavrina, T. (2022). mGPT: Few-Shot Learners Go Multilingual. ArXiv. /abs/2204.07580