Simbolo committed 9f31f81 (1 parent: 92691f4)

Update README.md

Files changed (1): README.md +3 −3
README.md CHANGED
@@ -32,14 +32,14 @@ pipe('မြန်မာဘာသာစကား')
 The data utilized comprises 1 million sentences sourced from Wikipedia.
 
 ### Contributors
-Main Contributor: Sa Phyo Thu Htet (https://github.com/SaPhyoThuHtet)
+Main Contributor: [Sa Phyo Thu Htet](https://github.com/SaPhyoThuHtet)
 Wikipedia Data Crawling: Kaung Kaung Ko Ko, Phuu Pwint Thinzar Kyaing
-Releasing the Model: Eithandaraung, Ye Yint Htut, Thet Chit Su, Naing Phyo Aung
+Releasing the Model: Eithandaraung, Ye Yint Htut, Thet Chit Su, Naing Phyo Aung, Nyan Linn Phyo Zaw
 
 ### Acknowledgment
 We extend our gratitude to the creators of the [mGPT-XL](https://huggingface.co/ai-forever/mGPT) models for their invaluable contribution to this project, significantly impacting the field of Burmese NLP.
 We want to thank everyone who has worked on the related works, especially [Minsithu](https://huggingface.co/jojo-ai-mst/MyanmarGPTT) and [Dr. Wai Yan Nyein Naing](https://huggingface.co/WYNN747/Burmese-GPT), who initiated the work on GPT-2 models for Burmese.
-
+And we would like to thank Simbolo: Servico, a branch of Simbolo under the company Intello Tech, for providing financial support.
 
 ### Limitations and bias
 We have yet to investigate the potential bias inherent in this model thoroughly. Regarding transparency, it's important to note that the model is primarily trained on Unicode Burmese (Myanmar) language data.