Simbolo committed on
Commit e211c04
1 Parent(s): 948226e

Update README.md

Files changed (1):
  1. README.md (+3 -2)
README.md CHANGED
@@ -13,7 +13,8 @@ tags:
 ---
 
 Simbolo's Myanmarsar-GPT model is trained on a dataset of 1 million Burmese data samples and pre-trained using the GPT-2 architecture. It is intended to serve as a foundational pre-trained model for the Burmese language, facilitating fine-tuning for downstream applications such as creative writing, chatbots, and machine translation.
-![myanamarsar-gpt](https://huggingface.co/Simbolo-Servicio/Myanmarsar-GPT/blob/main/smgpt.jpg)
+![MyanmarSar-GPT Image](https://huggingface.co/Simbolo-Servicio/Myanmarsar-GPT/blob/main/smgpt.jpg)
+
 
 
 ### How to use
@@ -40,7 +41,7 @@ Releasing the Model: Eithandaraung, Ye Yint Htut, Thet Chit Su, Naing Phyo Aung,
 ### Acknowledgment
 We extend our gratitude to the creators of the [mGPT-XL](https://huggingface.co/ai-forever/mGPT) models for their invaluable contribution to this project, which has significantly impacted the field of Burmese NLP.
 We also thank everyone who has worked on related projects, especially [Minsithu](https://huggingface.co/jojo-ai-mst/MyanmarGPTT) and [Dr. Wai Yan Nyein Naing](https://huggingface.co/WYNN747/Burmese-GPT), who initiated work on GPT-2 models for Burmese.
-And We would like to thank Simbolo:Servico which is a brach of Simbolo under the company of Intello Tech for providing financial support.
+And we would like to thank Simbolo:Servicio, a branch of Simbolo under the company Intello Tech, for providing financial support.
 
 ### Limitations and bias
 We have yet to thoroughly investigate the potential bias inherent in this model. Regarding transparency, it is important to note that the model is primarily trained on Unicode-encoded Burmese (Myanmar) text.
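
The body of the "How to use" section is unchanged by this commit and therefore collapsed in the diff. As a point of reference, a minimal usage sketch with the Hugging Face `transformers` library might look like the following; the repo id `Simbolo-Servicio/Myanmarsar-GPT` is inferred from the image URL above, and the Burmese prompt is purely illustrative:

```python
# Minimal sketch (assumptions: `transformers` is installed and the repo id
# below, inferred from the image URL in the diff, is correct).
from transformers import AutoTokenizer, AutoModelForCausalLM

repo_id = "Simbolo-Servicio/Myanmarsar-GPT"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

# Encode an illustrative Burmese prompt and generate a continuation.
inputs = tokenizer("မင်္ဂလာပါ", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```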
 