yentinglin committed
Commit 4ecef00 • 1 Parent(s): 4d31b4c

Update app.py

Files changed (1):
  1. app.py +10 -0
app.py CHANGED
@@ -6,6 +6,16 @@ from transformers import AutoTokenizer
 DESCRIPTION = """
 # Language Models for Taiwanese Culture
 
+<p align="center">
+✍️ <a href="https://huggingface.co/spaces/yentinglin/Taiwan-LLaMa2" target="_blank">Online Demo</a>
+•
+🤗 <a href="https://huggingface.co/yentinglin" target="_blank">HF Repo</a> • 🐦 <a href="https://twitter.com/yentinglin56" target="_blank">Twitter</a> • 📃 <a href="https://arxiv.org/pdf/2305.13711.pdf" target="_blank">[Paper Coming Soon]</a>
+• 👨️ <a href="https://yentingl.com/" target="_blank">Yen-Ting Lin</a>
+<br/><br/>
+<img src="https://www.csie.ntu.edu.tw/~miulab/taiwan-llama/logo-v2.png" width="100"> <br/>
+</p>
+
+
 Taiwan-LLaMa is a fine-tuned model specifically designed for traditional Chinese applications. It is built upon the LLaMa 2 architecture and includes a pretraining phase with over 5 billion tokens and fine-tuning with over 490k multi-turn conversational data in Traditional Chinese.
 
 ## Key Features