yentinglin committed
Commit 122a349
Parent(s): e55b96a

Create README.md

Files changed (1): README.md (+99, −0)

---
# For reference on model card metadata, see the spec: https://github.com/huggingface/hub-docs/blob/main/modelcard.md?plain=1
# Doc / guide: https://huggingface.co/docs/hub/model-cards
license: apache-2.0
language:
- zh
widget:
- text: >-
    A chat between a curious user and an artificial intelligence assistant.
    The assistant gives helpful, detailed, and polite answers to the user's
    questions. USER: 你好,請問你可以幫我寫一封推薦信嗎? ASSISTANT:
library_name: transformers
pipeline_tag: text-generation
extra_gated_heading: Acknowledge the license to access the repository.
extra_gated_prompt: Please contact the author for access.
extra_gated_button_content: Acknowledge license 同意以上內容
extra_gated_fields:
  Name: text
  Mail: text
  Organization: text
  Country: text
  Any utilization of the Taiwan LLM repository mandates explicit acknowledgment of and attribution to the original author: checkbox
---

![Taiwan LLM banner](https://cdn-uploads.huggingface.co/production/uploads/5df9c78eda6d0311fd3d541f/mIie6Mc6k_Uv9UZKXC_hw.png)

# 🌟 Check out the [Taiwan-LLM Demo Chat-UI](http://www.twllm.com) 🌟

# Model Card for Taiwan LLM 8x7B-DPO

Taiwan LLM is an advanced language model tailored for Traditional Chinese, focusing on the linguistic and cultural contexts of Taiwan.

## Model description

- **Model type:** An 8x7B-parameter Mixtral MoE model fine-tuned on a mix of publicly available, synthetic datasets.
- **Language(s) (NLP):** Primarily Traditional Chinese (zh-tw)
- **Finetuned from model:** [yentinglin/Taiwan-LLM-MoE-alpha](https://huggingface.co/yentinglin/Taiwan-LLM-MoE-alpha)

### Model Sources

- **Repository:** https://github.com/MiuLab/Taiwan-LLaMa
- **Demo:** https://twllm.com/

## Performance

Check out the leaderboard at the [TW Chatbot Arena](https://arena.twllm.com/).

TMMLU+ scores:
- yentinglin/Taiwan-LLM-MoE-alpha: 43.93
- yentinglin/Taiwan-LLM-8x7B-DPO: TBD
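
The TMMLU+ numbers above are multiple-choice accuracies. As a rough, unofficial illustration of how such a score is derived (this is not the official evaluation harness; the prompt layout, the `答案:` suffix, and the sample question are placeholders invented for this sketch):

```python
# Unofficial sketch of MMLU-style multiple-choice scoring such as TMMLU+.
# The sample item and the "答案:" ("Answer:") prompt suffix are placeholders;
# the real benchmark uses its own harness and prompts.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("yentinglin/Taiwan-LLM-8x7B-DPO")
model = AutoModelForCausalLM.from_pretrained(
    "yentinglin/Taiwan-LLM-8x7B-DPO", torch_dtype=torch.bfloat16, device_map="auto"
)

def pick_choice(question: str, choices: dict) -> str:
    """Return the option letter whose token the model ranks highest."""
    prompt = question + "\n" + "\n".join(f"{k}. {v}" for k, v in choices.items()) + "\n答案:"
    input_ids = tok(prompt, return_tensors="pt").input_ids.to(model.device)
    with torch.no_grad():
        next_token_logits = model(input_ids).logits[0, -1]
    # Tokenizer-dependent: use the last sub-token id of each option letter.
    letter_ids = {k: tok(k, add_special_tokens=False).input_ids[-1] for k in choices}
    return max(letter_ids, key=lambda k: next_token_logits[letter_ids[k]].item())

# Placeholder item ("Which is Taiwan's highest mountain?"), not from TMMLU+:
print(pick_choice("台灣最高的山是哪一座?", {"A": "玉山", "B": "雪山", "C": "合歡山", "D": "阿里山"}))
```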

## Intended uses

Here's how you can run the model using the `pipeline()` function from 🤗 Transformers:

```python
# pip install transformers>=4.34
# pip install accelerate

import torch
from transformers import pipeline

pipe = pipeline("text-generation", model="yentinglin/Taiwan-LLM-8x7B-DPO", torch_dtype=torch.bfloat16, device_map="auto")

# We use the tokenizer's chat template to format each message - see https://huggingface.co/docs/transformers/main/en/chat_templating
messages = [
    {
        "role": "system",
        "content": "你是一個人工智慧助理",  # "You are an AI assistant."
    },
    # "How does the northeast monsoon affect Taiwan's climate?"
    {"role": "user", "content": "東北季風如何影響台灣氣候?"},
]
prompt = pipe.tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
outputs = pipe(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
```
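
The `apply_chat_template` call above renders the Vicuna-style format shown in the widget example in this card's metadata (a system sentence followed by `USER:` / `ASSISTANT:` turns). If you need to assemble the prompt by hand, e.g. for a serving stack that only accepts raw strings, a minimal sketch follows; the tokenizer's chat template remains the authoritative definition:

```python
# Minimal sketch: build the prompt manually. The format is inferred from the
# widget example in this card's metadata; prefer apply_chat_template when possible.
SYSTEM_PROMPT = (
    "A chat between a curious user and an artificial intelligence assistant. "
    "The assistant gives helpful, detailed, and polite answers to the user's questions."
)

def build_prompt(user_message: str, system_prompt: str = SYSTEM_PROMPT) -> str:
    # The model writes the assistant reply after the trailing "ASSISTANT:".
    return f"{system_prompt} USER: {user_message} ASSISTANT:"

# "Hello, could you help me write a letter of recommendation?"
print(build_prompt("你好,請問你可以幫我寫一封推薦信嗎?"))
```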
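
An 8x7B Mixtral-style MoE model is large (on the order of 47B parameters, roughly 90 GB in bfloat16), so it may not fit on a single GPU. Below is a sketch of 4-bit quantized loading, assuming `bitsandbytes` is installed and a CUDA GPU is available; quantization is not mentioned in the original card and can slightly degrade output quality:

```python
# pip install bitsandbytes  # assumption: not listed in this card

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

# 4-bit NF4 quantization shrinks the weights to roughly a quarter of bf16 size.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained("yentinglin/Taiwan-LLM-8x7B-DPO")
model = AutoModelForCausalLM.from_pretrained(
    "yentinglin/Taiwan-LLM-8x7B-DPO",
    quantization_config=bnb_config,
    device_map="auto",
)

# "How does the northeast monsoon affect Taiwan's climate?"
messages = [{"role": "user", "content": "東北季風如何影響台灣氣候?"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(input_ids, max_new_tokens=256, do_sample=True, temperature=0.7, top_p=0.95)
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

With `device_map="auto"`, Accelerate will also offload layers to CPU RAM when VRAM runs out, at a substantial speed cost.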

## Citation

If you find Taiwan LLM useful in your work, please cite it with:

```bibtex
@misc{lin2023taiwan,
      title={Taiwan LLM: Bridging the Linguistic Divide with a Culturally Aligned Language Model},
      author={Yen-Ting Lin and Yun-Nung Chen},
      year={2023},
      eprint={2311.17487},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```

## Acknowledgement

We thank Ubitus for providing valuable compute resources for this project.