YC-Chen committed
Commit f576567
1 parent: f436e2b

Update README.md
Files changed (1): README.md (+6 -3)

README.md CHANGED
@@ -9,16 +9,19 @@ This model incorporates an additional 30k TC vocabularies to better adapt to TC
 Breeze-7B-Instruct-v0.1 performs well on both EN and TC benchmarks.
 This model outperforms Taiwan-LLM-7B-v2.1-chat, Taiwan-LLM-13B-v2.0-chat, and Yi-6B-Chat on major TC benchmarks we tested, and is comparable with Mistral-7B-Instruct on the Open LLM Leaderboard.
 
+*A project by the members (in alphabetical order): Chan-Jan Hsu 許湛然, Chang-Le Liu 劉昶樂, Po-Chun Hsu 許博竣, Feng-Ting Liao 廖峰挺, Yi-Chang Chen 陳宜昌, and the supervisor Da-Shan Shiu 許大山.*
+
 ## Features
 
 - Expanding the vocabulary dictionary for Traditional Chinese from 32k to 62k vocabulary size (the first successful work in Traditional Chinese)
-- Multi-turn dialogue without special handling for harmful content
+- Multi-turn dialogue without special handling for harmfulness
 - 8k context length
-- Grouped-query attention
-- Sliding-window attention
+- Grouped-query and sliding-window attention
 
 ## Model Details
 - **Finetuned from:** [MediaTek-Research/Breeze-7B-Base-v0.1](https://huggingface.co/MediaTek-Research/Breeze-7B-Base-v0.1)
 - **Model type:** Causal decoder-only transformer language model
 - **Language:** English and Traditional Chinese (zh-tw)
 
+## Performance
+
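
The grouped-query and sliding-window attention named in the Features list can be illustrated with a minimal sketch. This is not the model's actual implementation, and the head counts and window size below (`N_Q_HEADS`, `N_KV_HEADS`, `WINDOW`) are illustrative assumptions, not values stated in this README:

```python
# Hypothetical sketch of two attention variants (illustrative constants,
# not the model's published configuration).

N_Q_HEADS = 32    # query heads (assumed for illustration)
N_KV_HEADS = 8    # key/value heads; each serves 32 // 8 = 4 query heads
WINDOW = 4096     # sliding-window size (assumed for illustration)

def kv_head_for(query_head: int) -> int:
    """Grouped-query attention: map a query head to the KV head it shares."""
    group_size = N_Q_HEADS // N_KV_HEADS
    return query_head // group_size

def visible_positions(pos: int, window: int = WINDOW) -> range:
    """Causal sliding-window attention: positions a token at `pos` may attend to."""
    return range(max(0, pos - window + 1), pos + 1)

# Query heads 0-3 all read KV head 0, so the KV cache is 4x smaller.
assert kv_head_for(0) == kv_head_for(3) == 0
assert kv_head_for(4) == 1

# A token far into the sequence only sees the last WINDOW positions.
assert len(list(visible_positions(10000))) == WINDOW
```

Grouping KV heads shrinks the KV cache at inference time, and the sliding window keeps per-token attention cost bounded as context grows — which is how a 7B model can serve an 8k context economically.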
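
The 32k-to-62k vocabulary expansion in the Features list can be sketched as appending new Traditional Chinese tokens after the existing ids, so the base tokens keep their meaning. This is a toy illustration, not MediaTek-Research's actual procedure; the `extend_vocab` helper and the `tok{i}` stand-in vocabulary are hypothetical:

```python
# Toy sketch (hypothetical, not the model's actual vocabulary-expansion code):
# new tokens get ids appended after the base vocabulary, so existing token
# ids — and therefore the base model's embeddings — stay valid.

base_vocab = {f"tok{i}": i for i in range(32000)}  # stand-in for the 32k base vocab

def extend_vocab(vocab: dict, new_tokens: list) -> dict:
    """Append tokens not already present, assigning the next free ids."""
    out = dict(vocab)
    next_id = max(out.values()) + 1
    for tok in new_tokens:
        if tok not in out:
            out[tok] = next_id
            next_id += 1
    return out

extended = extend_vocab(base_vocab, ["台灣", "繁體", "中文"])
assert len(extended) == 32003      # base ids are untouched
assert extended["台灣"] == 32000    # new tokens occupy the appended id range
```

In practice the model's embedding and output matrices must also be resized to the new vocabulary size, with the appended rows trained on Traditional Chinese text; only the id-assignment scheme is shown here.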