---
pipeline_tag: text-generation
license: apache-2.0
language:
- zh
- en
datasets:
- hon9kon9ize/yue-alpaca
- indiejoseph/wikipedia-translate-zhhk-zhcn
- indiejoseph/wikipedia-zh-yue-summaries
- indiejoseph/wikipedia-zh-yue-qa
tags:
- cantonese
- yue
- hong kong
- 香港
- 廣東話
- 粵語
---

# Model Card for Breeze-7B-Cantonese-v0.1

Breeze-7B is a family of language models built on top of [Mistral-7B](https://huggingface.co/mistralai/Mistral-7B-v0.1) and tailored specifically for Traditional Chinese. Credit to [MediaTek-Research](https://huggingface.co/MediaTek-Research).<br>
Breeze-7B係一個以[Mistral-7B](https://huggingface.co/mistralai/Mistral-7B-v0.1)作為基礎,為正體中文而造嘅模型系列,由[MediaTek-Research](https://huggingface.co/MediaTek-Research)製作.

Breeze-7B-Cantonese is derived from the base model [Breeze-7B-Base](https://huggingface.co/MediaTek-Research/Breeze-7B-Base-v0_1) and fine-tuned on datasets from [hon9kon9ize](https://huggingface.co/hon9kon9ize/), enabling it to chat in Cantonese.<br>
[Breeze-7B-Cantonese] 係由基座模型 [Breeze-7B-Base](https://huggingface.co/MediaTek-Research/Breeze-7B-Base-v0_1) 衍生出黎,用[hon9kon9ize](https://huggingface.co/hon9kon9ize/) 整嘅數據集微調, 令到呢個模型可以講廣東話。
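
A minimal usage sketch with the 🤗 Transformers causal-LM APIs: the repository id below is a placeholder for this model's Hub path, and the plain-text prompt format is an assumption, since the exact chat template the model expects may differ.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repository id; replace with this model's actual Hub path.
model_id = "<namespace>/Breeze-7B-Cantonese-v0.1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Plain Cantonese prompt; the exact instruction/chat format is an assumption.
prompt = "請用廣東話介紹一下香港。"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```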

I selected Breeze-7B-Base as the base model because its vocabulary has been extended for Traditional Chinese, and its language style aligns closely with Hong Kong usage, making it a suitable choice for this project.<br>
我揀[Breeze-7B-Base]做基座模型係因為佢有正體中文嘅擴增詞表, 而且佢嘅語言根基同香港比較相似, 所以佢似乎比較適合呢個項目。
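
A quick way to see the effect of the extended vocabulary is to tokenize the same Cantonese sentence with the Breeze tokenizer and the original Mistral tokenizer (repository ids taken from the links above); the Breeze tokenizer should generally need fewer tokens for Traditional Chinese text.

```python
from transformers import AutoTokenizer

sentence = "呢個模型可以講廣東話。"

# Breeze-7B-Base ships an expanded Traditional Chinese vocabulary,
# so it should split this sentence into fewer tokens than plain Mistral-7B.
breeze_tok = AutoTokenizer.from_pretrained("MediaTek-Research/Breeze-7B-Base-v0_1")
mistral_tok = AutoTokenizer.from_pretrained("mistralai/Mistral-7B-v0.1")

print("Breeze tokens :", len(breeze_tok.tokenize(sentence)))
print("Mistral tokens:", len(mistral_tok.tokenize(sentence)))
```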


The axolotl config file [axolotl-config.yml](axolotl-config.yml) is shared as open source so that anyone can use it to train the model on their own.<br>
為咗人人都可以自己訓練模型,我放埋個axolotl設定檔 [axolotl-config.yml](axolotl-config.yml)出嚟當開放源始碼。

Thanks to [hon9kon9ize](https://huggingface.co/hon9kon9ize/) and [indiejoseph](https://huggingface.co/indiejoseph) for their great datasets; this project owes its existence to their invaluable contributions.<br>
多得[hon9kon9ize](https://huggingface.co/hon9kon9ize/) 同 [indiejoseph](https://huggingface.co/indiejoseph) 放出嚟嘅數據集, 先至有呢個項目出現。
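
For reference, the datasets listed in the metadata block at the top of this card can be pulled directly with the 🤗 `datasets` library; this sketch only prints each dataset's splits and columns and makes no assumptions about their layout.

```python
from datasets import load_dataset

# Dataset ids come from the metadata block at the top of this card.
for name in [
    "hon9kon9ize/yue-alpaca",
    "indiejoseph/wikipedia-translate-zhhk-zhcn",
    "indiejoseph/wikipedia-zh-yue-summaries",
    "indiejoseph/wikipedia-zh-yue-qa",
]:
    ds = load_dataset(name)
    print(name, ds)  # shows the available splits and column names
```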