---
license: apache-2.0
language:
- th
library_name: transformers
pipeline_tag: text-generation
tags:
- pretrained
---

# Model Card for Typhoon-7B

**Typhoon 7B** is a pretrained Thai-language adaptation of Mistral-7B with 7 billion parameters.

**Typhoon 7B** outperforms all open-source Thai language models as of its release, and its performance is on par with GPT-3.5 while being 2.62 times more efficient.

<div align="center">

<img src="https://storage.googleapis.com/scb10x-ai-lab-public/assets/typhoon_benchmark.png" alt="Typhoon benchmark" width="100%" style="margin-left: auto; margin-right: auto; display: block;"/>

</div>

For full details of this model, please read our [paper]() and [release blog post]().

## Requirements

Transformers 4.34.0 or newer.

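Typhoon 7B can be loaded as a standard causal language model with `transformers`. The sketch below is a minimal, unofficial example: the repository id `scb10x/typhoon-7b` and the sampling settings are assumptions, not a prescribed recipe.

```python
# Minimal generation sketch for Typhoon 7B as a causal LM.
# NOTE: the repo id "scb10x/typhoon-7b" is an assumption; substitute the
# actual checkpoint name published on the Hugging Face Hub.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "scb10x/typhoon-7b"  # assumed repo id


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Continue `prompt` with sampled tokens from the base model."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype=torch.bfloat16)
    model.to("cuda" if torch.cuda.is_available() else "cpu")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(
        **inputs, max_new_tokens=max_new_tokens, do_sample=True, temperature=0.7
    )
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


# Example (downloads the model weights on first call):
# print(generate("ประเทศไทยมีจังหวัดทั้งหมด"))
```

Because this is a base model, the prompt is treated as text to be continued, not as an instruction (see the Notice section below).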
## Model date

**Typhoon 7B** was trained in December 2023.

## License

Apache-2.0 (Commercial)

## Notice

Typhoon 7B is a pretrained base model: it cannot follow human instructions without few-shot prompting or fine-tuning on an instruction dataset, and it has no moderation mechanisms.

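Since the base model only continues text, instructions are typically conveyed through in-context demonstrations. The helper below is a hypothetical sketch of building such a few-shot prompt; the `Q:`/`A:` format and the demonstration pairs are illustrative, not an official template.

```python
# Hypothetical few-shot prompt builder for a base (non-instruct) model.
# The "Q:"/"A:" layout and the demonstration pairs are illustrative only.
def build_few_shot_prompt(demos, query):
    """Concatenate (input, output) demonstrations, then leave the final answer open."""
    blocks = [f"Q: {q}\nA: {a}" for q, a in demos]
    blocks.append(f"Q: {query}\nA:")
    return "\n\n".join(blocks)


demos = [
    ("สวัสดี แปลเป็นภาษาอังกฤษ", "Hello"),
    ("ขอบคุณ แปลเป็นภาษาอังกฤษ", "Thank you"),
]
prompt = build_few_shot_prompt(demos, "ลาก่อน แปลเป็นภาษาอังกฤษ")
# The model is then asked to continue `prompt`; whatever it emits after the
# final "A:" is taken as the answer.
```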
## SCB10X AI Team

Kunat Pipatanakul, Phatrasek Jirabovonvisut, Potsawee Manakul, Sittipong Sripaisarnmongkol, Ruangsak Patomwong, Pathomporn Chokchainant, Kasima Tharnpipitchai