---
license: apache-2.0
language:
- th
library_name: transformers
pipeline_tag: text-generation
tags:
- pretrained
---
# Model Card for Typhoon-7B

Typhoon 7B is a pretrained Thai-language adaptation of Mistral-7B with 7 billion parameters.

Typhoon 7B outperforms all open-source Thai language models, and its performance is on par with GPT-3.5 while being 2.62 times more efficient.

[SHOW_RESULT_IMAGE_HERE]

For full details of this model, please read our [paper]() and [release blog post]().


## Requirements

Transformers 4.34.0 or newer.
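
The snippet below is a minimal generation sketch using Transformers; the repository id `scb10x/typhoon-7b`, the example prompt, and the sampling settings are illustrative assumptions rather than prescribed values.

```python
# Minimal text-generation sketch for Typhoon-7B
# (assumes the model is published under the repository id "scb10x/typhoon-7b").
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "scb10x/typhoon-7b"  # assumed repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision so the 7B model fits on a single modern GPU
    device_map="auto",
)

prompt = "ประเทศไทยมีจังหวัดทั้งหมด"  # illustrative Thai prompt: "Thailand has a total of ... provinces"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

outputs = model.generate(
    **inputs,
    max_new_tokens=64,
    do_sample=True,
    temperature=0.7,
    top_p=0.95,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```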


## License
Apache-2.0 (Commercial)


## Notice

Typhoon 7B is a pretrained base model; it cannot follow human instructions without few-shot prompting or fine-tuning on an instruction dataset,
and it does not have any moderation mechanisms.
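
Because the base model only continues text, instructions are typically conveyed through few-shot prompting. The sketch below (reusing the `tokenizer` and `model` loaded in the Requirements example) shows one hypothetical few-shot translation prompt; the examples are made up for illustration.

```python
# Few-shot prompt sketch: the base model is shown input/output examples and then
# asked to continue the pattern (prompt content is illustrative only).
few_shot_prompt = (
    "แปลเป็นภาษาอังกฤษ: สวัสดี\n"          # "Translate to English: Hello"
    "คำตอบ: Hello\n"                        # "Answer: Hello"
    "แปลเป็นภาษาอังกฤษ: ขอบคุณ\n"
    "คำตอบ: Thank you\n"
    "แปลเป็นภาษาอังกฤษ: อาหารอร่อยมาก\n"
    "คำตอบ:"
)

inputs = tokenizer(few_shot_prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=16, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```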


## SCB10X AI Team
 
Kunat Pipatanakul, Phatrasek Jirabovonvisut, Potsawee Manakul, Sittipong Sripaisarnmongkol, Ruangsak Patomwong, Pathomporn Chokchainant, Kasima Tharnpipitchai