
SeaLLM-7B-v2.5 - Large Language Models for Southeast Asia

Technical Blog    🤗 Tech Memo    🤗 DEMO    GitHub    Technical Report

We introduce SeaLLM-7B-v2.5, the state-of-the-art multilingual LLM for Southeast Asian (SEA) languages 🇬🇧 🇨🇳 🇻🇳 🇮🇩 🇹🇭 🇲🇾 🇰🇭 🇱🇦 🇲🇲 🇵🇭. It is the most significant upgrade since SeaLLM-13B: at half the size, it outperforms its predecessor across diverse multilingual tasks, including world knowledge, math reasoning, and instruction following.

This is the 4-bit (Q4) quantized version of the model for MLX on Apple silicon. Check out the SeaLLM-7B-v2.5 page for more details.
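
Below is a minimal sketch of running the quantized weights with the mlx-lm package (`pip install mlx-lm`); the prompt and generation settings are illustrative assumptions, and the chat format is whatever template the tokenizer in this repo ships with:

```python
# Minimal sketch: load the Q4 MLX weights and generate a reply on Apple silicon.
# Assumes the mlx-lm package is installed; the prompt and max_tokens below are
# illustrative, not recommended settings.
from mlx_lm import load, generate

model, tokenizer = load("SeaLLMs/SeaLLM-7B-v2.5-mlx-quantized")

prompt = "What are the official languages of Singapore?"
# Use the tokenizer's chat template if one is bundled with the repo.
if getattr(tokenizer, "chat_template", None):
    prompt = tokenizer.apply_chat_template(
        [{"role": "user", "content": prompt}],
        tokenize=False,
        add_generation_prompt=True,
    )

response = generate(model, tokenizer, prompt=prompt, max_tokens=256, verbose=True)
```

The same weights should also work from the command line via mlx-lm's CLI, e.g. `python -m mlx_lm.generate --model SeaLLMs/SeaLLM-7B-v2.5-mlx-quantized --prompt "..."`.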

Citation

If you find our project useful, we hope you will kindly star our repo and cite our work as follows.

Corresponding author: l.bing@alibaba-inc.com

The author list and order are subject to change.

  • * and ^ denote equal contribution.
@article{damonlpsg2023seallm,
  author = {Xuan-Phi Nguyen*, Wenxuan Zhang*, Xin Li*, Mahani Aljunied*, Weiwen Xu, Hou Pong Chan,
            Zhiqiang Hu, Chenhui Shen^, Yew Ken Chia^, Xingxuan Li, Jianyu Wang,
            Qingyu Tan, Liying Cheng, Guanzheng Chen, Yue Deng, Sen Yang,
            Chaoqun Liu, Hang Zhang, Lidong Bing},
  title = {SeaLLMs - Large Language Models for Southeast Asia},
  year = 2023,
  eprint = {arXiv:2312.00738},
}