Qian Liu

โš“๏ธ Sailor: A New Multilingual Open LLM for South-East Asia ๐ŸŒ

Last month we released a new family of multilingual language models called **Sailor**, ranging from 0.5B to 7B parameters and continually pre-trained from the Qwen1.5 models. In our extensive benchmarking, the Sailor models demonstrate strong performance on South-East Asian languages, taking us one step closer to multilingual LLMs that can serve the diverse needs of the region and beyond.

Today, we're more than excited to share the key technical details behind the Sailor models! 💪

**Key highlights**:
🔍 Data curation: merging short examples, document-level code-switching, and aggressive data cleaning and deduplication.
🤖 Tokenization robustness: we find that BPE dropout is highly effective at handling prompt variations.
🔄 Optimizing the data mixture: we propose a new approach that automatically balances capabilities across languages!
🌟 Recipe for continual pre-training: we identify a metric that helps predict how well the Sailor models will perform on the original domain (e.g., English) after continual pre-training.
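To illustrate the tokenization-robustness idea, here is a minimal, self-contained sketch of BPE dropout: during tokenization, each learned merge is skipped with some probability, so the same string can be segmented differently across training passes. The merge table and function here are toy illustrations, not the Sailor implementation.

```python
import random

# Toy merge table (illustrative only), in learned priority order.
MERGES = [("h", "e"), ("l", "l"), ("he", "ll"), ("hell", "o")]

def bpe_tokenize(word, dropout=0.0, rng=random):
    """Tokenize `word` with BPE; with dropout > 0, each merge is
    randomly skipped, yielding varied segmentations (BPE dropout)."""
    tokens = list(word)  # start from individual characters
    for left, right in MERGES:  # apply merges in learned order
        if dropout and rng.random() < dropout:
            continue  # BPE dropout: skip this merge this time
        i = 0
        while i < len(tokens) - 1:
            if tokens[i] == left and tokens[i + 1] == right:
                tokens[i:i + 2] = [left + right]  # merge the pair
            else:
                i += 1
    return tokens

print(bpe_tokenize("hello"))               # deterministic: ['hello']
print(bpe_tokenize("hello", dropout=0.5))  # stochastic segmentation
```

Exposing the model to these alternative segmentations during training makes it less sensitive to how a prompt happens to be tokenized at inference time.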

We are thrilled to share these technical details with the community and invite you to explore the Sailor models. We hope they bring us one step closer to truly multilingual LLMs! 🌏✨

To learn more, please access our research paper or reach out to our team.
🔗 Paper: Sailor: Open Language Models for South-East Asia (2404.03608)
🧩 Model: sail/sailor-language-models-65e19a749f978976f1959825
💻 Code: https://github.com/sail-sg/sailor-llm