---
license: apache-2.0
language:
- zh
- en
base_model:
- ICTNLP/CSLM-base
---
# CSLM-SFT

## Model Description
**CSLM-SFT** is a supervised fine-tuned checkpoint of CSLM from the ACL 2026 Findings paper *Efficient Training for Cross-lingual Speech Language Models*.
It is built on top of the CSLM-base model and further tuned on instruction-style data for speech-to-speech conversation, covering both monolingual and cross-lingual scenarios.
**Paper**: https://arxiv.org/abs/2604.11096
## Citation
If you use this model, please cite:
```bibtex
@misc{zhou2026efficienttrainingcrosslingualspeech,
      title={Efficient Training for Cross-lingual Speech Language Models},
      author={Yan Zhou and Qingkai Fang and Yun Hong and Yang Feng},
      year={2026},
      eprint={2604.11096},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2604.11096},
}
```
## Contact
For questions, please contact: zhouyan23z@ict.ac.cn