Chinese-Mixtral-Instruct-LoRA

Chinese Mixtral GitHub repository: https://github.com/ymcui/Chinese-Mixtral

This repository contains Chinese-Mixtral-Instruct-LoRA, which is further tuned with instruction data on Chinese-Mixtral, where Chinese-Mixtral is built on top of Mixtral-8x7B-v0.1.

Note: You must merge this LoRA adapter with the original Mixtral-8x7B-v0.1 to obtain the full model weights.
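The merge step above can be sketched with the `peft` library. This is a minimal sketch, assuming `transformers` and `peft` are installed and that the adapter is loaded from the `hfl/chinese-mixtral-instruct-lora` repository named in this card; adjust paths and dtype to your setup:

```python
# Sketch: fold the LoRA adapter into the base Mixtral-8x7B-v0.1 weights.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mixtral-8x7B-v0.1"        # original base model
lora_id = "hfl/chinese-mixtral-instruct-lora"  # this LoRA adapter

base = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype="auto")
model = PeftModel.from_pretrained(base, lora_id)
model = model.merge_and_unload()  # merge LoRA deltas into the base weights

# Save the merged full-weight model plus tokenizer for later use.
model.save_pretrained("chinese-mixtral-instruct")
AutoTokenizer.from_pretrained(lora_id).save_pretrained("chinese-mixtral-instruct")
```

Merging is download- and memory-heavy for an 8x7B model; expect to need substantial RAM or to pass a `device_map` suited to your hardware.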

Others

Citation

Please consider citing our paper if you use the resources of this repository. Paper link: https://arxiv.org/abs/2403.01851

@article{chinese-mixtral,
      title={Rethinking LLM Language Adaptation: A Case Study on Chinese Mixtral}, 
      author={Cui, Yiming and Yao, Xin},
      journal={arXiv preprint arXiv:2403.01851},
      url={https://arxiv.org/abs/2403.01851},
      year={2024}
}