---
base_model: mistralai/Mixtral-8x7B-v0.1
inference: false
license: apache-2.0
language:
  - zh
  - en
---

# Chinese-Mixtral-LoRA

Chinese Mixtral GitHub repository: https://github.com/ymcui/Chinese-Mixtral

This repository contains Chinese-Mixtral-LoRA, a LoRA adapter obtained by further pre-training Mixtral-8x7B-v0.1 on Chinese data.

Note: you must merge the LoRA adapter with the original Mixtral-8x7B-v0.1 to obtain the full model weights.
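The merge step above can be sketched with the PEFT library's `merge_and_unload()`, which folds the LoRA deltas back into the base weights. The adapter repository ID and output path below are assumptions for illustration; substitute your local paths as needed.

```python
# Sketch: merge the Chinese-Mixtral-LoRA adapter into Mixtral-8x7B-v0.1.
# Repository IDs are assumptions based on this model card.
BASE_MODEL = "mistralai/Mixtral-8x7B-v0.1"
LORA_MODEL = "hfl/chinese-mixtral-lora"  # assumed ID of this repository


def merge_lora(base_id: str = BASE_MODEL,
               lora_id: str = LORA_MODEL,
               output_dir: str = "chinese-mixtral-merged") -> None:
    """Load the base model, apply the LoRA adapter, merge, and save."""
    # Imports are deferred so the module can be inspected without
    # transformers/peft installed; the merge itself requires both.
    from transformers import AutoModelForCausalLM, AutoTokenizer
    from peft import PeftModel

    base = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype="auto")
    model = PeftModel.from_pretrained(base, lora_id)
    merged = model.merge_and_unload()  # fold LoRA deltas into base weights
    merged.save_pretrained(output_dir)
    AutoTokenizer.from_pretrained(lora_id).save_pretrained(output_dir)


if __name__ == "__main__":
    merge_lora()
```

Merging at full precision requires enough CPU RAM or GPU memory to hold the 8x7B base model; the Chinese-Mixtral GitHub repository linked above provides the project's own merging instructions.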

## Others

## Citation

Please consider citing our paper if you use the resources of this repository. Paper link: https://arxiv.org/abs/2403.01851

```
@article{chinese-mixtral,
      title={Rethinking LLM Language Adaptation: A Case Study on Chinese Mixtral},
      author={Cui, Yiming and Yao, Xin},
      journal={arXiv preprint arXiv:2403.01851},
      url={https://arxiv.org/abs/2403.01851},
      year={2024}
}
```