Llama-3-Chinese-8B-LoRA

This repository contains Llama-3-Chinese-8B-LoRA, a LoRA adapter obtained by further pre-training Meta-Llama-3-8B on 120 GB of Chinese text corpora.

Note: You must merge the LoRA weights with the original Meta-Llama-3-8B to obtain the full model weights.

For further details (performance, usage, etc.), please refer to the GitHub project page: https://github.com/ymcui/Chinese-LLaMA-Alpaca-3
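Conceptually, merging a LoRA adapter into the base model just adds the scaled low-rank update to each adapted weight matrix. The sketch below illustrates that arithmetic on a toy matrix; the dimensions, rank, and scaling value are made up for illustration, and actual merging should be done with the conversion scripts from the project repository (or PEFT's merge utilities) rather than by hand.

```python
import numpy as np

# Toy illustration of the LoRA merge: W' = W + (alpha / r) * B @ A.
# All shapes and hyperparameters below are hypothetical.
rng = np.random.default_rng(0)

d, k, r = 8, 8, 2      # layer dimensions and LoRA rank (illustrative)
alpha = 4.0            # LoRA scaling hyperparameter (illustrative)

W = rng.standard_normal((d, k))   # frozen base weight
A = rng.standard_normal((r, k))   # LoRA down-projection
B = rng.standard_normal((d, r))   # LoRA up-projection

# Merged weight folds the adapter into the base matrix once,
# so inference no longer needs the separate LoRA path.
W_merged = W + (alpha / r) * B @ A

x = rng.standard_normal(k)
# The merged weight reproduces base output plus the scaled LoRA path.
assert np.allclose(W_merged @ x, W @ x + (alpha / r) * (B @ (A @ x)))
```

After merging, the resulting full-weight model can be loaded and used like any standalone checkpoint, which is why the note above stresses combining the adapter with the original Meta-Llama-3-8B.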
