Llama-3-Chinese-8B-LoRA

This repository contains Llama-3-Chinese-8B-LoRA, which is further pre-trained from Meta-Llama-3-8B on 120 GB of Chinese text corpora.

Note: You must merge these LoRA weights with the original Meta-Llama-3-8B to obtain the full model weights.
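
The GitHub project linked below provides its own merging instructions and script; as an illustration only, the following minimal sketch merges the adapter with the base model using the `transformers` and `peft` packages (the output directory name is an example):

```python
# Minimal LoRA-merging sketch (assumes transformers + peft are installed).
# The official Chinese-LLaMA-Alpaca-3 repository ships its own merge script;
# this only illustrates the general idea of folding the adapter into the base model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Meta-Llama-3-8B", torch_dtype=torch.bfloat16
)
model = PeftModel.from_pretrained(base, "hfl/llama-3-chinese-8b-lora")
merged = model.merge_and_unload()  # folds the LoRA deltas into the base weights

tokenizer = AutoTokenizer.from_pretrained("meta-llama/Meta-Llama-3-8B")
merged.save_pretrained("llama-3-chinese-8b")    # example output directory
tokenizer.save_pretrained("llama-3-chinese-8b")
```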

For further details (performance, usage, etc.), please refer to the GitHub project page: https://github.com/ymcui/Chinese-LLaMA-Alpaca-3
