---
base_model: meta-llama/Meta-Llama-3-8B
license: apache-2.0
language:
  - zh
  - en
---

# Llama-3-Chinese-8B-LoRA

This repository contains Llama-3-Chinese-8B-LoRA, a LoRA adapter further pre-trained from Meta-Llama-3-8B on 120 GB of Chinese text corpora.

Note: you must merge this LoRA adapter with the original Meta-Llama-3-8B to obtain the full model weights.
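For intuition, merging a LoRA adapter means adding its low-rank update into each frozen base weight matrix. Below is a toy numpy sketch of that arithmetic with made-up dimensions; for the real model you would use the merge scripts from the project page (or a library such as PEFT) rather than this illustration.

```python
import numpy as np

# LoRA stores a low-rank update delta_W = (alpha / r) * B @ A.
# Merging adds this update into the frozen base weight W.
d, r, alpha = 8, 2, 4.0  # toy sizes; real Llama-3 layers are far larger

rng = np.random.default_rng(0)
W = rng.standard_normal((d, d))  # frozen base weight (stand-in for a Llama-3 matrix)
A = rng.standard_normal((r, d))  # LoRA down-projection
B = rng.standard_normal((d, r))  # LoRA up-projection

# Full merged weight: same shape as W, so the adapter adds no inference cost.
W_merged = W + (alpha / r) * (B @ A)
```

After merging, the model is a plain dense checkpoint again, which is why the full Meta-Llama-3-8B weights are required to reconstruct it.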

For further details (performance, usage, etc.), please refer to the GitHub project page: https://github.com/ymcui/Chinese-LLaMA-Alpaca-3

## Others