---
language:
  - ko
datasets: DopeorNope/OpenOrca-near-dedup-v1
license: cc-by-nc-sa-4.0
---

This model was developed by the LLM research consortium of (주)미디어그룹사람과숲 and (주)마커.
The license is cc-by-nc-sa-4.0.

## Model Details

**Model Developers** Seungyoo Lee (DopeorNope)

**Input** Models input text only.

**Output** Models generate text only.

**Model Architecture**

pub-llama-13b-v5 is an auto-regressive language model based on the LLaMA2 transformer architecture.
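
A minimal loading sketch with the 🤗 Transformers library is shown below; the repository id `DopeorNope/pub-llama-13B-v5` is an assumption based on the model name and may differ from the actual Hub path.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed Hub repo id; adjust to the actual path of this model.
model_id = "DopeorNope/pub-llama-13B-v5"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Korean prompt, since the model card lists `ko` as the language.
prompt = "한국어로 간단히 자기소개를 해주세요."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```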

**Repo Link**

GitHub: pub-llama 📑

**Training Dataset**

The DopeorNope/OpenOrca-near-dedup-v1 dataset was built with a near-deduplication algorithm to reduce similarity between samples. It will be released publicly soon.
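
For context, near-deduplication measures textual overlap between samples and drops those that exceed a similarity threshold. The sketch below illustrates the general idea using Jaccard similarity over word shingles; it is an illustrative assumption, not the exact pipeline used to build this dataset (production pipelines usually use MinHash/LSH to scale beyond pairwise comparison).

```python
# Illustrative near-dedup sketch (not the dataset's actual pipeline):
# keep a sample only if its shingle set is not too similar to any kept sample.

def shingles(text: str, n: int = 5) -> set[tuple[str, ...]]:
    """Word n-gram shingles of a text."""
    tokens = text.split()
    return {tuple(tokens[i:i + n]) for i in range(max(len(tokens) - n + 1, 1))}

def jaccard(a: set, b: set) -> float:
    """Jaccard similarity of two shingle sets."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

def near_dedup(samples: list[str], threshold: float = 0.8) -> list[str]:
    """Greedy O(n^2) near-deduplication; fine for illustration only."""
    kept: list[str] = []
    kept_shingles: list[set] = []
    for text in samples:
        s = shingles(text)
        if all(jaccard(s, ks) < threshold for ks in kept_shingles):
            kept.append(text)
            kept_shingles.append(s)
    return kept
```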