---
library_name: transformers
datasets:
- kyujinpy/OpenOrca-KO
pipeline_tag: text-generation
---
# Llama-3-Ko-OpenOrca
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
Original model: [beomi/Llama-3-Open-Ko-8B](https://huggingface.co/beomi/Llama-3-Open-Ko-8B)
Dataset: [kyujinpy/OpenOrca-KO](https://huggingface.co/datasets/kyujinpy/OpenOrca-KO)
### Training details
Training: Fine-tuned for 4 epochs with LoRA on an 8-bit base model, using Axolotl.
- sequence_len: 4096
- bf16
Training time: 6 hours on 2× A6000 GPUs.
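
The setup above could be expressed as an Axolotl config roughly like the following sketch. Only the values stated in this card (base model, dataset, LoRA on an 8-bit base, `sequence_len: 4096`, 4 epochs, bf16) come from the training details; everything else (LoRA rank/alpha/dropout, dataset format) is an assumption, not the configuration actually used:

```yaml
# Hypothetical Axolotl config reconstructing the stated setup.
# Values marked "assumed" are illustrative and were not given in this card.
base_model: beomi/Llama-3-Open-Ko-8B
load_in_8bit: true
adapter: lora
lora_r: 16            # assumed
lora_alpha: 32        # assumed
lora_dropout: 0.05    # assumed
lora_target_linear: true  # assumed

datasets:
  - path: kyujinpy/OpenOrca-KO
    type: alpaca      # assumed dataset format

sequence_len: 4096
num_epochs: 4
bf16: true
```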
### License
[https://llama.meta.com/llama3/license](https://llama.meta.com/llama3/license)