---
datasets:
- Yukang/LongAlpaca-16k-length
---
This repo is a clone of [mattshumer/Llama-3-8B-16K](https://huggingface.co/mattshumer/Llama-3-8B-16K).

This is an extended-context (16K) version of Llama 3, trained for five hours on 8x A6000 GPUs using the `Yukang/LongAlpaca-16k-length` dataset. `rope_theta` was set to `1000000.0`. Trained with Axolotl.
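To illustrate what raising `rope_theta` does, here is a minimal sketch of the rotary-embedding inverse frequencies, comparing the Llama 3 default base (`500000.0`) with the `1000000.0` used here. The `head_dim` of 128 is assumed from Llama 3 8B's architecture (4096 hidden size / 32 heads); this is an explanatory sketch, not the training code.

```python
def rope_inv_freq(theta: float, head_dim: int) -> list[float]:
    # Rotary position embedding inverse frequencies:
    # inv_freq[i] = theta ** (-2i / head_dim), for i in [0, head_dim/2)
    return [theta ** (-(2 * i) / head_dim) for i in range(head_dim // 2)]

base = rope_inv_freq(500000.0, 128)    # Llama 3 default rope_theta (assumed)
long = rope_inv_freq(1000000.0, 128)   # value used by this repo

# A larger theta lowers every rotation frequency, stretching the
# positional wavelengths so attention can cover longer contexts.
assert all(l <= b for l, b in zip(long, base))
```

The first frequency is always 1.0 regardless of theta; the gap grows toward the high-index (low-frequency) dimensions, which is where long-range position information lives.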