---
base_model: unsloth/llama-3-8b-bnb-4bit
language:
- en
license: apache-2.0
tags:
- text-generation-inference
- transformers
- unsloth
- llama
- trl
---
Here are the LoRA adapters produced by training on my text data with Unsloth's training code for 1,000 steps. The base model is unsloth/llama-3-8b-bnb-4bit. If you would like to see the modifications I made to Unsloth's script to make it more concise and adaptable to my own data format, you can find the modified script here.
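As a quick usage sketch (not verified against this exact repository), the adapters can be loaded for inference with Unsloth's `FastLanguageModel`. The repository ID below is a placeholder for this adapter repo, and `max_seq_length` is an assumed value:

```python
from unsloth import FastLanguageModel

# Load the 4-bit base model together with the LoRA adapters.
# "chrismontes/<this-adapter-repo>" is a placeholder -- replace it with the
# actual repo ID of this adapter; max_seq_length is an assumed value.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="chrismontes/<this-adapter-repo>",
    max_seq_length=2048,
    dtype=None,          # auto-detect (bfloat16 on newer GPUs)
    load_in_4bit=True,   # match the bnb-4bit base model
)

# Switch Unsloth into its faster inference mode.
FastLanguageModel.for_inference(model)

inputs = tokenizer("Tell me about dogs.", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Alternatively, the adapters can be applied on top of the base model with the standard `peft.PeftModel.from_pretrained` call.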
# Uploaded model
- Developed by: chrismontes
- License: apache-2.0
- Finetuned from model: unsloth/llama-3-8b-bnb-4bit
This Llama model was trained 2x faster with Unsloth and Hugging Face's TRL library.