---
base_model: unsloth/qwen2.5-7b-instruct-unsloth-bnb-4bit
tags:
- text-generation-inference
- transformers
- unsloth
- qwen2
- trl
license: apache-2.0
language:
- en
datasets:
- mlfoundations-dev/s1K-with-deepseek-r1-sharegpt
- Nitral-AI/Cosmopedia-Instruct-60k-Distilled-R1-70B-ShareGPT
---
Training data was loaded from the two datasets listed in the metadata:

```python
from datasets import load_dataset

# Reasoning-style ShareGPT datasets: s1K with DeepSeek-R1 traces, and
# Cosmopedia-Instruct distilled from an R1 70B model.
dataset = load_dataset("mlfoundations-dev/s1K-with-deepseek-r1-sharegpt", split="train")
dataset2 = load_dataset("Nitral-AI/Cosmopedia-Instruct-60k-Distilled-R1-70B-ShareGPT", split="train")
```
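The card does not state how the two sets were merged; a minimal sketch, assuming both splits share the same ShareGPT schema and are simply concatenated:

```python
from datasets import concatenate_datasets

# Assumption: both datasets use the same "conversations" columns, so they can be
# concatenated into a single shuffled training split.
combined = concatenate_datasets([dataset, dataset2]).shuffle(seed=42)
print(combined)
```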
# Uploaded model
- Developed by: bunnycore
- License: apache-2.0
- Finetuned from model: unsloth/qwen2.5-7b-instruct-unsloth-bnb-4bit
This qwen2 model was trained 2x faster with Unsloth and Huggingface's TRL library.
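The exact training configuration is not published in this card. The sketch below shows what a QLoRA fine-tune of the 4-bit base model typically looks like with Unsloth and TRL's `SFTTrainer`; all hyperparameters (sequence length, LoRA rank/alpha, batch size, learning rate, epochs) and the `combined`/`text` names are illustrative assumptions, not the values used for this model.

```python
from unsloth import FastLanguageModel
from trl import SFTTrainer
from transformers import TrainingArguments

# Load the 4-bit quantized base model this finetune starts from.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/qwen2.5-7b-instruct-unsloth-bnb-4bit",
    max_seq_length=4096,   # assumed context length
    load_in_4bit=True,
)

# Attach LoRA adapters; rank and alpha are placeholder values.
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
)

# Supervised fine-tuning on the combined dataset from the loading snippet above.
trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=combined,        # assumes the concatenated dataset sketched earlier
    dataset_text_field="text",     # assumes conversations rendered into a "text" column
    max_seq_length=4096,
    args=TrainingArguments(
        per_device_train_batch_size=2,
        gradient_accumulation_steps=4,
        learning_rate=2e-4,
        num_train_epochs=1,
        output_dir="outputs",
    ),
)
trainer.train()
```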