---
language:
- en
license: apache-2.0
tags:
- text-generation-inference
- transformers
- unsloth
- mistral
- trl
base_model: unsloth/mistral-7b-bnb-4bit
---

# Uploaded model

- **Developed by:** priamai
- **License:** apache-2.0
- **Finetuned from model:** unsloth/mistral-7b-bnb-4bit

This Mistral model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Hugging Face's TRL library.

## Training details

- Num GPUs = 1
- Num Epochs = 2
- Batch size per device = 2
- Gradient accumulation steps = 2
- Total batch size = 8
- Total steps = 2
- Number of trainable parameters = 41,943,040
- Total samples = 800

Source of reports: [ORKL](https://orkl.eu/)
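
The trainable-parameter count above is consistent with rank-16 LoRA adapters applied to all seven projection matrices of Mistral-7B's decoder layers (the targets Unsloth commonly uses). This is a sketch of that arithmetic, assuming standard Mistral-7B shapes; the rank and target list are inferences, not stated in this card:

```python
# LoRA adds r * (d_in + d_out) parameters per adapted weight matrix
# (a r x d_in matrix A plus a d_out x r matrix B).
hidden, kv_dim, mlp_dim, layers, r = 4096, 1024, 14336, 32, 16

# (d_in, d_out) of the projection matrices in one Mistral-7B decoder layer.
targets = [
    (hidden, hidden),   # q_proj
    (hidden, kv_dim),   # k_proj
    (hidden, kv_dim),   # v_proj
    (hidden, hidden),   # o_proj
    (hidden, mlp_dim),  # gate_proj
    (hidden, mlp_dim),  # up_proj
    (mlp_dim, hidden),  # down_proj
]

trainable = layers * sum(r * (d_in + d_out) for d_in, d_out in targets)
print(trainable)  # 41943040 -- matches the reported count
```

The match suggests every linear projection was adapted, not just the attention matrices.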