---
language:
- en
license: apache-2.0
tags:
- text-generation-inference
- transformers
- unsloth
- mistral
- trl
- sft
base_model: LeroyDyer/Mixtral_AI_MiniTron
---

These little ones are easy to train for a task!

They already have some training (not great), but they can take more and more.

(And being Mistral-based, they can take LoRA modules! A sketch of attaching a fresh adapter is below.)
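A minimal sketch of attaching a fresh LoRA adapter to this model for task training, using `peft` and `transformers`. The rank, alpha, and target modules below are illustrative placeholders (standard Mistral projection layers), not the settings used to train this checkpoint:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

model_id = "LeroyDyer/Mixtral_AI_MiniTron"
model = AutoModelForCausalLM.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Example LoRA config targeting the usual Mistral attention/MLP projections
lora_config = LoraConfig(
    r=16,
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
    lora_dropout=0.0,
    task_type="CAUSAL_LM",
)

model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only the adapter weights are trainable
```

From here you can train the adapter on your task data with any trainer (e.g. TRL's `SFTTrainer`, as in the merge sketch further down).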


Remember to add training on top of the LoRA you merge with it: load the LoRA and train a few cycles on the same data that was used to build the LoRA (e.g. 20 steps).

Check that it took hold, then merge it!
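A minimal sketch of that "warm up the LoRA, then merge" recipe, assuming recent `peft` and `trl` versions. The adapter path and dataset file are placeholders; swap in the LoRA you intend to merge and the data it was originally trained on:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel
from trl import SFTConfig, SFTTrainer
from datasets import load_dataset

base_id = "LeroyDyer/Mixtral_AI_MiniTron"
base = AutoModelForCausalLM.from_pretrained(base_id)
tokenizer = AutoTokenizer.from_pretrained(base_id)

# Load the existing LoRA adapter with its weights marked trainable
model = PeftModel.from_pretrained(base, "path/to/existing-lora", is_trainable=True)

# The same data the LoRA was originally trained on (placeholder file with a "text" field)
dataset = load_dataset("json", data_files="lora_data.json", split="train")

# Train for a handful of steps so the adapter "takes hold" on this base
trainer = SFTTrainer(
    model=model,
    train_dataset=dataset,
    args=SFTConfig(output_dir="outputs", max_steps=20, per_device_train_batch_size=2),
)
trainer.train()

# Spot-check generations here; if the behaviour took hold, merge and save
merged = model.merge_and_unload()
merged.save_pretrained("Mixtral_AI_MiniTron_merged")
tokenizer.save_pretrained("Mixtral_AI_MiniTron_merged")
```

The short refresher pass before `merge_and_unload()` is what the note above is pointing at: merging a stale adapter directly can wash out its behaviour, while a few steps on the original data re-aligns it to this base first.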





# Uploaded  model

- **Developed by:** LeroyDyer
- **License:** apache-2.0
- **Finetuned from model :** LeroyDyer/Mixtral_AI_MiniTron

This mistral model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library.

[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)