---
language:
- en
license: apache-2.0
tags:
- text-generation-inference
- transformers
- unsloth
- mistral
- trl
base_model: LeroyDyer/Mixtral_AI_MiniTron_SFT
---

This model is still in training and needs more data and time. Roughly one hour of training on Dolphin Coder / White Rabbit data (and Orca/Dolphin more generally) is added each day until those datasets are fully overfit, before other datasets are applied.

The model is getting better and remains under active training. Because training is continuous, the base model becomes the updated version: this model is always changing, and if a problem occurs it can be reverted to its previous base version.

# Uploaded model

- **Developed by:** LeroyDyer
- **License:** apache-2.0
- **Finetuned from model:** Mixtral_AI_MiniTron

This Mistral model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Hugging Face's TRL library.
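
As a quick way to try the checkpoint, here is a minimal inference sketch using the Hugging Face `transformers` library. The repository id below is a placeholder taken from the `base_model` metadata; replace it with this model's own repo id.

```python
# Minimal inference sketch, assuming `transformers` (and `accelerate` for
# device_map="auto") are installed.
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id (the base model from the metadata); swap in this model's id.
MODEL_ID = "LeroyDyer/Mixtral_AI_MiniTron_SFT"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

prompt = "Write a short Python function that reverses a string."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Generate a short completion and print it.
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```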