---
language:
  - en
license: apache-2.0
tags:
  - text-generation-inference
  - transformers
  - unsloth
  - mistral
  - trl
base_model: LeroyDyer/Mixtral_AI_CyberBrain_3.0
---

Interchangeable!

Uploaded model

This is my go-to LoRA. It went through quite a large training cycle, so when applied to merged Mistral models it helps realign them. I usually apply this LoRA to each model pre-merge, so the merged outcome also carries it, and then give the result some further training on Dolphin datasets, hopefully staying aligned to that data. In this way, multi-layered Dolphin data is injected into the host models to give them a warm start. A sketch of the apply-and-merge step follows below. I also apply it to smaller layered models, as it generally has some effect, especially on raw base models; something in there is working!
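For illustration, here is a minimal sketch of that pre-merge step, assuming this repo hosts a standard PEFT LoRA adapter. The adapter repo id below is a placeholder, not the exact published name:

```python
# Sketch: attach the LoRA to a base Mistral model and fold it into the weights,
# so the resulting checkpoint can then be merged with other Mistral models.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "LeroyDyer/Mixtral_AI_CyberBrain_3.0"   # base model named in this card
adapter_id = "LeroyDyer/<this-lora-adapter>"      # placeholder: this adapter's repo id

base = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto")
tokenizer = AutoTokenizer.from_pretrained(base_id)

# Apply the LoRA, then merge its weights into the base model.
model = PeftModel.from_pretrained(base, adapter_id)
model = model.merge_and_unload()

model.save_pretrained("mistral_with_lora")
tokenizer.save_pretrained("mistral_with_lora")
```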

  • Developed by: LeroyDyer
  • License: apache-2.0
  • Finetuned from model: LeroyDyer/Mixtral_AI_CyberBrain_3.0

This Mistral model was trained 2x faster with Unsloth and Hugging Face's TRL library.
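A minimal sketch of the continued Dolphin-style fine-tuning step with Unsloth and TRL, as mentioned above. This is not the exact training recipe; the checkpoint path, dataset id, text field, and hyperparameters are placeholders to adapt:

```python
# Sketch: continued fine-tuning of a pre-merged checkpoint on a Dolphin-style
# dataset using Unsloth + TRL (older SFTTrainer signature; newer TRL versions
# move dataset_text_field/max_seq_length into SFTConfig).
from unsloth import FastLanguageModel
from trl import SFTTrainer
from transformers import TrainingArguments
from datasets import load_dataset

model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="mistral_with_lora",   # e.g. a checkpoint already carrying this LoRA
    max_seq_length=2048,
    load_in_4bit=True,
)
model = FastLanguageModel.get_peft_model(
    model, r=16, lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
)

# Placeholder dataset id; point this at the Dolphin data you use.
dataset = load_dataset("<dolphin-style-dataset>", split="train")

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    dataset_text_field="text",        # adjust to the dataset's actual text column
    max_seq_length=2048,
    args=TrainingArguments(
        output_dir="outputs",
        per_device_train_batch_size=2,
        num_train_epochs=1,
        fp16=True,
    ),
)
trainer.train()
```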