
Genetic Merge : the merging of the previous models - the art form.

Genetic Merges : the first to be merged genetically are a Y-Gene and an X-Gene (personality and functionality). These are created using the TIES merge method, enabling the retention of skills in a base model. The same base model is chosen for both genes, which will later be merged to form the final model.
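As a rough illustration of the TIES idea (trim each fine-tuned model's parameter deltas, elect a majority sign per parameter, then average only the deltas agreeing with that sign), here is a toy pure-Python sketch. The parameter vectors are invented stand-ins for real weight tensors; this is not the mergekit implementation:

```python
# Toy sketch of TIES merging: trim, elect sign, disjoint merge.
# Operates on plain lists of floats standing in for model weights.

def ties_merge(base, finetuned_models, density=0.5):
    """Merge several fine-tuned parameter vectors back into `base`.

    base: list of floats (shared base-model parameters)
    finetuned_models: list of parameter lists, same length as base
    density: fraction of largest-magnitude deltas kept per model
    """
    n = len(base)
    # 1. Trim: keep only the top-`density` fraction of each model's deltas.
    trimmed = []
    for params in finetuned_models:
        deltas = [p - b for p, b in zip(params, base)]
        k = max(1, int(density * n))
        threshold = sorted((abs(d) for d in deltas), reverse=True)[k - 1]
        trimmed.append([d if abs(d) >= threshold else 0.0 for d in deltas])

    merged = []
    for i in range(n):
        column = [t[i] for t in trimmed]
        # 2. Elect sign: majority sign by total delta mass at this parameter.
        sign = 1.0 if sum(column) >= 0 else -1.0
        # 3. Disjoint merge: average only deltas agreeing with the elected sign.
        agreeing = [d for d in column if d * sign > 0]
        delta = sum(agreeing) / len(agreeing) if agreeing else 0.0
        merged.append(base[i] + delta)
    return merged

base = [0.0, 1.0, -1.0, 0.5]
model_a = [0.2, 1.5, -1.1, 0.5]   # hypothetical fine-tune A
model_b = [-0.3, 1.4, -0.9, 0.6]  # hypothetical fine-tune B
print(ties_merge(base, [model_a, model_b], density=0.5))
```

Because conflicting-sign deltas are dropped rather than averaged, skills from different fine-tunes interfere less than in a plain average, which is why TIES suits "retaining skills in a base model".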

Here we have chosen :

** LeroyDyer/Mixtral_AI_DeepMind

** LeroyDyer/Mixtral_AI_CyberUltron_DPO

** LeroyDyer/Mixtral_AI_DeepMedicalMind

** LeroyDyer/Mixtral_AI_Samantha

These form the beginning of the mind models, which have been trained to perform various functions by generating agents, as well as to display the thoughts of the model or the events leading to the discovery of the answer. The models intended for role play have also been included; these models were trained to be counsellors, personal friends, even virtual lovers. The ability to have some form of character has been embedded, as well as question asking, i.e. asking the user about themselves or their hobbies, or just being a chat friend and saying hello etc. Hence these models are a vital piece to be handed down to other models.

The X-Gene : (medical tasks / agent management)

** LeroyDyer/Mixtral_AI_Chat_2.0

** LeroyDyer/Mixtral_BioMedical

** LeroyDyer/Mixtral_AI_Medic

** LeroyDyer/Mixtral_Cyber_BioMedic

** LeroyDyer/Mixtral_AI_DeepMedicalMind

These form the role of medical helper: many medical datasets have been installed, covering question answering, entity recognition, and sources of information regarding important topics. Various agents have also been installed inside the model, allowing it to generate specialist assistants to perform medical roles.

The variant : competition benchmarker. The variant retains the previous training from the competition benchmarking datasets. We are not sure how much is lost or gained by these datasets, or how much is missing from the model after merging, so this variant is created to allow for mass training to find some balance in the model. Again, by merging this model, we can use the training datasets from these models to realign the merged model.

The final creation : the model :

By merging with a linear merge, these models will average into a single model of the three; hence there is no base model. A new model with new characteristics will be born!

the X-Gene :

the Y-Gene :

the variant model :

We can have a Frankenstein of a model which theoretically should contain the submerged characteristics of all the models.
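The linear merge step above can be sketched as a weighted average over parameter vectors. The `x_gene`, `y_gene`, and `variant` lists below are hypothetical stand-ins for the real checkpoints (actual merges operate on full weight tensors, e.g. via mergekit); with equal weights, no single input is privileged as the base:

```python
# Toy sketch of a linear (weighted-average) merge: with equal weights no
# model acts as the base - the result is the mean of all three inputs.

def linear_merge(models, weights=None):
    """Average several equally-shaped parameter vectors."""
    if weights is None:
        weights = [1.0 / len(models)] * len(models)
    n = len(models[0])
    return [sum(w * m[i] for w, m in zip(weights, models)) for i in range(n)]

x_gene = [1.0, 2.0, 3.0]    # hypothetical X-Gene parameters
y_gene = [3.0, 2.0, 1.0]    # hypothetical Y-Gene parameters
variant = [2.0, 2.0, 2.0]   # hypothetical variant parameters

merged = linear_merge([x_gene, y_gene, variant])
print(merged)
```

Unequal weights can tilt the average toward one gene if its traits should dominate the final model.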

But still, these will be realigned with their respective datasets - perhaps 20-30 steps max!

The graphs were 2982 - hence very dense merges.

This will be the next step in the evolutionary process for the DeepMind series.

In this merge it was decided NOT to use Mixtral_AI_CyberUltron_Ultra, as these models were created from the CyberTron_Ultra base - a fully trained knowledge base and instruct model! It is a very fast learner and a great point to begin any project! When training there is no requirement to use many epochs, as it accepts the new task quickly; the only worry is overfitting the new task, as the model converges super quickly.

This model is expected to be a high performer, with many possible output types and formats.

I have tried to format the output as much as possible with various extra types of information, such as word associations, entity lists, AST trees, etc.

For our Y_Gene -

LeroyDyer/Mixtral_AI_CyberTron_DeepMind - 7.24B params, FP16 (Safetensors)