---
base_model:
- LeroyDyer/SpydazWeb_AI_HumanAI_012_INSTRUCT_IA
- LeroyDyer/SpydazWeb_AI_HumanAI_012_INSTRUCT_XA
- LeroyDyer/SpydazWeb_AI_HumanAGI_001_GA
- LeroyDyer/LCARS_TOP_SCORE
library_name: transformers
tags:
- mergekit
- merge
license: apache-2.0
---
# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details
### Merge Method

This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with [LeroyDyer/SpydazWeb_AI_HumanAI_012_INSTRUCT_IA](https://huggingface.co/LeroyDyer/SpydazWeb_AI_HumanAI_012_INSTRUCT_IA) as the base model.

### Models Merged

The following models were included in the merge:
* [LeroyDyer/SpydazWeb_AI_HumanAI_012_INSTRUCT_XA](https://huggingface.co/LeroyDyer/SpydazWeb_AI_HumanAI_012_INSTRUCT_XA)
* [LeroyDyer/SpydazWeb_AI_HumanAGI_001_GA](https://huggingface.co/LeroyDyer/SpydazWeb_AI_HumanAGI_001_GA)
* [LeroyDyer/LCARS_TOP_SCORE](https://huggingface.co/LeroyDyer/LCARS_TOP_SCORE)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: LeroyDyer/LCARS_TOP_SCORE
    parameters:
      weight: 0.564
  - model: LeroyDyer/SpydazWeb_AI_HumanAI_012_INSTRUCT_XA
    parameters:
      density: 0.256
      weight: [0.128, 0.128, 0.128, 0.128] # weight gradient
  - model: LeroyDyer/SpydazWeb_AI_HumanAGI_001_GA
    parameters:
      density: 0.768
      weight:
        - filter: mlp
          value: 0.768
        - value: 0.512
merge_method: ties
base_model: LeroyDyer/SpydazWeb_AI_HumanAI_012_INSTRUCT_IA
parameters:
  normalize: true
  int8_mask: true
dtype: float16
```
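
### Usage

As a quick usage sketch (not part of the original card), the merged checkpoint can be loaded like any other causal language model via the `transformers` library. The repository id below is a hypothetical placeholder; replace it with the actual Hugging Face id under which this merge is published.

```python
# Minimal usage sketch. Assumptions: the merged model is published as a
# standard causal-LM checkpoint, and "LeroyDyer/<this-merged-model>" is a
# placeholder repo id to be replaced with the real one.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "LeroyDyer/<this-merged-model>"  # hypothetical placeholder

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.float16,  # matches the dtype used in the merge config
    device_map="auto",
)

prompt = "Explain the TIES merge method in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```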