---
base_model:
- athirdpath/Eileithyia-13B
- FPHam/Sydney_Overthinker_13b_HF
tags:
- mergekit
- merge
license: openrail
language:
- en
library_name: transformers
pipeline_tag: text-generation
---

![Intro](screen.png)

# merged

This model is crazy bad; do not expect anything from it.

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the SLERP merge method.

### Models Merged

The following models were included in the merge:
* [athirdpath/Eileithyia-13B](https://huggingface.co/athirdpath/Eileithyia-13B)
* [FPHam/Sydney_Overthinker_13b_HF](https://huggingface.co/FPHam/Sydney_Overthinker_13b_HF)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: athirdpath/Eileithyia-13B
dtype: float16
merge_method: slerp
parameters:
  t:
  - filter: self_attn
    value: [0.22, 0.61, 0.46, 0.77, 1.0]
  - filter: mlp
    value: [0.78, 0.39, 0.54, 0.23, 0.0]
  - value: 0.5
slices:
- sources:
  - layer_range: [0, 32]
    model: FPHam/Sydney_Overthinker_13b_HF
```
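
Since the card declares `library_name: transformers` and `pipeline_tag: text-generation`, here is a minimal inference sketch. The repo id below is a placeholder (swap in the actual location of this merge), and the generation settings are illustrative assumptions, not recommendations from the author.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id; replace with the actual Hugging Face repo for this merge.
model_id = "your-username/merged"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # the merge itself was produced in float16
    device_map="auto",
)

prompt = "Tell me something unexpected."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=128,
    do_sample=True,
    temperature=0.8,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```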