---
base_model:
- prithivMLmods/Phi-4-Math-IO
- ngxson/LoRA-phi-4-abliterated
- prithivMLmods/Phi-4-o1
- ngxson/LoRA-phi-4-abliterated
- prithivMLmods/Phi-4-QwQ
- ngxson/LoRA-phi-4-abliterated
- prithivMLmods/Phi-4-Empathetic
- ngxson/LoRA-phi-4-abliterated
library_name: transformers
tags:
- mergekit
- merge
---
# merge
This is a merge of pre-trained language models created using mergekit.
## Merge Details

### Merge Method
This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with prithivMLmods/Phi-4-Empathetic + ngxson/LoRA-phi-4-abliterated as the base.
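TIES works on each model's parameter *deltas* from the base: it trims each delta to its highest-magnitude entries (the `density` fraction), elects a majority sign per parameter, and averages only the deltas that agree with that sign. A minimal NumPy sketch of the idea on toy tensors (an illustration only, not mergekit's implementation; `ties_merge` and the example values are made up here):

```python
import numpy as np

def ties_merge(base, tuned, densities, weights):
    """Toy TIES: trim, elect sign, disjoint-merge agreeing deltas."""
    deltas = []
    for t, d, w in zip(tuned, densities, weights):
        delta = t - base
        # Trim: keep only the top-`d` fraction of entries by magnitude.
        k = int(np.ceil(d * delta.size))
        thresh = np.sort(np.abs(delta).ravel())[-k]
        trimmed = np.where(np.abs(delta) >= thresh, delta, 0.0)
        deltas.append(w * trimmed)
    stacked = np.stack(deltas)
    # Elect sign: the sign of the summed (magnitude-weighted) deltas.
    sign = np.sign(stacked.sum(axis=0))
    # Disjoint merge: average only nonzero deltas matching the elected sign.
    agree = (np.sign(stacked) == sign) & (stacked != 0)
    num = (stacked * agree).sum(axis=0)
    den = np.maximum(agree.sum(axis=0), 1)
    return base + num / den

base = np.zeros(4)
tuned = [np.array([1.0, -2.0, 0.1, 0.0]),
         np.array([1.0, 3.0, 0.0, 0.2])]
merged = ties_merge(base, tuned, densities=[0.5, 0.5], weights=[1.0, 1.0])
# merged is [1., 3., 0., 0.]: the conflicting -2/+3 resolves to +3,
# and the small 0.1 / 0.2 entries are trimmed away.
```

In the config below, `density` controls the trim fraction and `weight` scales each model's contribution before the sign election.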
### Models Merged
The following models were included in the merge:
- prithivMLmods/Phi-4-Math-IO + ngxson/LoRA-phi-4-abliterated
- prithivMLmods/Phi-4-o1 + ngxson/LoRA-phi-4-abliterated
- prithivMLmods/Phi-4-QwQ + ngxson/LoRA-phi-4-abliterated
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: prithivMLmods/Phi-4-Empathetic+ngxson/LoRA-phi-4-abliterated
  - model: prithivMLmods/Phi-4-o1+ngxson/LoRA-phi-4-abliterated
    parameters:
      density: 0.75
      weight: 0.75
  - model: prithivMLmods/Phi-4-QwQ+ngxson/LoRA-phi-4-abliterated
    parameters:
      density: 0.50
      weight: 0.50
  - model: prithivMLmods/Phi-4-Math-IO+ngxson/LoRA-phi-4-abliterated
    parameters:
      density: 0.30
      weight: 0.30
merge_method: ties
base_model: prithivMLmods/Phi-4-Empathetic+ngxson/LoRA-phi-4-abliterated
parameters:
  normalize: true
dtype: float32
```