---
base_model:
- prithivMLmods/Calme-Ties2-78B
- MaziyarPanahi/calme-2.4-rys-78b
library_name: transformers
tags:
- mergekit
- merge
---

# **Calme-Ties3-78B**

This model is the result of merging pre-trained language models with the TIES merge method, using prithivMLmods/Calme-Ties2-78B as the base. The merge incorporates MaziyarPanahi/calme-2.4-rys-78b, with both models contributing equally in weight and density. The configuration enables normalization and int8 masking, and uses the bfloat16 data type.

# **Merge**

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

# **Merge Method**

This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with [prithivMLmods/Calme-Ties2-78B](https://huggingface.co/prithivMLmods/Calme-Ties2-78B) as the base.

# **Models Merged**

The following model was included in the merge:

* [MaziyarPanahi/calme-2.4-rys-78b](https://huggingface.co/MaziyarPanahi/calme-2.4-rys-78b)

# **Configuration**

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: MaziyarPanahi/calme-2.4-rys-78b
    parameters:
      weight: 1
      density: 1
merge_method: ties
base_model: prithivMLmods/Calme-Ties2-78B
parameters:
  weight: 1
  density: 1
normalize: true
int8_mask: true
dtype: bfloat16
```
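For intuition, the TIES procedure referenced above (trim low-magnitude deltas, elect a majority sign, then merge only the deltas that agree with it) can be sketched on flat parameter vectors. This is an illustrative toy implementation, not mergekit's actual code; the function name and the use of NumPy are assumptions made for the example.

```python
import numpy as np

def ties_merge(base, task_models, density=1.0, weights=None):
    """Toy TIES merge over flat parameter vectors (illustrative sketch).

    base:        base model parameters, shape (n,)
    task_models: list of fine-tuned parameter vectors, each shape (n,)
    density:     fraction of each task vector's deltas to keep (by magnitude)
    weights:     per-model scaling factors (default: all 1.0)
    """
    if weights is None:
        weights = [1.0] * len(task_models)

    # Task vectors: difference between each fine-tuned model and the base.
    deltas = [tm - base for tm in task_models]

    # 1. Trim: keep only the top-`density` fraction of entries by magnitude.
    trimmed = []
    for d in deltas:
        k = max(int(np.ceil(density * d.size)), 1)
        thresh = np.sort(np.abs(d))[::-1][k - 1]
        trimmed.append(np.where(np.abs(d) >= thresh, d, 0.0))

    # 2. Elect sign: per-parameter sign of the weighted sum of trimmed deltas.
    stacked = np.stack([w * t for w, t in zip(weights, trimmed)])
    elected = np.sign(stacked.sum(axis=0))

    # 3. Disjoint merge: average only deltas agreeing with the elected sign.
    agree = (np.sign(stacked) == elected) & (stacked != 0)
    num = (stacked * agree).sum(axis=0)
    den = np.maximum(agree.sum(axis=0), 1)
    return base + num / den
```

With a single task model and `density: 1`, `weight: 1` (as in the configuration above), the merge keeps every delta and the result collapses to the task model itself; the method only becomes lossy with multiple models or a density below 1.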