---
base_model:
- Nohobby/L3.3-Prikol-70B-v0.1a
- Sao10K/70B-L3.3-Cirrus-x1
library_name: transformers
tags:
- mergekit
- merge
---
# prikol

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details
### Merge Method

This model was merged using the [Model Stock](https://arxiv.org/abs/2403.19522) merge method, with AbominationSnowPig as the base model.

### Models Merged

The following models were included in the merge:
* [Nohobby/L3.3-Prikol-70B-v0.1a](https://huggingface.co/Nohobby/L3.3-Prikol-70B-v0.1a)
* [Sao10K/70B-L3.3-Cirrus-x1](https://huggingface.co/Sao10K/70B-L3.3-Cirrus-x1)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: AbominationSnowPig
merge_method: model_stock
dtype: bfloat16
models:
  - model: Sao10K/70B-L3.3-Cirrus-x1
  - model: Nohobby/L3.3-Prikol-70B-v0.1a
```
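
## Usage

The merge itself can be reproduced by saving the YAML above to a file and running mergekit's `mergekit-yaml` command on it. The resulting weights load like any other Llama-3.3-based causal LM through `transformers`. The snippet below is a minimal sketch; the repository id `Nohobby/prikol-merge` is a placeholder, so substitute the actual published repo id or the local directory produced by the merge.

```python
# Minimal loading sketch. "Nohobby/prikol-merge" is a hypothetical repo id;
# replace it with the real model path or the local merge output directory.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Nohobby/prikol-merge"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype declared in the merge config
    device_map="auto",
)

prompt = "Hello"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```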