---
base_model:
  - SanjiWatsuki/Silicon-Maid-7B
  - Guilherme34/Samantha-v2
  - jan-hq/stealth-v1.3
  - mitultiwari/mistral-7B-instruct-dpo
  - senseable/WestLake-7B-v2
library_name: transformers
tags:
  - mergekit
  - merge
---

# Aika-7B

This is a merge of pre-trained language models created using mergekit.


This model was merged using the DARE TIES merge method, with mitultiwari/mistral-7B-instruct-dpo as the base model.
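To make the method concrete, here is a minimal, illustrative Python (PyTorch) sketch of the DARE step: each fine-tuned model's delta from the base is randomly sparsified to its configured density, rescaled to preserve the expected delta, and then added back with its merge weight. This is a simplified illustration only, not mergekit's implementation (it omits the TIES sign-election step); the tensors and names are made up, with densities and weights mirroring the configuration below.

```python
# Illustrative sketch of the DARE drop-and-rescale step (not mergekit's code).
import torch

def dare_delta(base: torch.Tensor, finetuned: torch.Tensor, density: float) -> torch.Tensor:
    """Keep roughly `density` of the delta's entries and rescale by 1/density."""
    delta = finetuned - base
    mask = torch.bernoulli(torch.full_like(delta, density))  # 1 = keep, 0 = drop
    return delta * mask / density

# Toy tensors; densities/weights mirror two entries of the merge config below.
base = torch.zeros(4)
experts = {
    "westlake":     (torch.tensor([0.4, -0.2, 0.1, 0.3]), 0.75, 0.30),
    "silicon_maid": (torch.tensor([0.1,  0.5, -0.3, 0.2]), 0.75, 0.30),
}

merged = base.clone()
for name, (finetuned, density, weight) in experts.items():
    merged += weight * dare_delta(base, finetuned, density)
print(merged)
```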

### Models Merged

The following models were included in the merge:

- senseable/WestLake-7B-v2
- SanjiWatsuki/Silicon-Maid-7B
- jan-hq/stealth-v1.3
- Guilherme34/Samantha-v2

### Configuration

The following YAML configuration was used to produce this model:


```yaml
base_model: mitultiwari/mistral-7B-instruct-dpo
dtype: bfloat16
merge_method: dare_ties
models:
- model: mitultiwari/mistral-7B-instruct-dpo
- model: senseable/WestLake-7B-v2
  parameters:
    density: 0.75
    weight: 0.30
- model: SanjiWatsuki/Silicon-Maid-7B
  parameters:
    density: 0.75
    weight: 0.30
- model: jan-hq/stealth-v1.3
  parameters:
    density: 0.85
    weight: 0.30
- model: Guilherme34/Samantha-v2
  parameters:
    density: 0.65
    weight: 0.10
parameters:
  int8_mask: true
```
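
A minimal usage sketch with 🤗 Transformers follows. It assumes the merged weights are published on the Hugging Face Hub under the repo id `sethuiyer/Aika-7B` (an assumption based on this card's location; adjust the id if needed) and uses a plain text prompt rather than a chat template.

```python
# Minimal usage sketch; the repo id below is an assumption, not confirmed by this card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "sethuiyer/Aika-7B"  # assumed Hub repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype used in the merge config
    device_map="auto",
)

prompt = "Hello, who are you?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```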