
# Eclectic-Maid-10B

This is a merge of pre-trained language models created using [mergekit](https://github.com/arcee-ai/mergekit).

## Merge Details

A reborn Maid model, built from many models across several merge stages. It seems to work well with the Alpaca Roleplay, ChatML, and Noromaid settings in SillyTavern.
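For reference, a ChatML prompt (one of the presets named above) is laid out as follows; the system and user text are placeholders:

```
<|im_start|>system
{system prompt}<|im_end|>
<|im_start|>user
{message}<|im_end|>
<|im_start|>assistant
```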

### Merge Method

This model was merged using the [DARE](https://arxiv.org/abs/2311.03099) [TIES](https://arxiv.org/abs/2306.01708) merge method, using Eclectic-Maid-10B-p1 as a base.
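DARE randomly drops a fraction of each model's parameter deltas and rescales the survivors, and TIES then resolves sign conflicts before summing; this is what the `weight` and `density` parameters in the configurations below control. A minimal single-tensor sketch of the idea (illustrative only, not mergekit's actual implementation; all names here are hypothetical):

```python
import torch

def dare_ties(base, finetuned, weights, densities):
    """Merge per-tensor deltas with DARE drop-and-rescale plus TIES sign election."""
    deltas = []
    for ft, w, d in zip(finetuned, weights, densities):
        delta = ft - base                      # task vector relative to the base
        keep = torch.rand_like(delta) < d      # DARE: randomly keep ~density of entries
        deltas.append(w * delta * keep / d)    # ...rescaling survivors by 1/density
    stacked = torch.stack(deltas)
    sign = torch.sign(stacked.sum(dim=0))      # TIES: elect a per-parameter sign
    agree = torch.sign(stacked) == sign        # keep only sign-agreeing contributions
    return base + (stacked * agree).sum(dim=0)

# Toy example with two "finetuned" tensors, mirroring the weight/density pairs below.
base = torch.zeros(8)
merged = dare_ties(base, [torch.randn(8), torch.randn(8)],
                   weights=[0.5, 0.5], densities=[0.5, 0.5])
print(merged)
```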

### Models Merged

The following models were included across the merge stages:

* mistralai/Mistral-7B-Instruct-v0.2
* xDAN-AI/xDAN-L1-Chat-RL-v1
* Undi95/BigL-7B
* SanjiWatsuki/Loyal-Toppy-Bruins-Maid-7B-DARE
* NeverSleep/Noromaid-7B-0.4-DPO
* NSFW_DPO_Noromaid-7B-v2
* localfultonextractor/Erosumika-7B-v2
* openchat_openchat-3.5-1210
* Nexusflow/Starling-LM-7B-beta

plus the intermediate merges (Maid-Reborn-7B, Maid-Reborn-10B-p1/p2, Eclectic-Maid-10B-p1/p2) produced by the configurations below.

### Configuration

The following YAML configurations were used to produce this model, one per merge stage:

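Each stage can be reproduced with mergekit's CLI, e.g. `mergekit-yaml config.yaml ./output-model-directory`.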
```yaml
models:
  - model: Eclectic-Maid-10B-p1
    # no parameters necessary for base model
  - model: Eclectic-Maid-10B-p2
    parameters:
      weight: 0.50
      density: 0.5
  - model: Eclectic-Maid-10B-p2
    parameters:
      weight: 0.50
      density: 0.5
merge_method: dare_ties
base_model: Eclectic-Maid-10B-p1
parameters:
  int8_mask: true
dtype: bfloat16
name: Eclectic Maid
```

```yaml
name: Maid-Reborn
models: 
  - model: mistralai/Mistral-7B-Instruct-v0.2
    # no parameters necessary for base model
  - model: xDAN-AI/xDAN-L1-Chat-RL-v1
    parameters:
      weight: 0.4
      density: 0.8
  - model: Undi95/BigL-7B
    parameters:
      weight: 0.3
      density: 0.8
  - model: SanjiWatsuki/Loyal-Toppy-Bruins-Maid-7B-DARE
    parameters:
      weight: 0.2
      density: 0.4
  - model: NeverSleep/Noromaid-7B-0.4-DPO
    parameters:
      weight: 0.2
      density: 0.4
  - model: NSFW_DPO_Noromaid-7B-v2
    parameters:
      weight: 0.2
      density: 0.4
merge_method: dare_ties
base_model: mistralai/Mistral-7B-Instruct-v0.2
parameters:
  int8_mask: true
dtype: bfloat16
```

```yaml
models:
  - model: Maid-Reborn-7B
    # no parameters necessary for base model
  - model: localfultonextractor/Erosumika-7B-v2
    parameters:
      weight: 0.4
      density: 0.5
  - model: Undi95/BigL-7B
    parameters:
      weight: 0.3
      density: 0.5
  - model: Maid-Reborn-7B
    parameters:
      weight: 0.2
      density: 0.4
  - model: openchat_openchat-3.5-1210
    parameters:
      weight: 0.2
      density: 0.3
  - model: Nexusflow/Starling-LM-7B-beta
    parameters:
      weight: 0.2
      density: 0.3
merge_method: dare_ties
base_model: Maid-Reborn-7B
parameters:
  int8_mask: true
dtype: bfloat16
```

```yaml
models:
  - model: Maid-Reborn-10B-p1
    # no parameters necessary for base model
  - model: Maid-Reborn-10B-p2
    parameters:
      weight: 0.30
      density: 0.5
  - model: Maid-Reborn-10B-p2
    parameters:
      weight: 0.30
      density: 0.5
merge_method: dare_ties
base_model: Maid-Reborn-10B-p1
parameters:
  int8_mask: true
dtype: bfloat16
name: Eclectic Maid
```

```yaml
models:
  - model: Maid-Reborn-10B-p2
    # no parameters necessary for base model
  - model: Maid-Reborn-10B-p1
    parameters:
      weight: 0.30
      density: 0.5
  - model: Maid-Reborn-10B-p1
    parameters:
      weight: 0.30
      density: 0.5
merge_method: dare_ties
base_model: Maid-Reborn-10B-p2
parameters:
  int8_mask: true
dtype: bfloat16
name: Eclectic Maid
```
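A minimal loading sketch with `transformers` (assuming the repo id `ND911/Eclectic-Maid-10B`; the ChatML prompt and generation settings are illustrative):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ND911/Eclectic-Maid-10B"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"  # device_map needs `accelerate`
)

prompt = "<|im_start|>user\nWrite a short greeting.<|im_end|>\n<|im_start|>assistant\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=128)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(out[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```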