---
base_model:
  - EVA-UNIT-01/EVA-Qwen2.5-7B-v0.1
  - allura-org/Teleut-7b
  - FourOhFour/Vapor_v2_7B
library_name: transformers
tags:
  - mergekit
  - merge
---

A merge using the brand-new NuSLERP method. Fresh out of the oven. Performance not guaranteed.

Uses the slightly-unstable EVA and two other finetunes I found. I also turned on both of the NuSLERP-exclusive mergekit options for fun.

Named after the nemesia, a temperate shrubby flower. I tried to pick a flower that sounded kind of like NuSLERP. It doesn't, but the name still has the *essence* of NuSLERP, I guess? (It doesn't.) Very pretty flower nonetheless.

# mergekit

This is a merge of pre-trained language models created using mergekit.

## Merge Details

### Merge Method

This model was merged using the NuSLERP merge method, with EVA-UNIT-01/EVA-Qwen2.5-7B-v0.1 as the base.
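For intuition: SLERP-style merging interpolates along the arc between weight tensors rather than the straight line between them, which tends to preserve the geometry of the weights better than plain averaging. The sketch below shows ordinary two-tensor SLERP in PyTorch; it is only an illustration of the idea, not mergekit's NuSLERP implementation, which generalizes it to weighted combinations of several models (and, when a base model is set, to their task vectors).

```python
import torch

def slerp(t: float, a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors (illustrative only)."""
    a_flat, b_flat = a.flatten().float(), b.flatten().float()
    a_unit = a_flat / (a_flat.norm() + eps)
    b_unit = b_flat / (b_flat.norm() + eps)
    # Angle between the two weight vectors.
    omega = torch.arccos(torch.clamp(torch.dot(a_unit, b_unit), -1.0, 1.0))
    sin_omega = torch.sin(omega)
    if sin_omega.abs() < eps:
        # Nearly parallel vectors: fall back to ordinary linear interpolation.
        out = (1 - t) * a_flat + t * b_flat
    else:
        out = (torch.sin((1 - t) * omega) / sin_omega) * a_flat \
            + (torch.sin(t * omega) / sin_omega) * b_flat
    return out.reshape(a.shape).to(a.dtype)
```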

### Models Merged

The following models were included in the merge:

* allura-org/Teleut-7b
* FourOhFour/Vapor_v2_7B

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: allura-org/Teleut-7b
    parameters:
      weight: 0.6
  - model: FourOhFour/Vapor_v2_7B
    parameters:
      weight: 0.2
  - model: EVA-UNIT-01/EVA-Qwen2.5-7B-v0.1
    parameters:
      weight: 1.0
merge_method: nuslerp
base_model: EVA-UNIT-01/EVA-Qwen2.5-7B-v0.1
parameters:
  normalize: true
  int8_mask: true
  nuslerp_flatten: false
  nuslerp_row_wise: true
dtype: float16
```
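
For reference, here is roughly how a config like this gets turned into an actual model. This is a sketch assuming mergekit's documented Python entry points (MergeConfiguration, MergeOptions, run_merge); the filename nemesia.yaml and the output path are illustrative, and the mergekit-yaml CLI is the simpler route if you just want to rerun the merge.

```python
# Sketch only: assumes mergekit's Python API (MergeConfiguration, MergeOptions,
# run_merge) as documented in the mergekit README; option names can vary between
# versions. CLI equivalent: mergekit-yaml nemesia.yaml ./merged --cuda
import torch
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

CONFIG_YML = "nemesia.yaml"  # the YAML above, saved to disk (name is illustrative)
OUTPUT_PATH = "./merged"     # where the merged checkpoint is written

with open(CONFIG_YML, "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path=OUTPUT_PATH,
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # run the merge on GPU if one is available
        copy_tokenizer=True,             # copy the tokenizer into the output folder
        lazy_unpickle=False,
        low_cpu_memory=False,
    ),
)
```

The output folder is a normal transformers checkpoint afterwards, so it loads with `AutoModelForCausalLM.from_pretrained("./merged")`.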