---
language:
  - en
license: apache-2.0
library_name: transformers
tags:
  - mergekit
  - merge
base_model:
  - arcee-ai/Virtuoso-Small
pipeline_tag: text-generation
---

# Lamarck-14B-v0.4-Qwenvergence

This is a merge of pre-trained language models created using [mergekit](https://github.com/arcee-ai/mergekit).

## Merge Details

### Merge Method

This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with [sometimesanotion/lamarck-14b-base](https://huggingface.co/sometimesanotion/lamarck-14b-base) as the base.
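For intuition, the TIES idea can be sketched in a few lines of NumPy: compute each model's delta from the base, trim each delta to its highest-magnitude fraction (`density`), elect a per-parameter sign from the summed magnitudes, then average only the deltas that agree with that sign. This is a toy illustration of the method, not mergekit's actual implementation; the function name and exact trimming rule here are illustrative.

```python
import numpy as np

def ties_merge(base, task_vectors, density=1.0, weights=None):
    """Toy TIES merge over flat parameter arrays.

    1. Trim: keep only the top-`density` fraction of each delta by magnitude.
    2. Elect sign: take the sign of the summed (weighted) trimmed deltas.
    3. Merge: average the deltas that agree with the elected sign.
    """
    deltas = [tv - base for tv in task_vectors]
    if weights is None:
        weights = [1.0] * len(deltas)

    trimmed = []
    for d, w in zip(deltas, weights):
        k = int(np.ceil(density * d.size))
        # magnitude threshold for the k-th largest entry (inf when k == 0)
        thresh = np.sort(np.abs(d).ravel())[-k] if k > 0 else np.inf
        trimmed.append(w * d * (np.abs(d) >= thresh))

    stacked = np.stack(trimmed)
    sign = np.sign(stacked.sum(axis=0))            # elected sign per parameter
    agree = (np.sign(stacked) == sign) & (sign != 0)
    num = (stacked * agree).sum(axis=0)
    den = np.maximum(agree.sum(axis=0), 1)         # avoid divide-by-zero
    return base + num / den
```

With `density: 1.00` and `weight: 1.00`, as in the configuration below, no trimming occurs and the merge reduces to a sign-filtered average of the two models' deltas from the base.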

### Models Merged

The following models were included in the merge:

* merges/Qwen2.5-14B-Qwenvergence-slerp
* [arcee-ai/Virtuoso-Small](https://huggingface.co/arcee-ai/Virtuoso-Small)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
name:                Lamarck-14B-v0.4-Qwenvergence
merge_method:        ties
base_model:          sometimesanotion/lamarck-14b-base
tokenizer_source:    base
parameters:
  density:           1.00
  weight:            1.00
  int8_mask:         true
  normalize:         true
  rescale:           false
models:
  - model:           merges/Qwen2.5-14B-Qwenvergence-slerp
    parameters:
      weight:        1.00
      density:       1.00
  - model:           arcee-ai/Virtuoso-Small
    parameters:
      weight:        1.00
      density:       1.00
```
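Assuming the YAML above is saved locally as `merge.yaml` (and that the local `merges/Qwen2.5-14B-Qwenvergence-slerp` intermediate is available), a merge like this is typically run through mergekit's `mergekit-yaml` entry point; the output directory name below is just an example:

```shell
pip install mergekit
# write the merged model to ./Lamarck-14B-v0.4-Qwenvergence, using GPU if available
mergekit-yaml merge.yaml ./Lamarck-14B-v0.4-Qwenvergence --cuda
```

Note that `tokenizer_source: base` tells mergekit to copy the tokenizer from `sometimesanotion/lamarck-14b-base` rather than from one of the merged models.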