---
base_model:
  - mistralai/Mistral-7B-Instruct-v0.2
  - NousResearch/Hermes-2-Pro-Mistral-7B
library_name: transformers
tags:
  - mergekit
  - merge
license: mit
language:
  - en
metrics:
  - accuracy
  - code_eval
  - bleu
  - brier_score
---

# Mixtral_BaseModel-7B-BBase

This is a merge of pre-trained language models created using mergekit.

## Merge Details

### Merge Method

This model was merged using the linear merge method.
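For intuition, a linear merge computes each parameter of the output model as a weighted average of the corresponding parameters of the source models. The sketch below illustrates the idea in plain PyTorch, using the weights from the configuration shown later in this card; it is a simplified illustration, not mergekit's implementation, and it loads both full models into memory.

```python
# Simplified illustration of a linear merge (not mergekit's implementation):
# every parameter tensor of the output is a normalized weighted average of the
# corresponding tensors from the source models. Weights mirror the YAML
# configuration shown further down in this card.
import torch
from transformers import AutoModelForCausalLM

sources = {
    "mistralai/Mistral-7B-Instruct-v0.2": 1.0,
    "NousResearch/Hermes-2-Pro-Mistral-7B": 0.3,
}

# Load each source model's parameters (needs enough RAM for both models).
state_dicts = {
    name: AutoModelForCausalLM.from_pretrained(name, torch_dtype=torch.float16).state_dict()
    for name in sources
}

total_weight = sum(sources.values())
merged_state = {}
for key in next(iter(state_dicts.values())):
    # Average in float32 for numerical stability, then cast back to float16.
    avg = sum(w * state_dicts[name][key].float() for name, w in sources.items()) / total_weight
    merged_state[key] = avg.to(torch.float16)

# merged_state can then be loaded into a freshly initialized Mistral-7B model
# with model.load_state_dict(merged_state).
```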

### Models Merged

The following models were included in the merge:

* [mistralai/Mistral-7B-Instruct-v0.2](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2)
* [NousResearch/Hermes-2-Pro-Mistral-7B](https://huggingface.co/NousResearch/Hermes-2-Pro-Mistral-7B)

### Configuration

The following YAML configuration was used to produce this model:


```yaml
models:
  - model: mistralai/Mistral-7B-Instruct-v0.2
    parameters:
      weight: 1.0
  - model: NousResearch/Hermes-2-Pro-Mistral-7B
    parameters:
      weight: 0.3
merge_method: linear
dtype: float16
```
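
The merge can be reproduced by passing a configuration like the one above to mergekit's `mergekit-yaml` entry point along with an output directory. Once the merged weights are available, the model loads like any other transformers causal LM; the repository id in the sketch below is an assumption based on where this card lives and should be replaced if the model is published under a different name.

```python
# Usage sketch. The repository id is an assumption; adjust it if the merged
# model is hosted under another path.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "LeroyDyer/Mixtral_Base"  # assumed repository id
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, torch_dtype="auto", device_map="auto")

prompt = "Summarize what a linear model merge does."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```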