
# output_mega2

This is a merge of pre-trained language models created using [mergekit](https://github.com/arcee-ai/mergekit).

## Merge Details

### Merge Method

This model was merged using the [DARE](https://arxiv.org/abs/2311.03099) [TIES](https://arxiv.org/abs/2306.01708) merge method, with /notebooks/dippy-bittensor-subnet/clone_hgnoi_9QeCVLNmTBXdF6id as the base model.
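
DARE TIES combines two delta-merging ideas: DARE randomly drops a fraction of each model's parameter deltas (keeping roughly `density` of them) and rescales the survivors, while TIES resolves sign disagreements between models by a per-parameter majority vote before summing. The sketch below is a minimal illustration of that procedure on plain tensors, using per-model `weight` and `density` values as in the configuration further down; it is not mergekit's implementation.

```python
import torch

def dare_ties(base: torch.Tensor,
              tuned: list[torch.Tensor],
              weights: list[float],
              densities: list[float]) -> torch.Tensor:
    """Illustrative DARE TIES merge of fine-tuned tensors into `base`."""
    weighted_deltas = []
    for theta, w, rho in zip(tuned, weights, densities):
        delta = theta - base                                # task vector
        mask = torch.bernoulli(torch.full_like(delta, rho)) # keep ~rho of entries
        delta = delta * mask / rho                          # DARE: drop and rescale
        weighted_deltas.append(w * delta)
    # TIES: elect a per-parameter sign from the weighted sum of deltas
    elected = torch.sign(torch.stack(weighted_deltas).sum(dim=0))
    merged_delta = torch.zeros_like(base)
    for d in weighted_deltas:
        # Keep only components whose sign agrees with the elected sign
        merged_delta += torch.where(torch.sign(d) == elected, d, torch.zeros_like(d))
    return base + merged_delta
```

In practice this is applied per parameter tensor across the whole checkpoint; the `int8_mask` option in the configuration below tells mergekit to store the intermediate masks as int8 to save memory.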

### Models Merged

The following models were included in the merge:

* /notebooks/dippy-bittensor-subnet/clone_fifala_12-00
* /notebooks/dippy-bittensor-subnet/clone_hgnoi_LSliYB81E6E3D7jM
* /notebooks/dippy-bittensor-subnet/clone_hgnoi_4BXEPZSqciDBKlYW
* /notebooks/dippy-bittensor-subnet/clone_starnet_07-00
* /notebooks/dippy-bittensor-subnet/clone_fifala_07-00
* /notebooks/dippy-bittensor-subnet/clone_giantdev_dippy-z035H-sn11m9

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: /notebooks/dippy-bittensor-subnet/clone_hgnoi_9QeCVLNmTBXdF6id
    # no parameters necessary for base model
  - model: /notebooks/dippy-bittensor-subnet/clone_starnet_07-00
    parameters:
      weight: 0.19
      density: 0.83
  - model: /notebooks/dippy-bittensor-subnet/clone_fifala_07-00
    parameters:
      weight: 0.14
      density: 0.6
  - model: /notebooks/dippy-bittensor-subnet/clone_fifala_12-00
    parameters:
      weight: 0.19
      density: 0.83
  - model: /notebooks/dippy-bittensor-subnet/clone_hgnoi_LSliYB81E6E3D7jM
    parameters:
      weight: 0.14
      density: 0.6
  - model: /notebooks/dippy-bittensor-subnet/clone_giantdev_dippy-z035H-sn11m9
    parameters:
      weight: 0.19
      density: 0.83
  - model: /notebooks/dippy-bittensor-subnet/clone_hgnoi_4BXEPZSqciDBKlYW
    parameters:
      weight: 0.15
      density: 0.08
merge_method: dare_ties
base_model: /notebooks/dippy-bittensor-subnet/clone_hgnoi_9QeCVLNmTBXdF6id
parameters:
  int8_mask: true
dtype: bfloat16
```
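
Once the merge has been run (e.g. with `mergekit-yaml config.yaml ./output_mega2`), the result is a standard `transformers` checkpoint and loads as usual. The local path below is an assumption about where the merge was written:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

path = "./output_mega2"  # assumed local output directory of the merge
tokenizer = AutoTokenizer.from_pretrained(path)
model = AutoModelForCausalLM.from_pretrained(path, torch_dtype=torch.bfloat16)

inputs = tokenizer("Hello,", return_tensors="pt")
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=20)[0]))
```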