---
base_model:
  - 152334H/miqu-1-70b-sf
  - stabilityai/japanese-stablelm-instruct-beta-70b
library_name: transformers
tags:
  - mergekit
  - merge
  - failure
  - fail
  - merge fail
---

# Japanese-Migu-70b

Warning: This model unfortunately seems to be a failure, with extremely disappointing performance. It is completely useless, generates utter nonsense, and is worse than either base model. I will not be publishing GGUFs as a result, though I might investigate why it is bad. I hope to have a good Japanese open-source LLM one day, but this was a complete waste of my day 🙏

DeepL: 警告: このモデルは残念ながら失敗作のようだ。全く役に立たず、全くナンセンスを生み出し、どちらのベースモデルよりも悪い。結果としてGGUFを公開することはないだろうが、なぜ悪いのかを調査するかもしれない。いつか良い日本のオープンソースのLLMができることを願っているが、これは完全に私の一日の無駄だった🙏

Examples of its brain damage:

*(Screenshots of example output omitted.)*

This is a merge of pre-trained language models created using mergekit.

## Merge Details

### Merge Method

This model was merged using the task arithmetic merge method using 152334H/miqu-1-70b-sf as a base.
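At its core, task arithmetic adds weighted "task vectors" (the parameter deltas between each fine-tuned model and the base) back onto the base weights. A minimal sketch of that idea, assuming simple per-tensor arithmetic (this is an illustration, not mergekit's actual implementation; all names here are made up):

```python
import numpy as np

def task_arithmetic(base, tuned_models, weights):
    """Illustrative task-arithmetic merge of one weight tensor.

    Each fine-tuned model contributes a task vector (tuned - base);
    the merged tensor is the base plus a weighted sum of those deltas.
    """
    merged = base.copy()
    for tuned, w in zip(tuned_models, weights):
        merged += w * (tuned - base)
    return merged

# Toy example: a single fine-tuned model merged at weight 0.5.
base = np.zeros(4)
tuned = [np.ones(4)]
print(task_arithmetic(base, tuned, [0.5]))  # lands halfway between base and tuned
```

With only one non-base model, the result interpolates between the base and the fine-tune; with several, the deltas sum, which is where interference between models can degrade quality.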

### Models Merged

The following models were included in the merge:

- stabilityai/japanese-stablelm-instruct-beta-70b

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: 152334H/miqu-1-70b-sf
models:
  - model: stabilityai/japanese-stablelm-instruct-beta-70b
    parameters:
      weight: 0.5
merge_method: task_arithmetic
parameters:
  weight: 0.25
dtype: float16
random_seed: 694201337567099116663322537
```