---
base_model:
- 152334H/miqu-1-70b-sf
- stabilityai/japanese-stablelm-instruct-beta-70b
library_name: transformers
tags:
- mergekit
- merge
- failure
- fail
- merge fail
---

## Warning:

This model unfortunately appears to be a failure, with extremely disappointing performance. It is completely useless: it generates utter nonsense and is worse than either base model. As a result I will not be publishing GGUFs, though I might investigate why it turned out so badly. I hope to have a good Japanese open-source LLM one day, but this was a complete waste of my day 🙏

## Examples of its brain damage:

![image/png](https://cdn-uploads.huggingface.co/production/uploads/6342619a9948f573f37a4a60/IAOVk0Abm9POsigbkzhOY.png)
![image/png](https://cdn-uploads.huggingface.co/production/uploads/6342619a9948f573f37a4a60/GVB_QjQtLzccAzZLYqO3w.png)
![image/png](https://cdn-uploads.huggingface.co/production/uploads/6342619a9948f573f37a4a60/KbFcu22kLju37u2GwwX8m.png)
![image/png](https://cdn-uploads.huggingface.co/production/uploads/6342619a9948f573f37a4a60/pC2MDn770UEePMooiR2pn.png)

# output

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [task arithmetic](https://arxiv.org/abs/2212.04089) merge method, using [152334H/miqu-1-70b-sf](https://huggingface.co/152334H/miqu-1-70b-sf) as a base.

### Models Merged

The following models were included in the merge:

* [stabilityai/japanese-stablelm-instruct-beta-70b](https://huggingface.co/stabilityai/japanese-stablelm-instruct-beta-70b)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: 152334H/miqu-1-70b-sf
models:
- model: stabilityai/japanese-stablelm-instruct-beta-70b
  parameters:
    weight: 0.5
merge_method: task_arithmetic
parameters:
  weight: 0.25
dtype: float16
random_seed: 694201337567099116663322537
```
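For reference, task arithmetic builds a "task vector" for each fine-tuned model (its weights minus the base model's weights) and adds a scaled copy back onto the base: `merged = base + weight * (finetuned - base)`. Below is a minimal PyTorch sketch of that idea; it is not mergekit's actual implementation, and it assumes both checkpoints share parameter names and shapes. The default `weight=0.5` mirrors the per-model value in the config above.

```python
# Minimal sketch of task-arithmetic merging (https://arxiv.org/abs/2212.04089).
# Not mergekit's implementation; assumes both state dicts share keys and shapes.
import torch


def task_arithmetic_merge(base_sd: dict, tuned_sd: dict, weight: float = 0.5) -> dict:
    """Return base + weight * (tuned - base) for every shared tensor."""
    merged = {}
    for name, base_t in base_sd.items():
        if name in tuned_sd and tuned_sd[name].shape == base_t.shape:
            # The task vector is the delta the fine-tune applied to the base.
            task_vector = tuned_sd[name].float() - base_t.float()
            merged[name] = (base_t.float() + weight * task_vector).to(base_t.dtype)
        else:
            merged[name] = base_t  # keep the base weights where the models disagree
    return merged
```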
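If you want to see the failure for yourself, a plain `transformers` generation loop is enough. A minimal sketch, where `./output` is a placeholder path for wherever the merged weights live:

```python
# Quick sanity check of the merged model; "./output" is a placeholder path.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("./output")
model = AutoModelForCausalLM.from_pretrained(
    "./output", torch_dtype="auto", device_map="auto"
)

prompt = "日本の首都はどこですか？"  # "What is the capital of Japan?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```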