---
base_model: []
tags:
- conversation
license: apache-2.0
---

![image/png](https://cdn-uploads.huggingface.co/production/uploads/64bb1109aaccfd28b023bcec/4UliqjJuW0cn9KlrqfJLL.png)

### Design
This Mistral 7B model is a task arithmetic merge of Epiculous/Fett-uccine-7B (theory-of-mind and gnosis datasets), GRMenon/mental-mistral-7b-instruct-autotrain (mental health counseling conversations dataset), and teknium/Hermes-Trismegistus-Mistral-7B (OpenHermes and occult datasets).

The design intention is to create a pseudo-philosophical, pseudo-spiritual, pseudo-counseling chatbot for sounding out ideas - something like a mirror, really. Its output obviously does not constitute medical advice; if you are in need, seek professional help. The name Apocrypha-7B reflects that it is not the real thing: this is not a guide or a guru. At best, if the model works, it is a sounding board, but I think such a thing can still be helpful for organising one's own thoughts.

I will throw a GGUF or two into a subfolder here.
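For intuition, the task arithmetic method mentioned above builds each merged tensor from the base model's tensor plus a weighted sum of each fine-tune's delta from that base. Below is a minimal, hypothetical PyTorch sketch of that idea over plain state dicts; it is not the mergekit code used for this merge, and the function and tensor names are made up for illustration.

```python
# Conceptual sketch of task arithmetic (not mergekit's implementation):
# merged = base + sum_i( weight_i * (finetuned_i - base) ), applied per tensor.
import torch

def task_arithmetic_merge(base_sd, finetuned_sds, weights):
    """Combine fine-tuned state dicts as weighted task vectors over a shared base."""
    merged = {}
    for name, base_tensor in base_sd.items():
        delta = torch.zeros_like(base_tensor, dtype=torch.float32)
        for sd, w in zip(finetuned_sds, weights):
            # Task vector: how far this fine-tune moved away from the base weights.
            delta += w * (sd[name].float() - base_tensor.float())
        merged[name] = (base_tensor.float() + delta).to(base_tensor.dtype)
    return merged

# Toy example with a single fake tensor; a real merge iterates a full model state dict.
base = {"layer.weight": torch.randn(4, 4)}
finetunes = [{"layer.weight": base["layer.weight"] + 0.01 * torch.randn(4, 4)} for _ in range(3)]
merged = task_arithmetic_merge(base, finetunes, weights=[0.35, 0.39, 0.45])
print(merged["layer.weight"].shape)
```

Note that the weights here mirror the configuration below and are not required to sum to 1.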
### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: ./Hermes-Trismegistus-7B
    parameters:
      weight: 0.35
  - model: ./mental-mistral-7b
    parameters:
      weight: 0.39
  - model: ./Fett-uccine-7B
    parameters:
      weight: 0.45
merge_method: task_arithmetic
base_model: ./Mistral-7B-v0.1
dtype: bfloat16
```
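As a usage sketch, the merged model should load like any other Mistral-style causal language model with transformers. The model path below is a placeholder rather than a confirmed repo id, and this card does not specify a prompt template, so the bare prompt is only a guess.

```python
# Hypothetical usage sketch; point model_id at wherever the merged weights live.
# Requires transformers, torch, and accelerate (for device_map="auto").
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "./Apocrypha-7B"  # placeholder path, not a confirmed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype used for the merge
    device_map="auto",
)

prompt = "I keep circling the same decision. Help me lay out what I actually want."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```

The instruction formats used by the constituent models may give better results than a bare prompt.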
Resources used:

https://huggingface.co/teknium/Hermes-Trismegistus-Mistral-7B

https://huggingface.co/GRMenon/mental-mistral-7b-instruct-autotrain

https://huggingface.co/Epiculous/Fett-uccine-7B/tree/main

https://github.com/cg123/mergekit/tree/main