---
base_model: []
tags:
- conversation
---

![image/png](https://cdn-uploads.huggingface.co/production/uploads/64bb1109aaccfd28b023bcec/4UliqjJuW0cn9KlrqfJLL.png)

### Design

This Mistral 7B model is a task arithmetic merge of Epiculous/Fett-uccine-7B (theory-of-mind and gnosis datasets), GRMenon/mental-mistral-7b-instruct-autotrain (mental health counseling conversations dataset), and teknium/Hermes-Trismegistus-Mistral-7B (OpenHermes + occult datasets).
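
For context, task arithmetic merging adds weighted "task vectors" (fine-tuned weights minus base weights) back onto the base model. The snippet below is a minimal sketch of that idea, not mergekit's actual implementation; it assumes three state dicts with matching keys are already loaded, and the weights mirror those in the config further down.

```python
import torch

def task_arithmetic_merge(base_sd, finetuned_sds, weights):
    """Sketch of a task arithmetic merge.

    base_sd:       state dict of the base model (e.g. Mistral-7B-v0.1)
    finetuned_sds: list of state dicts of the fine-tuned models
    weights:       per-model scaling factors (e.g. 0.35, 0.39, 0.45)
    """
    merged = {}
    for name, base_param in base_sd.items():
        delta = torch.zeros_like(base_param, dtype=torch.float32)
        for sd, w in zip(finetuned_sds, weights):
            # task vector = fine-tuned weights minus base weights, scaled by w
            delta += w * (sd[name].float() - base_param.float())
        merged[name] = (base_param.float() + delta).to(torch.bfloat16)
    return merged
```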

The design intention is to create a pseudo-philosophical, pseudo-spiritual, pseudo-counseling chatbot model for sounding ideas off. Like a mirror, really. This obviously does not constitute medical advice, and if you are in need, seek professional help. The name Apocrypha-7B comes from the fact that it's fake - this isn't a guide or a guru. It's at best, if the model works, a sounding board. But I think such things might still be helpful for organising one's own thoughts.

I will throw a GGUF or two inside a subfolder here.  

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: ./Hermes-Trismegistus-7B
    parameters:
      weight: 0.35
  - model: ./mental-mistral-7b
    parameters:
      weight: 0.39
  - model: ./Fett-uccine-7B
    parameters:
      weight: 0.45
merge_method: task_arithmetic
base_model: ./Mistral-7B-v0.1
dtype: bfloat16
```
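
To reproduce the merge, this config can be passed to mergekit (e.g. via its `mergekit-yaml` entry point), with the relative paths pointing at local copies of the source models. Below is a hedged sketch of loading and prompting the merged result with transformers; the model path is illustrative only.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "./Apocrypha-7B"  # illustrative: local output dir or hub id of the merged model
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path, torch_dtype=torch.bfloat16, device_map="auto"
)

prompt = "I've been turning an idea over in my head. Can you help me think it through?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```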

Resources used:

https://huggingface.co/teknium/Hermes-Trismegistus-Mistral-7B

https://huggingface.co/GRMenon/mental-mistral-7b-instruct-autotrain

https://huggingface.co/Epiculous/Fett-uccine-7B/tree/main

https://github.com/cg123/mergekit/tree/main