---
base_model:
- Hastagaras/Halu-8B-Llama3-v0.3
- Blackroot/Llama-3-LongStory-LORA
- Hastagaras/Halu-8B-Llama3-v0.3
- Blackroot/Llama-3-8B-Abomination-LORA
- Hastagaras/Halu-8B-Llama3-v0.3
library_name: transformers
tags:
- mergekit
- merge
- not-for-all-audiences
license: llama3
---
**VERY IMPORTANT:** This model has not been extensively tested or evaluated, and its performance characteristics are currently unknown. It may generate harmful, biased, or inappropriate content. Please exercise caution and use it at your own risk and discretion.

I recently tried [saishf's](https://huggingface.co/saishf) merged model, and it's great. So I decided to try a similar merge method with [Blackroot's](https://huggingface.co/Blackroot) LoRAs that I had found earlier.

I don't know what to say about this model... it's very strange. Maybe because Blackroot's amazing LoRAs were trained on human data rather than synthetic data, the model turned out to be very human-like, even in its actions and narration.

**WARNING:** This model can be very unsafe in certain contexts, especially in RP. Proof:

<div align="left">
  <img src="https://huggingface.co/Hastagaras/Halu-8B-Llama3-Blackroot/resolve/main/chrome_hszhwFLJ6i.jpg" width="1000"/>
</div>

### Merge Method

This model was merged using the [Model Stock](https://arxiv.org/abs/2403.19522) merge method using [Hastagaras/Halu-8B-Llama3-v0.3](https://huggingface.co/Hastagaras/Halu-8B-Llama3-v0.3) as a base.

### Models Merged

The following models were included in the merge:
* [Hastagaras/Halu-8B-Llama3-v0.3](https://huggingface.co/Hastagaras/Halu-8B-Llama3-v0.3) + [Blackroot/Llama-3-LongStory-LORA](https://huggingface.co/Blackroot/Llama-3-LongStory-LORA)
* [Hastagaras/Halu-8B-Llama3-v0.3](https://huggingface.co/Hastagaras/Halu-8B-Llama3-v0.3) + [Blackroot/Llama-3-8B-Abomination-LORA](https://huggingface.co/Blackroot/Llama-3-8B-Abomination-LORA)
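Since the base is a Llama 3 Instruct derivative, the merged model should work with the standard Llama 3 chat format via `transformers`. A minimal sketch (the repo id is taken from this card; the system/user strings and generation settings are illustrative assumptions, and the generation step is gated behind an environment variable because it requires a ~16 GB download and a GPU):

```python
import os

MODEL_ID = "Hastagaras/Halu-8B-Llama3-Blackroot"  # repo id from this card


def build_llama3_prompt(system: str, user: str) -> str:
    """Assemble a Llama 3 Instruct prompt by hand (same layout the
    tokenizer's chat template produces for a system + user turn)."""
    return (
        "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n"
        f"{system}<|eot_id|><|start_header_id|>user<|end_header_id|>\n\n"
        f"{user}<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n"
    )


# Guarded: only runs when explicitly requested, since it downloads the model.
if os.environ.get("RUN_GENERATION"):
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tok = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="bfloat16", device_map="auto"
    )
    prompt = build_llama3_prompt(
        "You are a storyteller.", "Write a short scene in a rainy city."
    )
    inputs = tok(prompt, return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.8)
    # Decode only the newly generated tokens.
    print(tok.decode(out[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```

Alternatively, `tok.apply_chat_template(...)` builds the same prompt from a message list without hand-writing the special tokens.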

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: Hastagaras/Halu-8B-Llama3-v0.3+Blackroot/Llama-3-LongStory-LORA
  - model: Hastagaras/Halu-8B-Llama3-v0.3+Blackroot/Llama-3-8B-Abomination-LORA
merge_method: model_stock
base_model: Hastagaras/Halu-8B-Llama3-v0.3
dtype: bfloat16

```
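To reproduce the merge, the configuration above can be saved to a file and passed to mergekit's CLI. A sketch, assuming mergekit is installed from PyPI (the output directory name is an assumption; the actual merge command is commented out because it downloads the base model and both LoRAs):

```shell
# Save the card's merge configuration to a file.
cat > halu-blackroot.yaml <<'EOF'
models:
  - model: Hastagaras/Halu-8B-Llama3-v0.3+Blackroot/Llama-3-LongStory-LORA
  - model: Hastagaras/Halu-8B-Llama3-v0.3+Blackroot/Llama-3-8B-Abomination-LORA
merge_method: model_stock
base_model: Hastagaras/Halu-8B-Llama3-v0.3
dtype: bfloat16
EOF

# Then run the merge (downloads the base model and applies the LoRAs first):
# pip install mergekit
# mergekit-yaml halu-blackroot.yaml ./Halu-8B-Llama3-Blackroot --cuda
```

Note the `base+LoRA` syntax in the `model:` entries: mergekit applies each LoRA to the base before the Model Stock merge, which is how the two "base + LoRA" variants listed above were produced.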