---
base_model:
- Fischerboot/InternLM2-ToxicRP-QLORA-4Bit
library_name: transformers
tags:
- mergekit
- merge

---
# InternLM2-chat-20B-ToxicRP-QLORA-Merged

This model was fine-tuned by me, using compute provided by g4rg.
Big thanks to everyone who helped me.
Do whatever you want with this model, just don't do anything illegal.
As we say in Germany: Kuss Kuss Kuss (kiss, kiss, kiss).

GGUF quants are available here: [Fischerboot/InternLM2-Chat-20B-ToxicRP-QLORA-Merged-GGUF](https://huggingface.co/Fischerboot/InternLM2-Chat-20B-ToxicRP-QLORA-Merged-GGUF)
More quants will follow once my internet connection lets me upload them.

### Merge Method

This model was merged using the passthrough merge method.

### Models Merged

The following models were included in the merge:
* output/intervitens_internlm2-limarp-chat-20b-2 + [Fischerboot/InternLM2-ToxicRP-QLORA-4Bit](https://huggingface.co/Fischerboot/InternLM2-ToxicRP-QLORA-4Bit)
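
The `+` notation means the QLoRA adapter is applied on top of the base model as part of the merge; mergekit handles that internally. Purely as an illustrative sketch of what applying an adapter to a base model amounts to (not the exact pipeline used here), a peft-based version might look like this; the local base path and dtype are assumptions:

```python
# Illustrative sketch only: mergekit applies the adapter itself during the merge.
# This shows the conceptually similar step of folding a LoRA adapter into a base
# model with peft. Paths and dtype are assumptions, not the exact setup used here.
import torch
from transformers import AutoModelForCausalLM
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained(
    "output/intervitens_internlm2-limarp-chat-20b-2",  # local base model (assumed path)
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,  # InternLM2 ships custom modeling code
)

# Attach the QLoRA adapter and merge its weights into the base model.
model = PeftModel.from_pretrained(base, "Fischerboot/InternLM2-ToxicRP-QLORA-4Bit")
merged = model.merge_and_unload()
merged.save_pretrained("./InternLM2-chat-20B-ToxicRP-QLORA-Merged")
```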

### Configuration

The following YAML configuration was used to produce this model:

```yaml
dtype: bfloat16
merge_method: passthrough
slices:
- sources:
  - layer_range: [0, 48]
    model: output/intervitens_internlm2-limarp-chat-20b-2+Fischerboot/InternLM2-ToxicRP-QLORA-4Bit
```
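
To try the merged model with transformers, something like the following should work. This is an untested sketch: the repo id is assumed from this card's title, InternLM2 needs `trust_remote_code=True`, and the chat template and generation settings are assumptions.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Fischerboot/InternLM2-chat-20B-ToxicRP-QLORA-Merged"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the merge dtype above
    device_map="auto",
    trust_remote_code=True,      # InternLM2 ships custom modeling code
)

messages = [{"role": "user", "content": "Introduce yourself."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```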