---
base_model:
  - Fischerboot/InternLM2-ToxicRP-QLORA-4Bit
library_name: transformers
tags:
  - mergekit
  - merge
---

# InternLM2-chat-20B-ToxicRP-QLORA-Merged

This model was fine-tuned by me, using compute provided by g4rg. Big thanks to everyone who helped me. Do whatever you want with this model, just don't do anything illegal. As we say in Germany: Kuss Kuss Kuss.

The non-quantized version is available here: Fischerboot/InternLM2-Chat-20B-ToxicRP-QLORA-Merged. Have fun!

## Merge Method

This model was merged using the passthrough merge method.

## Models Merged

The following models were included in the merge:

* output/intervitens_internlm2-limarp-chat-20b-2 (with the Fischerboot/InternLM2-ToxicRP-QLORA-4Bit LoRA applied)

## Configuration

The following YAML configuration was used to produce this model:

```yaml
dtype: bfloat16
merge_method: passthrough
slices:
- sources:
  - layer_range: [0, 48]
    model: output/intervitens_internlm2-limarp-chat-20b-2+Fischerboot/InternLM2-ToxicRP-QLORA-4Bit
```
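
As a minimal usage sketch with `transformers`: the repository id below is the non-quantized merge mentioned above, while the prompt, generation settings, and dtype/device choices are illustrative assumptions, not part of the original card.

```python
# Minimal loading sketch (assumed settings; adjust dtype/device to your hardware).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Fischerboot/InternLM2-Chat-20B-ToxicRP-QLORA-Merged"

# InternLM2 repositories ship custom modeling code, so trust_remote_code is needed.
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype in the merge config above
    device_map="auto",
    trust_remote_code=True,
)

# Illustrative prompt and generation parameters.
prompt = "Hello, who are you?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```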