---
license: cc-by-nc-4.0
base_model:
  - Alsebay/NarumashiRTS-V2
  - SanjiWatsuki/Kunoichi-DPO-v2-7B
  - Nitral-AI/KukulStanta-7B
library_name: transformers
tags:
  - moe
  - merge
  - roleplay
  - Roleplay
  - 4-bit
  - AWQ
  - text-generation
  - autotrain_compatible
  - endpoints_compatible
pipeline_tag: text-generation
inference: false
quantized_by: Suparious
---

# Alsebay/NaruMOE-3x7B-v2 AWQ

## Model Summary

A MoE model for roleplaying. Since 7B models are small, several of them can be combined into a larger model, which can be smarter.

It handles some limited TSF (transsexual fiction) content, because my pre-trained model is included in the merge.

Compared to V1, it is weaker in logic but better in expression.
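
## Example usage

A minimal loading sketch with `transformers` (which can load AWQ-quantized checkpoints when `autoawq` and `torch` are installed). The repo id below is a hypothetical placeholder, not confirmed by this card; replace it with the actual path of this AWQ repository.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "solidrust/NaruMOE-3x7B-v2-AWQ"  # assumption: replace with the actual repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
# AWQ-quantized checkpoints load through the standard transformers API;
# device_map="auto" places the 4-bit weights on the available GPU(s).
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "You are a roleplay assistant.\nUser: Introduce yourself.\nAssistant:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Sampling settings are illustrative; tune temperature/top_p to taste.
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```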