---
base_model: liminerity/Memgpt-3x7b-MOE
inference: false
language:
  - en
library_name: transformers
license: apache-2.0
merged_models:
  - starsnatched/MemGPT-DPO
  - starsnatched/MemGPT-3
  - starsnatched/MemGPT
pipeline_tag: text-generation
quantized_by: Suparious
tags:
  - 4-bit
  - AWQ
  - text-generation
  - autotrain_compatible
  - endpoints_compatible
  - safetensors
  - moe
  - frankenmoe
  - merge
  - mergekit
  - lazymergekit
  - starsnatched/MemGPT-DPO
  - starsnatched/MemGPT-3
  - starsnatched/MemGPT
---

# liminerity/Memgpt-3x7b-MOE AWQ

## Model Summary

Memgpt-3x7b-MOE is a Mixture of Experts (MoE) model built from the following models using LazyMergekit: