---
license: apache-2.0
tags:
  - moe
  - frankenmoe
  - merge
  - mergekit
  - lazymergekit
  - CultriX/MonaTrix-v4
  - mlabonne/OmniTruthyBeagle-7B-v0
  - CultriX/MoNeuTrix-7B-v1
  - paulml/OmniBeagleSquaredMBX-v3-7B
base_model:
  - CultriX/MonaTrix-v4
  - mlabonne/OmniTruthyBeagle-7B-v0
  - CultriX/MoNeuTrix-7B-v1
  - paulml/OmniBeagleSquaredMBX-v3-7B
---

# MoNeuTrix-MoE-4x7B

MoNeuTrix-MoE-4x7B is a Mixture of Experts (MoE) model made with the following models using LazyMergekit:

* [CultriX/MonaTrix-v4](https://huggingface.co/CultriX/MonaTrix-v4)
* [mlabonne/OmniTruthyBeagle-7B-v0](https://huggingface.co/mlabonne/OmniTruthyBeagle-7B-v0)
* [CultriX/MoNeuTrix-7B-v1](https://huggingface.co/CultriX/MoNeuTrix-7B-v1)
* [paulml/OmniBeagleSquaredMBX-v3-7B](https://huggingface.co/paulml/OmniBeagleSquaredMBX-v3-7B)

## 🧩 Configuration

```yaml
base_model: "CultriX/MonaTrix-v4"
dtype: bfloat16
gate:
  type: "learned"
  temperature: 0.1
  scaling_factor: 10
experts:
    - source_model: "CultriX/MonaTrix-v4"  # Historical Analysis, Geopolitics, and Economic Evaluation
      positive_prompts:
        - "Historical Analysis"
        - "Geopolitical Evaluation"
        - "Economic Insights"
        - "Policy Analysis"
        - "Socio-Economic Impacts"
        - "Geopolitical Analysis"
        - "Cultural Commentary"
        - "Analyze geopolitical"
        - "Analyze historic"
        - "Analyze historical"
        - "Assess the political dynamics of the Cold War and its global impact."
        - "Evaluate the historical significance of the Silk Road in ancient trade."
      negative_prompts:
        - "Technical Writing"
        - "Mathematical Problem Solving"
        - "Software Development"
        - "Artistic Creation"
        - "Machine Learning Development"
        - "Storywriting"
        - "Character Development"
        - "Roleplaying"
        - "Narrative Creation"

    - source_model: "mlabonne/OmniTruthyBeagle-7B-v0"  # Multilingual Communication and Cultural Insights
      positive_prompts:
        - "Multilingual Communication"
        - "Cultural Insights"
        - "Translation and Interpretation"
        - "Cultural Norms Exploration"
        - "Intercultural Communication Practices"
        - "Describe cultural significance"
        - "Narrate cultural"
        - "Discuss cultural impact"
      negative_prompts:
        - "Scientific Analysis"
        - "Creative Writing"
        - "Technical Documentation"
        - "Economic Modeling"
        - "Historical Documentation"
        - "Programming"
        - "Algorithm Development"

    - source_model: "CultriX/MoNeuTrix-7B-v1"  # Creative Problem Solving and Innovation
      positive_prompts:
        - "Innovation and Design"
        - "Problem Solving"
        - "Creative Thinking"
        - "Strategic Planning"
        - "Conceptual Design"
        - "Innovation and Design"
        - "Problem Solving"
        - "Compose narrative content or poetry."
        - "Create complex puzzles and games."
        - "Devise strategy"
      negative_prompts:
        - "Historical Analysis"
        - "Linguistic Translation"
        - "Economic Forecasting"
        - "Geopolitical Analysis"
        - "Cultural Commentary"
        - "Historical Documentation"
        - "Scientific Explanation"
        - "Data Analysis Techniques"

    - source_model: "paulml/OmniBeagleSquaredMBX-v3-7B"  # Scientific and Technical Expertise
      positive_prompts:
        - "Scientific Explanation"
        - "Technical Analysis"
        - "Experimental Design"
        - "Data Analysis Techniques"
        - "Scientific Innovation"        
        - "Mathematical Problem Solving"
        - "Algorithm Development"
        - "Programming"
        - "Analyze data"
        - "Analyze statistical data on climate change trends."
        - "Conduct basic data analysis or statistical evaluations."
      negative_prompts:
        - "Cultural Analysis"
        - "Creative Arts"
        - "Linguistic Challenges"
        - "Political Debating"
        - "Marketing Strategies"
        - "Storywriting"
        - "Character Development"
        - "Roleplaying"
        - "Narrative Creation"
```
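
The `gate` block above describes a learned router with a low temperature and a logit scaling factor. As a rough illustration only (not mergekit's or this model's actual implementation; the tensor shapes and the top-2 choice are assumptions), temperature-scaled routing over the four experts might look like this:

```python
import torch

def route_tokens(hidden_states, router_weights, temperature=0.1, scaling_factor=10.0):
    """Illustrative temperature-scaled top-2 routing over the experts.

    hidden_states:  (num_tokens, hidden_dim) activations entering the MoE layer
    router_weights: (hidden_dim, num_experts) learned gate matrix
    """
    logits = hidden_states @ router_weights              # (num_tokens, num_experts)
    # The scaling factor amplifies the logits; a low temperature sharpens the softmax,
    # pushing most of the probability mass onto one or two experts per token.
    probs = torch.softmax(scaling_factor * logits / temperature, dim=-1)
    weights, indices = torch.topk(probs, k=2, dim=-1)    # Mixtral-style top-2 routing
    return indices, weights / weights.sum(dim=-1, keepdim=True)  # renormalize top-2 weights

# Hypothetical example: 4 experts (one per source model above), 4096-dim hidden states
tokens = torch.randn(8, 4096)
router = torch.randn(4096, 4)
expert_ids, expert_weights = route_tokens(tokens, router)
```

In LazyMergekit-style frankenMoEs, each expert generally corresponds to one source model's feed-forward weights, and the positive/negative prompts listed per expert guide how the router is initialized toward that expert's domain.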

## 💻 Usage

```python
!pip install -qU transformers bitsandbytes accelerate

from transformers import AutoTokenizer
import transformers
import torch

model = "CultriX/MoNeuTrix-MoE-4x7B"

# Load the merged MoE in 4-bit to reduce VRAM usage
tokenizer = AutoTokenizer.from_pretrained(model)
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    model_kwargs={"torch_dtype": torch.float16, "load_in_4bit": True},
)

# Build a chat-formatted prompt and sample a response
messages = [{"role": "user", "content": "Explain what a Mixture of Experts is in less than 100 words."}]
prompt = pipeline.tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
```
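
Note that passing `load_in_4bit` through `model_kwargs` is deprecated in more recent `transformers` releases in favor of an explicit `BitsAndBytesConfig`. A minimal alternative sketch (same model and sampling settings as above):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "CultriX/MoNeuTrix-MoE-4x7B"

# Explicit 4-bit quantization config instead of the load_in_4bit shortcut
bnb_config = BitsAndBytesConfig(load_in_4bit=True, bnb_4bit_compute_dtype=torch.float16)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",
)

messages = [{"role": "user", "content": "Explain what a Mixture of Experts is in less than 100 words."}]
inputs = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt").to(model.device)
outputs = model.generate(inputs, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```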