---
license: apache-2.0
tags:
  - moe
  - merge
  - abideen/NexoNimbus-7B
  - mlabonne/NeuralMarcoro14-7B
language:
  - en
library_name: transformers
---

# NexoNimbus-MoE-2x7B


NexoNimbus-MoE-2x7B is a Mixture of Experts (MoE) made with the following models:

* [abideen/NexoNimbus-7B](https://huggingface.co/abideen/NexoNimbus-7B)
* [mlabonne/NeuralMarcoro14-7B](https://huggingface.co/mlabonne/NeuralMarcoro14-7B)

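In a sparse MoE, a learned router decides which experts process each token and how to weight their outputs. As a minimal illustration (not the model's actual code; the gate matrix and function names here are made up), top-2 routing can be sketched as:

```python
import numpy as np

def top2_route(hidden, gate_w):
    """Pick the top-2 experts for one token and softmax their weights.

    hidden: (d,) token hidden state; gate_w: (d, num_experts) router weights.
    """
    logits = hidden @ gate_w                       # one score per expert
    top2 = np.argsort(logits)[-2:]                 # indices of the best two experts
    w = np.exp(logits[top2] - logits[top2].max())  # numerically stable softmax
    w /= w.sum()
    return top2, w

rng = np.random.default_rng(0)
experts, weights = top2_route(rng.standard_normal(8), rng.standard_normal((8, 2)))
print(experts, weights)  # two expert indices; weights sum to 1.0
```

With only two experts, every token is processed by both, weighted by the router; the gating still matters because the mixing weights differ per token.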
## 🏆 Evaluation

NexoNimbus-MoE-2x7B is the 10th best-performing 13B LLM on the Open LLM Leaderboard:


| Task          | Version | Metric   | Value | Stderr |
|---------------|---------|----------|-------|--------|
| arc_challenge | 0       | acc      | 62.28 | ± 1.41 |
|               |         | acc_norm | 66.80 | ± 1.37 |
| hellaswag     | 0       | acc      | 66.83 | ± 0.46 |
|               |         | acc_norm | 85.66 | ± 0.34 |
| gsm8k         | 0       | acc      | 53.52 | ± 1.37 |
| winogrande    | 0       | acc      | 81.53 | ± 1.09 |
| mmlu          | 0       | acc      | 64.51 | ± 1.00 |

Average: 67.51%

### TruthfulQA

| Task          | Version | Metric | Value | Stderr |
|---------------|---------|--------|-------|--------|
| truthfulqa_mc | 1       | mc1    | 35.98 | ± 1.68 |
|               |         | mc2    | 53.05 | ± 1.53 |
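The 67.51% average quoted above can be reproduced from six leaderboard scores, taking `acc_norm` where both metrics are reported and `mc2` for TruthfulQA:

```python
# Open LLM Leaderboard average: mean of the six benchmark scores above
scores = {
    "ARC (acc_norm)": 66.80,
    "HellaSwag (acc_norm)": 85.66,
    "MMLU (acc)": 64.51,
    "TruthfulQA (mc2)": 53.05,
    "Winogrande (acc)": 81.53,
    "GSM8K (acc)": 53.52,
}
average = sum(scores.values()) / len(scores)
print(f"Average: {average:.2f}%")  # Average: 67.51%
```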

## 🧩 Configuration

```yaml
base_model: teknium/OpenHermes-2.5-Mistral-7B
gate_mode: hidden
dtype: bfloat16
experts:
  - source_model: abideen/NexoNimbus-7B
    positive_prompts:
      - "Mathematics"
      - "Physics"
      - "Chemistry"
      - "Biology"
      - "Medicine"
      - "Engineering"
      - "Computer Science"
    negative_prompts:
      - "History"
      - "Philosophy"
      - "Linguistics"
      - "Literature"
      - "Art and Art History"
      - "Music Theory and Composition"
      - "Performing Arts (Theater, Dance)"
  - source_model: mlabonne/NeuralMarcoro14-7B
    positive_prompts:
      - "Earth Sciences (Geology, Meteorology, Oceanography)"
      - "Environmental Science"
      - "Astronomy and Space Science"
      - "Psychology"
      - "Sociology"
      - "Anthropology"
      - "Political Science"
      - "Economics"
    negative_prompts:
      - "Education"
      - "Law"
      - "Theology and Religious Studies"
      - "Communication Studies"
      - "Business and Management"
      - "Agricultural Sciences"
      - "Nutrition and Food Science"
      - "Sports Science"
```
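With `gate_mode: hidden`, mergekit initializes each router from hidden states of the positive and negative prompts instead of training it. A rough conceptual sketch of that idea, assuming stand-in embedding arrays (this is not mergekit's actual implementation):

```python
import numpy as np

def init_gate_row(pos_hidden, neg_hidden):
    """One router row: point toward positive-prompt states, away from negatives.

    pos_hidden / neg_hidden: (n_prompts, d) hidden states for one expert's
    positive and negative prompt lists.
    """
    direction = pos_hidden.mean(axis=0) - neg_hidden.mean(axis=0)
    return direction / np.linalg.norm(direction)  # unit-norm routing vector

rng = np.random.default_rng(0)
row = init_gate_row(rng.standard_normal((7, 16)), rng.standard_normal((7, 16)))
print(row.shape)  # (16,)
```

This is why each expert's prompt lists matter: tokens whose hidden states resemble the positive prompts score highly on that expert's router row.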

## 💻 Usage

Here's a Colab notebook to run NexoNimbus-MoE-2x7B in 4-bit precision on a free T4 GPU.

```python
!pip install -qU transformers bitsandbytes accelerate
```

```python
from transformers import AutoTokenizer
import transformers
import torch

model = "abideen/NexoNimbus-MoE-2x7B"

# Load the tokenizer and build a text-generation pipeline in 4-bit precision
tokenizer = AutoTokenizer.from_pretrained(model)
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    model_kwargs={"torch_dtype": torch.float16, "load_in_4bit": True},
)

# Format the conversation with the model's chat template, then sample a reply
messages = [{"role": "user", "content": "Explain what is data science."}]
prompt = pipeline.tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
```

Output:

> "Data science is an interdisciplinary field that combines mathematics, statistics, computer science, and domain expertise in order to extract meaningful insights and knowledge from structured and unstructured data. It involves the process of collecting, cleaning, transforming, analyzing, and visualizing data in order to identify patterns, trends, and relationships that can inform decision-making and drive business strategies. Data scientists use various tools and techniques, such as machine learning, deep learning, and natural language processing, to develop predictive models, optimize processes, and automate decision-making. The field of data science is rapidly evolving as more and more data is generated and the demand for data-driven insights continues to grow."