
# Mediquad-20B

Mediquad-20B is a Mixture of Experts (MoE) model made with the following models:

- [epfl-llm/meditron-7b](https://huggingface.co/epfl-llm/meditron-7b)
- [chaoyi-wu/PMC_LLAMA_7B_10_epoch](https://huggingface.co/chaoyi-wu/PMC_LLAMA_7B_10_epoch)
- [allenai/tulu-2-dpo-7b](https://huggingface.co/allenai/tulu-2-dpo-7b)
- [microsoft/Orca-2-7b](https://huggingface.co/microsoft/Orca-2-7b)

## Evaluations

| Benchmark      | Mediquad-4x7b | meditron-7b | Orca-2-7b | meditron-70b |
| -------------- | ------------- | ----------- | --------- | ------------ |
| MedMCQA        |               |             |           |              |
| ClosedPubMedQA |               |             |           |              |
| PubMedQA       |               |             |           |              |
| MedQA          |               |             |           |              |
| MedQA4         |               |             |           |              |
| MedicationQA   |               |             |           |              |
| MMLU Medical   |               |             |           |              |
| TruthfulQA     |               |             |           |              |
| GSM8K          |               |             |           |              |
| ARC            |               |             |           |              |
| HellaSwag      |               |             |           |              |
| Winogrande     |               |             |           |              |
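
Scores for these benchmarks are not reproduced in this card. As a hedged sketch of how such numbers are typically obtained, the snippet below uses EleutherAI's lm-evaluation-harness; the task names and arguments are assumptions and should be checked against the harness's task registry before running.

```python
# Hedged sketch: evaluating the merged model with lm-evaluation-harness.
# Task names (e.g. "medmcqa", "pubmedqa") are assumptions about the harness's
# registry; verify them against the installed version before running.
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",  # Hugging Face transformers backend
    model_args="pretrained=Technoculture/Mediquad-20B,dtype=bfloat16",
    tasks=["medmcqa", "pubmedqa", "gsm8k", "arc_challenge", "hellaswag", "winogrande"],
    batch_size=8,
)
print(results["results"])  # per-task metrics as a nested dict
```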

## 🧩 Configuration

```yaml
gate_mode: hidden
dtype: bfloat16
experts:
  - source_model: epfl-llm/meditron-7b
    positive_prompts:
      - "How does sleep affect cardiovascular health?"
      - "When discussing diabetes management, the key factors to consider are"
      - "The differential diagnosis for a headache with visual aura could include"
    negative_prompts:
      - "What are the environmental impacts of deforestation?"
      - "The recent advancements in artificial intelligence have led to developments in"
  - source_model: chaoyi-wu/PMC_LLAMA_7B_10_epoch
    positive_prompts:
      - "How would you explain the importance of hypertension management to a patient?"
      - "Describe the recovery process after knee replacement surgery in layman's terms."
    negative_prompts:
      - "Recommend a good recipe for a vegetarian lasagna."
      - "The recent advancements in artificial intelligence have led to developments in"
      - "The fundamental concepts in economics include ideas like supply and demand, which explain"
  - source_model: allenai/tulu-2-dpo-7b
    positive_prompts:
      - "Here is a funny joke for you -"
      - "When considering the ethical implications of artificial intelligence, one must take into account"
      - "In strategic planning, a company must analyze its strengths and weaknesses, which involves"
      - "Understanding consumer behavior in marketing requires considering factors like"
      - "The debate on climate change solutions hinges on arguments that"
    negative_prompts:
      - "In discussing dietary adjustments for managing hypertension, it's crucial to emphasize"
      - "For early detection of melanoma, dermatologists recommend that patients regularly check their skin for"
      - "Explaining the importance of vaccination, a healthcare professional should highlight"
  - source_model: microsoft/Orca-2-7b
    positive_prompts:
      - "Given the riddle above,"
      - "Given the above context deduce the outcome:"
      - "The logical flaw in the above paragraph is"
    negative_prompts:
      - "In discussing dietary adjustments for managing hypertension, it's crucial to emphasize"
      - "For early detection of melanoma, dermatologists recommend that patients regularly check their skin for"
      - "Explaining the importance of vaccination, a healthcare professional should highlight"
```

## 💻 Usage

```python
!pip install -qU transformers bitsandbytes accelerate

from transformers import AutoTokenizer
import transformers
import torch

model = "Technoculture/Mediquad-20B"

tokenizer = AutoTokenizer.from_pretrained(model)
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    model_kwargs={"torch_dtype": torch.float16, "load_in_4bit": True},
)

messages = [{"role": "user", "content": "Explain what a Mixture of Experts is in less than 100 words."}]
prompt = pipeline.tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
```
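
The pipeline call above relies on `load_in_4bit=True` being forwarded to the model loader, which requires `bitsandbytes`. As an alternative sketch not taken from the original card, the quantization can be configured explicitly before building the pipeline; the parameter choices below are assumptions:

```python
# Hedged sketch: explicit 4-bit quantization via BitsAndBytesConfig instead of
# passing load_in_4bit through model_kwargs.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig, pipeline

model_id = "Technoculture/Mediquad-20B"
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.float16,  # compute dtype for the 4-bit layers
)
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",  # spread layers across available devices
)
pipe = pipeline("text-generation", model=model, tokenizer=tokenizer)
```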