Model Card for BiMediX-Arabic

Model Details

  • Name: BiMediX
  • Version: 1.0
  • Type: Arabic Medical Mixture of Experts Large Language Model (LLM)
  • Languages: Arabic
  • Model Architecture: Mixtral-8x7B-Instruct-v0.1
  • Training Data: BiMed1.3M-Arabic, an Arabic dataset of diverse medical interactions.

Intended Use

  • Primary Use: Medical interactions in Arabic.
  • Capabilities: multiple-choice question answering (MCQA), closed QA, and multi-turn chat.

Getting Started

from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "BiMediX/BiMediX-Ara"

# Load the tokenizer and model weights from the Hugging Face Hub.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Arabic prompt: "Hello BiMediX! I have been suffering from increasing fatigue over the past week."
text = "مرحبًا بيميديكس! لقد كنت أعاني من التعب المتزايد في الأسبوع الماضي."
inputs = tokenizer(text, return_tensors="pt")

# Generate and decode the model's reply.
outputs = model.generate(**inputs, max_new_tokens=500)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
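
The snippet above feeds the prompt to the model raw. If the tokenizer ships with a chat template (the base Mixtral-8x7B-Instruct-v0.1 tokenizer wraps user turns in [INST] ... [/INST] tags), the prompt can instead be formatted through it. A minimal sketch, assuming such a template is present on this checkpoint:

# Format the user turn with the tokenizer's chat template before generating.
messages = [{"role": "user", "content": text}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)
outputs = model.generate(input_ids, max_new_tokens=500)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))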

Training Procedure

  • Dataset: BiMed1.3M-Arabic.
  • QLoRA Adaptation: learnable low-rank adapter weights are injected into the experts and the routing network, so only about 4% of the original parameters are trained; a sketch of this setup follows the list.
  • Training Resources: the model was fine-tuned on the Arabic corpus described above.
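
The card does not publish the exact training configuration; the following is a minimal sketch of a QLoRA setup in this spirit, using the PEFT and bitsandbytes libraries. The base checkpoint name is real, but the rank, alpha, dropout, and target-module list (Mixtral names its expert projections w1/w2/w3 and its router gate) are illustrative assumptions, not the authors' values.

import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

# Quantize the frozen base model to 4-bit NF4 (the "Q" in QLoRA).
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)
base = AutoModelForCausalLM.from_pretrained(
    "mistralai/Mixtral-8x7B-Instruct-v0.1",
    quantization_config=bnb_config,
    device_map="auto",
)
base = prepare_model_for_kbit_training(base)

# Attach learnable low-rank adapters to the expert projections and the router.
# r, lora_alpha, and lora_dropout are assumed values for illustration.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["w1", "w2", "w3", "gate"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(base, lora_config)
model.print_trainable_parameters()  # trainable fraction on the order of a few percent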

Model Performance

Accuracy (%) per benchmark. CKG = Clinical Knowledge, CBio = College Biology, CMed = College Medicine, MedGen = Medical Genetics, ProMed = Professional Medicine, Ana = Anatomy (MMLU medical subsets).

Model                 CKG   CBio  CMed  MedGen  ProMed  Ana   MedMCQA  MedQA  PubmedQA  AVG
Jais-30B              52.1  50.7  40.5  49.0    39.3    43.0  37.0     28.8   74.6      46.1
BiMediX (Arabic)      60.0  54.9  55.5  58.0    58.1    49.6  46.0     40.2   76.6      55.4
BiMediX (Bilingual)   63.8  57.6  52.6  64.0    52.9    50.4  49.1     47.3   78.4      56.5

Safety and Ethical Considerations

  • Potential issues: the model may hallucinate, produce inaccurate or toxic content, and reproduce stereotypes present in its training data.
  • Usage: intended for research purposes only; not for clinical decision-making or as a substitute for professional medical advice.

Accessibility

Authors

Sara Pieri, Sahal Shaji Mullappilly, Fahad Shahbaz Khan, Rao Muhammad Anwer, Salman Khan, Timothy Baldwin, Hisham Cholakkal
Mohamed Bin Zayed University of Artificial Intelligence (MBZUAI)

Model size: 46.6B parameters (Safetensors, F32)