---
license: cc-by-nc-nd-4.0
language:
- en
library_name: transformers
tags:
- reward model
- RLHF
- medical
---

# JSL-MedMNX-7B-SFT

JSL-MedMNX-7B-SFT is a 7 billion parameter model developed by [John Snow Labs](https://www.johnsnowlabs.com/). It was SFT-finetuned on an 11k-example medical dataset in Alpaca format, starting from the base model [JSL-MedMNX-7B](https://huggingface.co/johnsnowlabs/JSL-MedMNX-7B). Check out its performance on the [Open Medical LLM Leaderboard](https://huggingface.co/spaces/openlifescienceai/open_medical_llm_leaderboard).

This model is available under a [CC-BY-NC-ND](https://creativecommons.org/licenses/by-nc-nd/4.0/deed.en) license and must also conform to this [Acceptable Use Policy](https://huggingface.co/johnsnowlabs). If you need to license this model for commercial use, please contact us at info@johnsnowlabs.com.

## 💻 Usage

```python
!pip install -qU transformers accelerate

from transformers import AutoTokenizer
import transformers
import torch

model = "johnsnowlabs/JSL-MedMNX-7B-SFT"
messages = [{"role": "user", "content": "What is a large language model?"}]

# Build the prompt with the model's chat template
tokenizer = AutoTokenizer.from_pretrained(model)
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

# Load the model in half precision and generate a response
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    torch_dtype=torch.float16,
    device_map="auto",
)

outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
```

## 🏆 Evaluation

|             Tasks             |Version|Filter|n-shot| Metric |Value |   |Stderr|
|-------------------------------|-------|------|-----:|--------|-----:|---|-----:|
|stem                           |N/A    |none  |     0|acc_norm|0.5209|±  |0.0068|
|                               |       |none  |     0|acc     |0.5675|±  |0.0058|
| - medmcqa                     |Yaml   |none  |     0|acc     |0.5152|±  |0.0077|
|                               |       |none  |     0|acc_norm|0.5152|±  |0.0077|
| - medqa_4options              |Yaml   |none  |     0|acc     |0.5397|±  |0.0140|
|                               |       |none  |     0|acc_norm|0.5397|±  |0.0140|
| - anatomy (mmlu)              |      0|none  |     0|acc     |0.6593|±  |0.0409|
| - clinical_knowledge (mmlu)   |      0|none  |     0|acc     |0.7245|±  |0.0275|
| - college_biology (mmlu)      |      0|none  |     0|acc     |0.7431|±  |0.0365|
| - college_medicine (mmlu)     |      0|none  |     0|acc     |0.6532|±  |0.0363|
| - medical_genetics (mmlu)     |      0|none  |     0|acc     |0.7300|±  |0.0446|
| - professional_medicine (mmlu)|      0|none  |     0|acc     |0.7206|±  |0.0273|
| - pubmedqa                    |      1|none  |     0|acc     |0.7720|±  |0.0188|

|Groups|Version|Filter|n-shot| Metric |Value |   |Stderr|
|------|-------|------|-----:|--------|-----:|---|-----:|
|stem  |N/A    |none  |     0|acc_norm|0.5209|±  |0.0068|
|      |       |none  |     0|acc     |0.5675|±  |0.0058|
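
The tables above follow the output format of EleutherAI's lm-evaluation-harness, which also backs the Open Medical LLM Leaderboard. As an unofficial sketch, the zero-shot medical benchmarks could be re-run locally roughly as follows; the task names, flags, and package version are assumptions about the harness's standard interface, not details from this model card:

```python
# Unofficial sketch: re-running the zero-shot medical benchmarks with
# EleutherAI's lm-evaluation-harness. Task names and CLI flags are assumed
# from the harness's standard interface and may differ across versions.
!pip install -qU lm-eval

!lm_eval --model hf \
    --model_args pretrained=johnsnowlabs/JSL-MedMNX-7B-SFT,dtype=float16 \
    --tasks medmcqa,medqa_4options,pubmedqa,mmlu_anatomy,mmlu_clinical_knowledge,mmlu_college_biology,mmlu_college_medicine,mmlu_medical_genetics,mmlu_professional_medicine \
    --num_fewshot 0 \
    --batch_size 8 \
    --device cuda:0
```

Per-task numbers may vary slightly with harness version, batch size, and hardware.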