---
license: llama2
language:
- en
datasets:
- AGBonnet/augmented-clinical-notes
base_model: epfl-llm/meditron-7b
---

# Model Card for MediNote-7B-v1.0

MediNote is a suite of open-source medical Large Language Models (LLMs) fine-tuned for clinical note generation from the [Meditron](https://arxiv.org/abs/2311.16079) foundation model. MediNote-7B is a 7-billion-parameter model trained to generate clinical notes from doctor-patient conversations.

## Model Details

- **Developed by:** [Antoine Bonnet](https://huggingface.co/AGBonnet) and Paul Boulenger
- **Model type:** Causal decoder-only transformer language model
- **Language(s):** English only
- **Model License:** [LLAMA 2 COMMUNITY LICENSE AGREEMENT](https://huggingface.co/meta-llama/Llama-2-70b/raw/main/LICENSE.txt)
- **Code License:** [MIT](https://opensource.org/license/mit/)
- **Fine-tuned from model:** [Meditron-7B-v1.0](https://huggingface.co/epfl-llm/meditron-7b)
- **Context length:** 2K tokens
- **Input:** Patient-doctor conversation transcripts (text)
- **Output:** Clinical notes (text)
- **Repository:** [EPFL-IC-Make-Team/ClinicalNotes](https://github.com/EPFL-IC-Make-Team/ClinicalNotes)
- **Trainer:** [epflLLM/Megatron-LLM](https://github.com/epfLLM/Megatron-LLM)
- **Paper:** *[MediNote: Automatic Clinical Notes]()*

*Figure: Model pipeline*

## Uses

### Direct Use

This model can be used to generate clinical notes from patient-doctor conversation transcripts, which is useful for experimentation and for understanding its capabilities. It should not be used directly in production or for work that may impact people.

### Downstream Use

### Out-of-Scope Use

We do not recommend using this model for natural language generation in a production environment, fine-tuned or otherwise.

### Recommendations

## Citation

**BibTeX:** If you use MediNote or its training data, please cite our work:

```
ADD CITATION
```
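## How to Use

A minimal sketch of generating a clinical note from a conversation transcript with the Hugging Face Transformers library. The repository id `"AGBonnet/medinote-7b"` and the prompt template below are illustrative assumptions, not a documented interface of this model; adjust them to the actual checkpoint name and the prompt format used during fine-tuning.

```python
def build_prompt(conversation: str) -> str:
    """Wrap a patient-doctor transcript into a note-generation prompt.

    NOTE: this template is a guess for illustration; the model card does
    not specify the exact prompt format used during fine-tuning.
    """
    return (
        "Summarize the following patient-doctor conversation "
        "into a clinical note.\n\n"
        f"Conversation:\n{conversation}\n\nClinical note:\n"
    )


def generate_note(conversation: str, model_id: str = "AGBonnet/medinote-7b") -> str:
    """Load the model and generate a clinical note (hypothetical repo id)."""
    # Imported lazily so the prompt helper above stays dependency-free.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)

    inputs = tokenizer(build_prompt(conversation), return_tensors="pt")
    # Keep the prompt plus generated note within the 2K-token context window.
    output_ids = model.generate(**inputs, max_new_tokens=512)
    # Decode only the newly generated tokens, skipping the prompt.
    new_tokens = output_ids[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

As noted above, this model targets experimentation only and should not be wired into production or patient-facing workflows.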