---
license: llama2
language:
- en
datasets:
- AGBonnet/augmented-clinical-notes
base_model: epfl-llm/llama-2-13b-hf
pipeline_tag: text-generation
---
<img width=20% src="medinote.png" title="logo">
# Model Card for MediNote-13B-v1.0
MediNote is a suite of open-source medical Large Language Models (LLMs) fine-tuned from the [MediTron](https://arxiv.org/abs/2311.16079) foundation model for clinical note generation.
MediNote-13B is a 13-billion-parameter model trained to generate clinical notes from doctor-patient conversations.
## Model Details
- **Developed by:** [Antoine Bonnet](https://huggingface.co/AGBonnet) and Paul Boulenger
- **Model type:** Causal decoder-only transformer language model
- **Language(s):** English only
- **Model License:** [LLAMA 2 COMMUNITY LICENSE AGREEMENT](https://huggingface.co/meta-llama/Llama-2-70b/raw/main/LICENSE.txt)
- **Code License:** [MIT](https://opensource.org/license/mit/)
- **Fine-tuned from model:** [Llama-2-13B](https://huggingface.co/meta-llama/Llama-2-13b-hf) with continued pre-training on PubMed Central (MediTron-13B equivalent)
- **Context length:** 2K tokens
- **Input:** Text-only data
- **Output:** Model generates text only
### Model Sources
- **Repository:** [EPFL-IC-Make-Team/ClinicalNotes](https://github.com/EPFL-IC-Make-Team/ClinicalNotes)
- **Trainer:** [epflLLM/Megatron-LLM](https://github.com/epfLLM/Megatron-LLM)
- **Paper:** *[MediNote: Automatic Clinical Notes]()*
## Uses
### Direct Use
You can use this model to generate clinical notes, which is useful for experimentation and for understanding its capabilities.
It should not be used directly in production or for work that may impact people.
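For experimentation, the model can be loaded with the Hugging Face `transformers` library. The sketch below is illustrative only: the repository id (`AGBonnet/medinote-13b`) and the prompt template are assumptions, not confirmed by this card, so adjust them to match the actual checkpoint and training format.

```python
def build_prompt(dialogue: str) -> str:
    """Format a doctor-patient dialogue into a note-generation prompt.

    The template here is a hypothetical example, not necessarily the
    exact format used during fine-tuning.
    """
    return (
        "Summarize the following dialogue into a clinical note:\n\n"
        f"{dialogue}\n\nClinical note:"
    )


def generate_note(dialogue: str, model_name: str = "AGBonnet/medinote-13b") -> str:
    """Generate a clinical note from a dialogue.

    Heavy operation: downloads ~13B weights on first use and requires
    `transformers`, `torch`, and ideally a GPU. The repository id is an
    assumption; replace it with the real one.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForCausalLM.from_pretrained(model_name, device_map="auto")
    inputs = tokenizer(build_prompt(dialogue), return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=512)
    # Decode only the newly generated tokens, skipping the prompt.
    return tokenizer.decode(
        outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )


if __name__ == "__main__":
    dialogue = (
        "Doctor: What brings you in today?\n"
        "Patient: I've had a persistent cough for two weeks."
    )
    print(build_prompt(dialogue))
```

Note the 2K-token context limit: long dialogues may need truncation or summarization before prompting.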
### Downstream Use
### Out-of-Scope Use
We do not recommend using this model for natural language generation in a production environment, fine-tuned or otherwise.
### Recommendations
## Citation
If you use MediNote or its training data, please cite our work:
**BibTeX:**
```
ADD CITATION
```