---
language:
- en
- es
- fr
- it
license: apache-2.0
pretty_name: Multilingual Medical Corpus
tags:
- medical
dataset_info:
features:
- name: text
dtype: string
splits:
- name: en
num_bytes: 7672665166
num_examples: 21226237
- name: es
num_bytes: 6245812986
num_examples: 35444286
- name: fr
num_bytes: 4763269707
num_examples: 7192779
- name: it
num_bytes: 1021535232
num_examples: 3504555
download_size: 10530951092
dataset_size: 19703283091
configs:
- config_name: default
data_files:
- split: en
path: data/en-*
- split: es
path: data/es-*
- split: fr
path: data/fr-*
- split: it
path: data/it-*
---
<p align="center">
<br>
<img src="http://www.ixa.eus/sites/default/files/anitdote.png" style="width: 30%;">
<h2 align="center">Multilingual Medical Corpus</h2>
<br>
<p align="justify">
Multilingual-Medical-Corpus is a 3-billion-word multilingual corpus for training LLMs adapted to the medical domain. It covers four languages: English, Spanish, French, and Italian.
</p>
- 📖 Paper: [Medical mT5: An Open-Source Multilingual Text-to-Text LLM for The Medical Domain](https://arxiv.org/abs/2404.07613)
- 🌐 Project Website: [https://univ-cotedazur.eu/antidote](https://univ-cotedazur.eu/antidote)
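
The corpus is organized into one split per language (`en`, `es`, `fr`, `it`), each with a single `text` field. Below is a minimal sketch of loading it with the 🤗 Datasets library; the repository identifier is an assumption based on this card's name, so adjust it to the actual dataset path if it differs.

```python
# Minimal sketch: stream one language split of the corpus.
# The repo id below is assumed from this card's name; verify it before use.
from datasets import load_dataset

dataset = load_dataset(
    "HiTZ/Multilingual-Medical-Corpus",  # assumed repository id
    split="en",                          # one of: en, es, fr, it
    streaming=True,                      # avoids downloading ~10 GB up front
)

# Peek at the first few documents.
for example in dataset.take(3):
    print(example["text"][:200])
```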
# Corpus Description
- **Developed by**: Iker García-Ferrero, Rodrigo Agerri, Aitziber Atutxa Salazar, Elena Cabrio, Iker de la Iglesia, Alberto Lavelli, Bernardo Magnini, Benjamin Molinet, Johana Ramirez-Romero, German Rigau, Jose Maria Villa-Gonzalez, Serena Villata and Andrea Zaninello
- **Contact**: [Iker García-Ferrero](https://ikergarcia1996.github.io/Iker-Garcia-Ferrero/) and [Rodrigo Agerri](https://ragerri.github.io/)
- **Website**: [https://univ-cotedazur.eu/antidote](https://univ-cotedazur.eu/antidote)
- **Funding**: CHIST-ERA XAI 2019 call. Antidote (PCI2020-120717-2) funded by MCIN/AEI/10.13039/501100011033 and by European Union NextGenerationEU/PRTR
- **Language(s) (NLP)**: English, Spanish, French, Italian
- **License**: apache-2.0
<table border="1" cellspacing="0" cellpadding="5">
<caption>Data sources and word counts by language.</caption>
<thead>
<tr>
<th>Language</th>
<th>Source</th>
<th>Words</th>
</tr>
</thead>
<tbody>
<tr>
<td rowspan="3">English</td>
<td>ClinicalTrials</td>
<td>127.4M</td>
</tr>
<tr>
<td>EMEA</td>
<td>12M</td>
</tr>
<tr>
<td>PubMed</td>
<td>968.4M</td>
</tr>
<tr>
<td rowspan="6">Spanish</td>
<td>EMEA</td>
<td>13.6M</td>
</tr>
<tr>
<td>PubMed</td>
<td>8.4M</td>
</tr>
<tr>
<td>Medical Crawler</td>
<td>918M</td>
</tr>
<tr>
<td>SPACC</td>
<td>350K</td>
</tr>
<tr>
<td>UFAL</td>
<td>10.5M</td>
</tr>
<tr>
<td>WikiMed</td>
<td>5.2M</td>
</tr>
<tr>
<td rowspan="5">French</td>
<td>PubMed</td>
<td>1.4M</td>
</tr>
<tr>
<td>Science Direct</td>
<td>15.2M</td>
</tr>
<tr>
<td>Wikipedia - Médecine</td>
<td>5M</td>
</tr>
<tr>
<td>EDP</td>
<td>48K</td>
</tr>
<tr>
<td>Google Patents</td>
<td>654M</td>
</tr>
<tr>
<td rowspan="13">Italian</td>
<td>Medical Commoncrawl - IT</td>
<td>67M</td>
</tr>
<tr>
<td>Drug instructions</td>
<td>30.5M</td>
</tr>
<tr>
<td>Wikipedia - Medicina</td>
<td>13.3M</td>
</tr>
<tr>
<td>E3C Corpus - IT</td>
<td>11.6M</td>
</tr>
<tr>
<td>Medicine descriptions</td>
<td>6.3M</td>
</tr>
<tr>
<td>Medical theses</td>
<td>5.8M</td>
</tr>
<tr>
<td>Medical websites</td>
<td>4M</td>
</tr>
<tr>
<td>PubMed</td>
<td>2.3M</td>
</tr>
<tr>
<td>Supplement description</td>
<td>1.3M</td>
</tr>
<tr>
<td>Medical notes</td>
<td>975K</td>
</tr>
<tr>
<td>Pathologies</td>
<td>157K</td>
</tr>
<tr>
<td>Medical test simulations</td>
<td>26K</td>
</tr>
<tr>
<td>Clinical cases</td>
<td>20K</td>
</tr>
</tbody>
</table>
# Open-Source Models Trained with Multilingual-Medical-Corpus
<table border="1" cellspacing="0" cellpadding="5">
<thead>
<tr>
<th></th>
<th><a href="https://huggingface.co/HiTZ/Medical-mT5-large">HiTZ/Medical-mT5-large</a></th>
<th><a href="https://huggingface.co/HiTZ/Medical-mT5-xl">HiTZ/Medical-mT5-xl</a></th>
<th><a href="https://huggingface.co/HiTZ/Medical-mT5-large-multitask">HiTZ/Medical-mT5-large-multitask</a></th>
<th><a href="https://huggingface.co/HiTZ/Medical-mT5-xl-multitask">HiTZ/Medical-mT5-xl-multitask</a></th>
</tr>
</thead>
<tbody>
<tr>
<td>Param. no.</td>
<td>738M</td>
<td>3B</td>
<td>738M</td>
<td>3B</td>
</tr>
<tr>
<td>Task</td>
<td>Language Modeling</td>
<td>Language Modeling</td>
<td>Multitask Sequence Labeling</td>
<td>Multitask Sequence Labeling</td>
</tr>
</tbody>
</table>
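
All four models follow the mT5 text-to-text architecture, so they can presumably be loaded with the generic 🤗 Transformers seq2seq classes. The following is a minimal sketch using `HiTZ/Medical-mT5-large`; the example prompt and generation settings are illustrative only, so check each model card for the intended usage.

```python
# Minimal sketch: load one of the Medical-mT5 checkpoints listed above.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "HiTZ/Medical-mT5-large"  # any of the four checkpoints above
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Illustrative prompt; real usage depends on the task each checkpoint targets.
inputs = tokenizer("The patient was diagnosed with", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```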
## Citation
```bibtex
@misc{garcíaferrero2024medical,
title={Medical mT5: An Open-Source Multilingual Text-to-Text LLM for The Medical Domain},
author={Iker García-Ferrero and Rodrigo Agerri and Aitziber Atutxa Salazar and Elena Cabrio and Iker de la Iglesia and Alberto Lavelli and Bernardo Magnini and Benjamin Molinet and Johana Ramirez-Romero and German Rigau and Jose Maria Villa-Gonzalez and Serena Villata and Andrea Zaninello},
year={2024},
eprint={2404.07613},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```