---
language: es
tags:
- Spanish
- BART
- biology
- medical
- seq2seq
license: mit
thumbnail: https://huggingface.co/Narrativa/NarbioBART/resolve/main/NarbioBART-logo.png
---
<div style="text-align:center;width:250px;height:250px;">
<img src="https://huggingface.co/Narrativa/NarbioBART/resolve/main/NarbioBART-logo.png" alt="NarbioBART logo">
</div>
## 🦠 NarbioBART 🏥
**NarbioBART** (base) is a BART-like model trained on the [Spanish Biomedical Crawled Corpus](https://zenodo.org/record/5510033#.Yhdk1ZHMLJx).
BART is a transformer *encoder-decoder* (seq2seq) model with a bidirectional (BERT-like) encoder and an autoregressive (GPT-like) decoder. BART is pre-trained by (1) corrupting text with an arbitrary noising function and (2) learning a model to reconstruct the original text.
This model is particularly effective when fine-tuned for text generation tasks (e.g., summarization, translation) but also works well for comprehension tasks (e.g., text classification, question answering).
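As a sketch of that fine-tuning path, the snippet below outlines summarization training with the `Seq2SeqTrainer` API. The dataset (`my_dataset`, with `text`/`summary` columns) and all hyperparameters are illustrative placeholders, not part of this model's release:

```py
from transformers import (
    BartForConditionalGeneration,
    BartTokenizer,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

model_id = "Narrativa/NarbioBART"
tokenizer = BartTokenizer.from_pretrained(model_id)
model = BartForConditionalGeneration.from_pretrained(model_id)

def preprocess(batch):
    # Tokenize source documents and target summaries.
    inputs = tokenizer(batch["text"], max_length=1024, truncation=True)
    targets = tokenizer(text_target=batch["summary"], max_length=128, truncation=True)
    inputs["labels"] = targets["input_ids"]
    return inputs

# `my_dataset` is assumed to be a datasets.DatasetDict with train/validation splits.
tokenized = my_dataset.map(preprocess, batched=True,
                           remove_columns=my_dataset["train"].column_names)

training_args = Seq2SeqTrainingArguments(
    output_dir="narbiobart-summarization",  # placeholder output path
    per_device_train_batch_size=8,
    num_train_epochs=3,
    predict_with_generate=True,
)

trainer = Seq2SeqTrainer(
    model=model,
    args=training_args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["validation"],
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```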
## Training details
- Dataset: `Spanish Biomedical Crawled Corpus` - 90% for training / 10% for validation (a reproduction sketch follows this list).
- Training script: see [here](https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/run_bart_dlm_flax.py)
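A 90/10 split like the one above can be produced with the `datasets` library; the file name and seed below are placeholders, not the values used for this checkpoint:

```py
from datasets import load_dataset

# Load the crawled corpus as plain text (path is a placeholder).
corpus = load_dataset("text", data_files={"train": "spanish_biomed_crawl.txt"})

# 90% train / 10% validation, matching the split described above.
splits = corpus["train"].train_test_split(test_size=0.1, seed=42)  # seed is illustrative
train_ds, valid_ds = splits["train"], splits["test"]
```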
## [Evaluation metrics](https://huggingface.co/mrm8488/bart-bio-base-es/tensorboard?params=scalars#frame) 🧾
| Metric | Value |
|-------|---------|
|Accuracy| 0.802|
|Loss| 1.04|
## Benchmarks 🔨
WIP 🚧
## How to use with `transformers`
```py
from transformers import BartForConditionalGeneration, BartTokenizer

model_id = "Narrativa/NarbioBART"

# forced_bos_token_id=0 forces <s> (token id 0) as the first generated token,
# as expected for mask infilling with BART checkpoints.
model = BartForConditionalGeneration.from_pretrained(model_id, forced_bos_token_id=0)
tokenizer = BartTokenizer.from_pretrained(model_id)

def fill_mask_span(text):
    # Encode the input, regenerate the full sequence, and decode it.
    batch = tokenizer(text, return_tensors="pt")
    generated_ids = model.generate(batch["input_ids"])
    print(tokenizer.batch_decode(generated_ids, skip_special_tokens=True))

text = "your text with a <mask> token."
fill_mask_span(text)
```
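Note that `generate` is used here rather than the standard `fill-mask` pipeline: BART's text-infilling objective lets a single `<mask>` be replaced by a span of several tokens, which token-level mask filling cannot express.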
## Citation
```
@misc {narrativa_2023,
author = { {Narrativa} },
title = { NarbioBART (Revision c9a4e07) },
year = 2023,
url = { https://huggingface.co/Narrativa/NarbioBART },
doi = { 10.57967/hf/0500 },
publisher = { Hugging Face }
}
```