---
license: mit
---

## AMRBART (large-sized model)

AMRBART is a BART-based model that was continually pre-trained on English text and AMR graphs. It was introduced in the paper [Graph Pre-training for AMR Parsing and Generation](https://arxiv.org/pdf/2203.07836.pdf) by Bai et al. and first released in [this repository](https://github.com/muyeby/AMRBART).