---
license: mit
---
## AMRBART (large-sized model)

AMRBART is continually pre-trained on English text and AMR graphs on top of the BART model. It was introduced in the paper [Graph Pre-training for AMR Parsing and Generation](https://arxiv.org/pdf/2203.07836.pdf) by Bai et al. and first released in [this repository](https://github.com/muyeby/AMRBART).
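
As a rough sketch, the checkpoint can be loaded like any BART-style seq2seq model with the `transformers` library. The model ID `xfbai/AMRBART-large` and the example sentence below are assumptions for illustration; the released checkpoints and the exact AMR pre-/post-processing are documented in the repository linked above.

```python
# A minimal sketch of loading AMRBART as a BART-style seq2seq model.
# The checkpoint name "xfbai/AMRBART-large" is an assumption for
# illustration; check the linked repository for the released
# checkpoints and the pre-/post-processing they expect.
from transformers import AutoTokenizer, BartForConditionalGeneration

model_id = "xfbai/AMRBART-large"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = BartForConditionalGeneration.from_pretrained(model_id)

# Encode an English sentence and run generation. For actual AMR parsing
# (text -> graph) or AMR-to-text generation (graph -> text), the
# fine-tuning scripts in the repository define the linearised graph
# format used on the input/output side.
inputs = tokenizer("The boy wants the girl to believe him.", return_tensors="pt")
output_ids = model.generate(**inputs, max_length=128, num_beams=5)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```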

