---
language: en
tags:
- question-answering
---

# ReAtt
ReAtt is a retrieval-augmented model for knowledge-intensive tasks proposed in [Retrieval as Attention: End-to-end Learning of Retrieval and Reading within a Single Transformer](https://arxiv.org/pdf/2212.02027.pdf). The original GitHub repository is [https://github.com/jzbjyb/ReAtt](https://github.com/jzbjyb/ReAtt).

## Description
`neulab/reatt-large-nq` is based on the T5 architecture: it is initialized from `google/t5-large-lm-adapt` and fine-tuned on Natural Questions with end-to-end retrieval-augmented training.

## Usage
Please refer to [https://github.com/jzbjyb/ReAtt](https://github.com/jzbjyb/ReAtt) for instructions to use this model.
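Since the checkpoint is published in the standard Hugging Face format, it can at least be loaded with the `transformers` library. The sketch below is an assumption on my part: it treats the checkpoint as a plain T5 sequence-to-sequence model and skips ReAtt's retrieval-as-attention mechanism entirely, and the input format shown is illustrative rather than taken from the paper. Full retrieval-augmented inference requires the code in the repository above.

```python
# Hedged sketch: load neulab/reatt-large-nq as a plain T5 checkpoint.
# This ignores ReAtt's retrieval component; see the GitHub repository
# for the intended end-to-end retrieval-and-reading usage.
from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("neulab/reatt-large-nq")
model = T5ForConditionalGeneration.from_pretrained("neulab/reatt-large-nq")

# Illustrative reader-style input (the prompt format is an assumption).
inputs = tokenizer(
    "question: where is the Eiffel Tower located?",
    return_tensors="pt",
)
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```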

## Reference

```bibtex
@inproceedings{jiang-etal-2022-reatt,
    title = {Retrieval as Attention: End-to-end Learning of Retrieval and Reading within a Single Transformer},
    author = {Zhengbao Jiang and Luyu Gao and Jun Araki and Haibo Ding and Zhiruo Wang and Jamie Callan and Graham Neubig},
    booktitle = {Conference on Empirical Methods in Natural Language Processing (EMNLP)},
    address = {Abu Dhabi, UAE},
    month = {December},
    year = {2022}
}
```