---
tags:
- question-answering
- roberta
- adapter-transformers
datasets:
- drop
language:
- en
---

# Adapter `AdapterHub/roberta-base-pf-drop` for roberta-base

An [adapter](https://adapterhub.ml) for the `roberta-base` model that was trained on the [drop](https://huggingface.co/datasets/drop/) dataset and includes a prediction head for question answering.

This adapter was created for use with the **[adapter-transformers](https://github.com/Adapter-Hub/adapter-transformers)** library.

## Usage

First, install `adapter-transformers`:

```bash
pip install -U adapter-transformers
```

_Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. [More](https://docs.adapterhub.ml/installation.html)_

Now, the adapter can be loaded and activated like this:

```python
from transformers import AutoModelWithHeads

# Load the base model with support for flexible prediction heads
model = AutoModelWithHeads.from_pretrained("roberta-base")

# Fetch the adapter (and its QA head) from the Hugging Face Hub, then activate it
adapter_name = model.load_adapter("AdapterHub/roberta-base-pf-drop", source="hf")
model.active_adapters = adapter_name
```
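
Once the adapter is active, the model can answer questions over a passage. The snippet below is a minimal inference sketch that is not part of the original card: the question and context strings are made up, and it assumes the adapter's flex head returns `start_logits` and `end_logits` like a standard `transformers` question answering model.

```python
import torch
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("roberta-base")

# Hypothetical example inputs, for illustration only
question = "How many points were scored in the second quarter?"
context = "The home team scored 14 points in the second quarter and 21 in the third."

inputs = tokenizer(question, context, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)  # uses the adapter activated above

# Assumed: the QA head yields start/end logits over the input tokens
start = torch.argmax(outputs.start_logits)
end = torch.argmax(outputs.end_logits)
answer = tokenizer.decode(inputs["input_ids"][0][start : end + 1])
print(answer)
```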

## Architecture & Training

The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer.
In particular, training configurations for all tasks can be found [here](https://github.com/adapter-hub/efficient-task-transfer/tree/master/run_configs).

## Evaluation results

Refer to [the paper](https://arxiv.org/pdf/2104.08247) for more information on results.

## Citation

If you use this adapter, please cite our paper ["What to Pre-Train on? Efficient Intermediate Task Selection"](https://arxiv.org/pdf/2104.08247):

```bibtex
@inproceedings{poth-etal-2021-pre,
    title = "{W}hat to Pre-Train on? {E}fficient Intermediate Task Selection",
    author = {Poth, Clifton and
      Pfeiffer, Jonas and
      R{\"u}ckl{\'e}, Andreas and
      Gurevych, Iryna},
    booktitle = "Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing",
    month = nov,
    year = "2021",
    address = "Online and Punta Cana, Dominican Republic",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2021.emnlp-main.827",
    pages = "10585--10605",
}
```