1 ---
2 tags:
3 - bert
4 - adapterhub:comsense/copa
5 - adapter-transformers
6 language:
7 - en
8 ---
9
10 # Adapter `AdapterHub/bert-base-uncased-pf-copa` for bert-base-uncased
11
12 An [adapter](https://adapterhub.ml) for the `bert-base-uncased` model that was trained on the [comsense/copa](https://adapterhub.ml/explore/comsense/copa/) dataset and includes a prediction head for multiple choice.
13
14 This adapter was created for usage with the **[adapter-transformers](https://github.com/Adapter-Hub/adapter-transformers)** library.
15
16 ## Usage
17
18 First, install `adapter-transformers`:
19
20 ```
21 pip install -U adapter-transformers
22 ```
23 _Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. [More](https://docs.adapterhub.ml/installation.html)_
24
25 Now, the adapter can be loaded and activated like this:
26
27 ```python
28 from transformers import AutoModelWithHeads
29
30 model = AutoModelWithHeads.from_pretrained("bert-base-uncased")
31 adapter_name = model.load_adapter("AdapterHub/bert-base-uncased-pf-copa", source="hf")
32 model.active_adapters = adapter_name
33 ```
34
35 ## Architecture & Training
36
37 The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer.
38 In particular, training configurations for all tasks can be found [here](https://github.com/adapter-hub/efficient-task-transfer/tree/master/run_configs).
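The `-pf-` in the adapter name follows AdapterHub's naming for Pfeiffer-style bottleneck adapters. As a rough orientation only (the exact configuration and hyperparameters are in the run configs linked above), setting up such a task adapter for training with `adapter-transformers` looks roughly like this; the adapter name `copa` here is illustrative:

```python
from transformers import AutoModelWithHeads

model = AutoModelWithHeads.from_pretrained("bert-base-uncased")

# Add a fresh task adapter and a matching multiple-choice head (COPA has two choices),
# then freeze the pre-trained weights so only the adapter and head are updated.
model.add_adapter("copa", config="pfeiffer")
model.add_multiple_choice_head("copa", num_choices=2)
model.train_adapter("copa")
```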
39
40
41 ## Evaluation results
42
43 Refer to [the paper](https://arxiv.org/pdf/2104.08247) for more information on results.
44
45 ## Citation
46
47 If you use this adapter, please cite our paper ["What to Pre-Train on? Efficient Intermediate Task Selection"](https://arxiv.org/pdf/2104.08247):
48
49 ```bibtex
@inproceedings{poth-etal-2021-what-to-pre-train-on,
    title = "What to Pre-Train on? Efficient Intermediate Task Selection",
    author = "Clifton Poth and Jonas Pfeiffer and Andreas Rücklé and Iryna Gurevych",
    booktitle = "Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP)",
    month = nov,
    year = "2021",
    address = "Online",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/2104.08247",
    pages = "to appear",
}
61 ```