lenglaender committed
Commit c255ff2
1 Parent(s): b8c2ca8

Upload model

Files changed (1)
README.md +29 -14
README.md CHANGED
@@ -6,43 +6,58 @@ datasets:
  - UKPLab/m2qa
  ---

- # Adapter `AdapterHub/m2qa-xlm-roberta-base-mad-x-domain-creative-writing` for xlm-roberta-base

- An [adapter](https://adapterhub.ml) for the `xlm-roberta-base` model that was trained on the [UKPLab/m2qa](https://huggingface.co/datasets/UKPLab/m2qa/) dataset.

- This adapter was created for usage with the **[adapter-transformers](https://github.com/Adapter-Hub/adapter-transformers)** library.

  ## Usage

- First, install `adapter-transformers`:

  ```
- pip install -U adapter-transformers
  ```
- _Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. [More](https://docs.adapterhub.ml/installation.html)_

  Now, the adapter can be loaded and activated like this:

  ```python
- from transformers import AutoAdapterModel

  model = AutoAdapterModel.from_pretrained("xlm-roberta-base")
- adapter_name = model.load_adapter("AdapterHub/m2qa-xlm-roberta-base-mad-x-domain-creative-writing", source="hf", set_active=True)
- ```

- ## Architecture & Training

- See our repository for more information: See https://github.com/UKPLab/m2qa/tree/main/Experiments/mad-x-domain

- ## Evaluation results

- <!-- Add some description here -->

  ## Citation

-
  ```
  @article{englaender-etal-2024-m2qa,
  title="M2QA: Multi-domain Multilingual Question Answering",
 
  - UKPLab/m2qa
  ---
+ # M2QA Adapter: Domain Adapter for MAD-X+Domain Setup
+ This adapter is part of the M2QA publication; it is used to achieve language and domain transfer via adapters.
+ 📃 Paper: [https://arxiv.org/abs/2407.01091](https://arxiv.org/abs/2407.01091)
+ 🏗️ GitHub repo: [https://github.com/UKPLab/m2qa](https://github.com/UKPLab/m2qa)
+ 💾 Hugging Face Dataset: [https://huggingface.co/datasets/UKPLab/m2qa](https://huggingface.co/datasets/UKPLab/m2qa)
+ **Important:** This adapter only works together with the MAD-X language adapters and the M2QA QA head adapter.
+
+ This is an [adapter](https://adapterhub.ml) for the `xlm-roberta-base` model, trained with the **[Adapters](https://github.com/Adapter-Hub/adapters)** library. For training details, see our paper or our GitHub repository: [https://github.com/UKPLab/m2qa](https://github.com/UKPLab/m2qa). Evaluation results for this adapter on the M2QA dataset are reported in the GitHub repo and in the paper.
  ## Usage
+ First, install `adapters`:
  ```
+ pip install -U adapters
  ```
  Now, the adapter can be loaded and activated like this:
  ```python
+ from adapters import AutoAdapterModel
+ from adapters.composition import Stack

  model = AutoAdapterModel.from_pretrained("xlm-roberta-base")

+ # 1. Load language adapter
+ language_adapter_name = model.load_adapter("de/wiki@ukp") # MAD-X+Domain uses the MAD-X language adapter
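+ # Note (assumption, not from the original card): "de/wiki@ukp" resolves to the German
+ # Wikipedia-trained MAD-X language adapter from AdapterHub; the other M2QA languages
+ # would presumably use the matching identifiers, e.g. "zh/wiki@ukp" or "tr/wiki@ukp"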
+ # 2. Load domain adapter
+ domain_adapter_name = model.load_adapter("AdapterHub/m2qa-xlm-roberta-base-mad-x-domain-creative-writing")
+
+ # 3. Load QA head adapter
+ qa_adapter_name = model.load_adapter("AdapterHub/m2qa-xlm-roberta-base-mad-x-domain-qa-head")
+
+ # 4. Activate them via the adapter stack
+ model.active_adapters = Stack(language_adapter_name, domain_adapter_name, qa_adapter_name)
+ ```
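+
+ To illustrate what the activated stack is for, here is a minimal extractive-QA inference sketch. It is an editorial addition, not part of the original card: it assumes the QA head adapter exposes a question-answering prediction head that returns `start_logits`/`end_logits`, and the question/context strings are made up.
+
+ ```python
+ import torch
+ from transformers import AutoTokenizer
+
+ tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
+
+ # Hypothetical German example (made-up data, matching the loaded "de" language adapter)
+ question = "Wer klopfte an die Tür?"
+ context = "Es war eine stürmische Nacht. Plötzlich klopfte der alte Leuchtturmwärter an die Tür."
+
+ inputs = tokenizer(question, context, return_tensors="pt")
+ with torch.no_grad():
+     outputs = model(**inputs)
+
+ # Greedy answer span from the QA head's start/end logits
+ start = outputs.start_logits.argmax().item()
+ end = outputs.end_logits.argmax().item()
+ answer = tokenizer.decode(inputs["input_ids"][0, start : end + 1])
+ print(answer)
+ ```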
+ See our repository for more information: https://github.com/UKPLab/m2qa/tree/main/Experiments/mad-x-domain
+ ## Contact
+ Leon Engländer:
+ - [HuggingFace Profile](https://huggingface.co/lenglaender)
+ - [GitHub](https://github.com/lenglaender)
+ - [Twitter](https://x.com/LeonEnglaender)

  ## Citation

  ```
  @article{englaender-etal-2024-m2qa,
  title="M2QA: Multi-domain Multilingual Question Answering",