Adapters for the paper "M2QA: Multi-domain Multilingual Question Answering".
We evaluate two setups: MAD-X+Domain and MAD-X².
AdapterHub
AI & ML interests: Parameter-Efficient Fine-Tuning
Adapters
A Unified Library for Parameter-Efficient and Modular Transfer Learning
💻 Website • 📚 Documentation • 📜 Paper • 🧪 Notebook Tutorials
Adapters is an add-on library to HuggingFace's Transformers, integrating various adapter methods into state-of-the-art pre-trained language models with minimal coding overhead for training and inference.
pip install adapters
🤗 Hub integration: https://docs.adapterhub.ml/huggingface_hub.html
Collections (6)
MAD-X language adapters from the paper "MAD-X: An Adapter-Based Framework for Multi-Task Cross-Lingual Transfer" for BERT and XLM-RoBERTa.
- MAD-X: An Adapter-Based Framework for Multi-Task Cross-Lingual Transfer (Paper, arXiv:2005.00052)
- AdapterHub/xlm-roberta-base-de-wiki_pfeiffer
- AdapterHub/bert-base-multilingual-cased-mhr-wiki_houlsby
- AdapterHub/xlm-roberta-large-sw-wiki_pfeiffer
Models (505)
- AdapterHub/m2qa-xlm-roberta-base-mad-x-domain-wiki
- AdapterHub/m2qa-xlm-roberta-base-mad-x-domain-news
- AdapterHub/m2qa-xlm-roberta-base-mad-x-2-english
- AdapterHub/m2qa-xlm-roberta-base-mad-x-2-chinese
- AdapterHub/m2qa-xlm-roberta-base-mad-x-2-wiki
- AdapterHub/m2qa-xlm-roberta-base-mad-x-2-news
- AdapterHub/m2qa-xlm-roberta-base-mad-x-domain-product-reviews
- AdapterHub/m2qa-xlm-roberta-base-mad-x-2-product-reviews
- AdapterHub/m2qa-xlm-roberta-base-mad-x-2-creative-writing
- AdapterHub/m2qa-xlm-roberta-base-mad-x-2-turkish
Datasets: none public yet