---
tags:
  - text-classification
  - adapter-transformers
  - adapterhub:lingaccept/cola
  - roberta
license: apache-2.0
---

Adapter roberta-base-cola_houlsby for roberta-base

This adapter (with a prediction head) was trained using the run_glue.py script, extended to retain the best checkpoint across 30 training epochs.

This adapter was created for use with the Adapters library.

Usage

First, install adapters:

pip install -U adapters

Now, the adapter can be loaded and activated like this:

from adapters import AutoAdapterModel

model = AutoAdapterModel.from_pretrained("roberta-base")
adapter_name = model.load_adapter("AdapterHub/roberta-base-cola_houlsby")
model.set_active_adapters(adapter_name)
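
With the adapter active, the model behaves like a standard sequence-classification model. As a minimal inference sketch (the example sentence is arbitrary; CoLA is a binary acceptability task, so the head predicts one of two classes):

import torch
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
inputs = tokenizer("The boy quickly ran home.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)
# The loaded classification head returns one logit per class;
# the argmax gives the predicted acceptability label.
predicted_class = outputs.logits.argmax(dim=-1).item()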

Architecture & Training

  • Adapter architecture: houlsby (see the configuration sketch below)
  • Prediction head: classification
  • Dataset: CoLA
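
The Houlsby architecture (Houlsby et al., 2019) inserts bottleneck adapter modules after both the multi-head attention and the feed-forward block of every transformer layer. A fresh adapter of the same architecture could be set up roughly as follows; this is a sketch assuming the current Adapters API, where DoubleSeqBnConfig corresponds to the Houlsby configuration:

from adapters import AutoAdapterModel, DoubleSeqBnConfig

model = AutoAdapterModel.from_pretrained("roberta-base")
# Houlsby-style bottleneck adapters after both attention and feed-forward
model.add_adapter("cola", config=DoubleSeqBnConfig())
# CoLA is binary acceptability classification
model.add_classification_head("cola", num_labels=2)
# Freeze the base model; only adapter and head weights are trained
model.train_adapter("cola")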

Author Information

Citation

@article{pfeiffer2020AdapterHub,
    title={AdapterHub: A Framework for Adapting Transformers},
    author={Jonas Pfeiffer and
            Andreas R\"uckl\'{e} and
            Clifton Poth and
            Aishwarya Kamath and
            Ivan Vuli\'{c} and
            Sebastian Ruder and
            Kyunghyun Cho and
            Iryna Gurevych},
    journal={ArXiv},
    year={2020}
}

This adapter has been auto-imported from https://github.com/Adapter-Hub/Hub/blob/master/adapters/ukp/roberta-base-cola_houlsby.yaml.