
Adapter roberta-base-cola_pfeiffer for roberta-base

An adapter (with a classification head) for roberta-base, trained on the CoLA task using the run_glue.py script with an extension that retains the best checkpoint out of 30 training epochs.

This adapter was created for use with the Adapters library.

Usage

First, install adapters:

pip install -U adapters

Now, the adapter can be loaded and activated like this:

from adapters import AutoAdapterModel

model = AutoAdapterModel.from_pretrained("roberta-base")
adapter_name = model.load_adapter("AdapterHub/roberta-base-cola_pfeiffer")
model.set_active_adapters(adapter_name)

Architecture & Training

  • Adapter architecture: pfeiffer
  • Prediction head: classification
  • Dataset: CoLA

Author Information

Citation

@article{pfeiffer2020AdapterHub,
    title={AdapterHub: A Framework for Adapting Transformers},
    author={Pfeiffer, Jonas and
            R{\"u}ckl{\'e}, Andreas and
            Poth, Clifton and
            Kamath, Aishwarya and
            Vuli{\'c}, Ivan and
            Ruder, Sebastian and
            Cho, Kyunghyun and
            Gurevych, Iryna},
    journal={arXiv},
    year={2020}
}

This adapter has been auto-imported from https://github.com/Adapter-Hub/Hub/blob/master/adapters/ukp/roberta-base-cola_pfeiffer.yaml.
