Adapter BigTMiami/micro_helpfulness_tapt_pretrain_seq_bn_adapter for roberta-base

An adapter for the roberta-base model, trained on the BigTMiami/amazon_MICRO_helpfulness_dataset_condensed dataset, that includes a prediction head for masked language modeling.

This adapter was created for use with the Adapters library.

Usage

First, install adapters:

pip install -U adapters

Now, the adapter can be loaded and activated like this:

from adapters import AutoAdapterModel

# Load the base model, then attach and activate the adapter from the Hub
model = AutoAdapterModel.from_pretrained("roberta-base")
adapter_name = model.load_adapter("BigTMiami/micro_helpfulness_tapt_pretrain_seq_bn_adapter", source="hf", set_active=True)
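
Because the adapter ships with a masked-LM prediction head, fill-mask-style inference should work once the adapter is active. The snippet below is a minimal sketch, not part of this card: the example sentence is made up, and it assumes the head returns a standard output object with logits.

from transformers import AutoTokenizer
import torch

tokenizer = AutoTokenizer.from_pretrained("roberta-base")

# Mask one token and let the adapter's MLM head predict it
inputs = tokenizer("This review was <mask> helpful.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Take the highest-scoring token at the masked position
mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = outputs.logits[0, mask_pos].argmax(dim=-1)
print(tokenizer.decode(predicted_id))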

Architecture & Training

Evaluation results

Citation

