---
tags:
- adapter-transformers
- t5
datasets:
- amazon_polarity
---
# Adapter `lenglaender/xlm-roberta-base-lora-cls-amazon-polarity` for google-t5/t5-base
An [adapter](https://adapterhub.ml) for the `google-t5/t5-base` model that was trained on the [amazon_polarity](https://huggingface.co/datasets/amazon_polarity/) dataset and includes a prediction head for classification.
This adapter was created for usage with the **[Adapters](https://github.com/Adapter-Hub/adapters)** library.
## Usage
First, install `adapters`:
```bash
pip install -U adapters
```
Now, the adapter can be loaded and activated like this:
```python
from adapters import AutoAdapterModel

# Load the base model with adapter support
model = AutoAdapterModel.from_pretrained("google-t5/t5-base")

# Download the adapter from the Hugging Face Hub and activate it
adapter_name = model.load_adapter("AdapterHub/xlm-roberta-base-lora-cls-amazon-polarity", source="hf", set_active=True)
```
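With the adapter active, inputs can be classified directly. The following is a minimal inference sketch; the tokenizer choice and the label order (0 = negative, 1 = positive, following the amazon_polarity convention) are assumptions rather than values read from the adapter config:
```python
import torch
from transformers import AutoTokenizer
from adapters import AutoAdapterModel

tokenizer = AutoTokenizer.from_pretrained("google-t5/t5-base")
model = AutoAdapterModel.from_pretrained("google-t5/t5-base")
model.load_adapter("AdapterHub/xlm-roberta-base-lora-cls-amazon-polarity", source="hf", set_active=True)
model.eval()

# Tokenize a sample review and run it through the adapted model
inputs = tokenizer("This product exceeded my expectations!", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Assumed label order: 0 = negative, 1 = positive (amazon_polarity convention)
prediction = logits.argmax(dim=-1).item()
print("positive" if prediction == 1 else "negative")
```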
## Architecture & Training
The adapter uses LoRA with rank r=8 and alpha=8, trained with a dropout rate of 0.1.
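For reference, an adapter with these hyperparameters can be set up via the library's `LoRAConfig`. The snippet below is a sketch of how such a configuration could be created and trained, not the exact training script used for this adapter; the adapter name and head setup are illustrative:
```python
from adapters import AutoAdapterModel, LoRAConfig

model = AutoAdapterModel.from_pretrained("google-t5/t5-base")

# LoRA hyperparameters reported above: rank 8, alpha 8, dropout 0.1
config = LoRAConfig(r=8, alpha=8, dropout=0.1)

# Add a fresh LoRA adapter and a 2-class head, then train only the adapter weights
model.add_adapter("amazon_polarity_lora", config=config)  # hypothetical adapter name
model.add_classification_head("amazon_polarity_lora", num_labels=2)
model.train_adapter("amazon_polarity_lora")
```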
## Evaluation results
Accuracy on the amazon_polarity dataset: 95.91%
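A rough sketch of how a comparable accuracy check could be run; the split, subsample size, and preprocessing below are assumptions, not the original evaluation setup:
```python
import torch
from datasets import load_dataset
from transformers import AutoTokenizer
from adapters import AutoAdapterModel

tokenizer = AutoTokenizer.from_pretrained("google-t5/t5-base")
model = AutoAdapterModel.from_pretrained("google-t5/t5-base")
model.load_adapter("AdapterHub/xlm-roberta-base-lora-cls-amazon-polarity", source="hf", set_active=True)
model.eval()

# Evaluate on a small subsample of the test split for a quick sanity check
dataset = load_dataset("amazon_polarity", split="test[:1000]")
correct = 0
for example in dataset:
    inputs = tokenizer(example["content"], truncation=True, max_length=256, return_tensors="pt")
    with torch.no_grad():
        prediction = model(**inputs).logits.argmax(dim=-1).item()
    correct += int(prediction == example["label"])

print(f"accuracy: {correct / len(dataset):.4f}")
```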
## Author Information
- Author name: Leon Engländer
- Author links: [GitHub](https://github.com/lenglaender), [Twitter](https://x.com/LeonEnglaender)