---
tags:
- bert
- adapterhub:Arabic ABSA/SemEvalHotelReview
- adapter-transformers
datasets:
- Hotel
---
# Adapter `salohnana2018/ABSA-SentencePair-domainAdapt-SemEval-Adapter-pfeiffer_madx-run2` for CAMeL-Lab/bert-base-arabic-camelbert-msa
An [adapter](https://adapterhub.ml) for the `CAMeL-Lab/bert-base-arabic-camelbert-msa` model, trained on the [Arabic ABSA/SemEvalHotelReview](https://adapterhub.ml/explore/Arabic%20ABSA/SemEvalHotelReview/) dataset. It includes a prediction head for classification.
This adapter was created for usage with the **[adapter-transformers](https://github.com/Adapter-Hub/adapter-transformers)** library.
## Usage
First, install `adapter-transformers`:
```bash
pip install -U adapter-transformers
```
_Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. [More](https://docs.adapterhub.ml/installation.html)_
Now, the adapter can be loaded and activated like this:
```python
from transformers import AutoAdapterModel

# Load the base Arabic BERT model with adapter support
model = AutoAdapterModel.from_pretrained("CAMeL-Lab/bert-base-arabic-camelbert-msa")

# Download the adapter from the Hugging Face Hub and activate it
adapter_name = model.load_adapter("salohnana2018/ABSA-SentencePair-domainAdapt-SemEval-Adapter-pfeiffer_madx-run2", source="hf", set_active=True)
```
## Architecture & Training
<!-- Add some description here -->
## Evaluation results
<!-- Add some description here -->
## Citation
<!-- Add some description here -->