---
tags:
- adapter-transformers
- fill-mask
- bert
- adapterhub:pt/wiki
language:
- pt
license: "apache-2.0"
---
# Adapter `bert-base-multilingual-cased_pt_wiki_pfeiffer` for bert-base-multilingual-cased
An adapter for the `bert-base-multilingual-cased` model that was trained on the [pt/wiki](https://adapterhub.ml/explore/pt/wiki/) dataset and includes a prediction head for masked language modeling.
**This adapter was created for usage with the [Adapters](https://github.com/Adapter-Hub/adapters) library.**
## Usage
First, install `adapters`:
```bash
pip install -U adapters
```
Now, the adapter can be loaded and activated like this:
```python
from adapters import AutoAdapterModel

# load the base model with support for flexible adapter heads
model = AutoAdapterModel.from_pretrained("bert-base-multilingual-cased")

# download the adapter from the Hub and activate it for all forward passes
adapter_name = model.load_adapter("AdapterHub/bert-base-multilingual-cased_pt_wiki_pfeiffer")
model.set_active_adapters(adapter_name)
```
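Since the adapter ships with a masked-LM head, it can be used to fill in masked tokens. The following is a minimal sketch that continues the snippet above; the example sentence and variable names are illustrative and not part of the original card:

```python
import torch
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")

# hypothetical Portuguese sentence with a [MASK] token to fill
text = "A capital de Portugal é [MASK]."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # token logits from the masked-LM head

# take the highest-scoring token at the [MASK] position
mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
print(tokenizer.decode(logits[0, mask_pos].argmax(dim=-1)))
```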
## Architecture & Training
- Adapter architecture: Pfeiffer (see the sketch after this list)
- Prediction head: masked language modeling
- Dataset: [pt/wiki](https://adapterhub.ml/explore/pt/wiki/)
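
For reference, the sketch below shows how an adapter with the Pfeiffer (sequential bottleneck) architecture and a masked-LM head could be set up with the `adapters` library. This is not the authors' training script; the adapter name `"pt_wiki"` is a placeholder:

```python
from adapters import AutoAdapterModel, SeqBnConfig

model = AutoAdapterModel.from_pretrained("bert-base-multilingual-cased")

# SeqBnConfig corresponds to the Pfeiffer (sequential bottleneck) architecture
model.add_adapter("pt_wiki", config=SeqBnConfig())
model.add_masked_lm_head("pt_wiki")

# freeze the base model weights and train only the adapter (and head)
model.train_adapter("pt_wiki")
```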
## Author Information
- Author name(s): Jonas Pfeiffer
- Author email: jonas@pfeiffer.ai
- Author links: [Website](https://pfeiffer.ai), [GitHub](https://github.com/JoPfeiff), [Twitter](https://twitter.com/@PfeiffJo)
## Citation
```bibtex
```
*This adapter has been auto-imported from https://github.com/Adapter-Hub/Hub/blob/master/adapters/ukp/bert-base-multilingual-cased_pt_wiki_pfeiffer.yaml*.