# Adapter `bert-base-multilingual-cased_wikiann_ner_ja_pfeiffer` for bert-base-multilingual-cased
A task adapter stacked on top of a language adapter, in MAD-X 2.0 style: the language adapters in the last transformer layer (layer 11) are deleted. This adapter was created for use with the Adapters library.
## Usage

First, install the `adapters` library:

```bash
pip install -U adapters
```

Now, the adapter can be loaded and activated like this:
```python
from adapters import AutoAdapterModel

model = AutoAdapterModel.from_pretrained("bert-base-multilingual-cased")
adapter_name = model.load_adapter("AdapterHub/bert-base-multilingual-cased_wikiann_ner_ja_pfeiffer")
model.set_active_adapters(adapter_name)
```
## Architecture & Training

- Adapter architecture: pfeiffer
- Prediction head: tagging
- Dataset: WikiANN (Japanese)
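The tagging head predicts one label per token, and WikiANN uses the standard 7-label BIO scheme (`O` plus `B-`/`I-` for persons, organizations, and locations). A minimal, library-free sketch of decoding such a label sequence back into entity spans (the helper name and the example tokens are illustrative, not part of this adapter):

```python
# WikiANN BIO label set: O plus B-/I- tags for PER, ORG, LOC.
WIKIANN_LABELS = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG", "B-LOC", "I-LOC"]

def bio_to_spans(tokens, labels):
    """Collect (entity_type, text) spans from per-token BIO labels."""
    spans, current, ctype = [], [], None
    for tok, lab in zip(tokens, labels):
        if lab.startswith("B-"):
            # A B- tag always starts a new entity, closing any open one.
            if current:
                spans.append((ctype, "".join(current)))
            current, ctype = [tok], lab[2:]
        elif lab.startswith("I-") and current and lab[2:] == ctype:
            # Continue the open entity only if the type matches.
            current.append(tok)
        else:
            # O tag (or an inconsistent I- tag) closes the open entity.
            if current:
                spans.append((ctype, "".join(current)))
            current, ctype = [], None
    if current:
        spans.append((ctype, "".join(current)))
    return spans

# Japanese is written without spaces, so subword spans are joined directly.
tokens = ["東京", "都", "の", "山田", "太郎"]
labels = ["B-LOC", "I-LOC", "O", "B-PER", "I-PER"]
print(bio_to_spans(tokens, labels))  # [('LOC', '東京都'), ('PER', '山田太郎')]
```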
## Author Information
- Author name(s): Jonas Pfeiffer
- Author email: Jonas@Pfeiffer.ai
- Author links: Website, GitHub, Twitter
## Citation

```bibtex
@article{Pfeiffer21UNKs,
  author  = {Jonas Pfeiffer and
             Ivan Vuli\'{c} and
             Iryna Gurevych and
             Sebastian Ruder},
  title   = {{UNKs Everywhere: Adapting Multilingual Language Models to New Scripts}},
  journal = {arXiv preprint},
  year    = {2021},
  url     = {https://arxiv.org/abs/2012.15562}
}
```
This adapter has been auto-imported from https://github.com/Adapter-Hub/Hub/blob/master/adapters/ukp/bert-base-multilingual-cased_wikiann_ner_ja_pfeiffer.yaml.