Columns: Model Name (string, 5–122 characters), URL (string, 28–145 characters), Crawled Text (string, 1–199k characters).
AdapterHub/bert-base-uncased-pf-stsb
https://huggingface.co/AdapterHub/bert-base-uncased-pf-stsb
An adapter for the bert-base-uncased model that was trained on the sts/sts-b dataset and includes a prediction head for classification. This adapter was created for usage with the adapter-transformers library. Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. First install adapter-transformers; the adapter can then be loaded and activated as shown in the sketch below. The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer; in particular, it contains the training configurations for all tasks. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection".
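The install and loading steps referenced above were code blocks on the original model card that did not survive the crawl. As a minimal sketch, assuming the adapter-transformers AutoAdapterModel API and a hypothetical sentence pair for inference, the steps look roughly like this:

```python
# Install the library first (adapter-transformers is a drop-in fork of transformers):
#   pip install -U adapter-transformers

from transformers import AutoAdapterModel, AutoTokenizer

# Load the base model and attach the STS-B adapter together with its prediction head.
model = AutoAdapterModel.from_pretrained("bert-base-uncased")
adapter_name = model.load_adapter("AdapterHub/bert-base-uncased-pf-stsb", source="hf")
model.active_adapters = adapter_name  # activate the adapter for the forward pass

# Hypothetical inference on a sentence pair (the STS-B head scores semantic similarity).
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
inputs = tokenizer("A man is playing a guitar.",
                   "A person is playing an instrument.",
                   return_tensors="pt")
outputs = model(**inputs)
print(outputs.logits)
```

The same three loading lines apply to the other bert-base-uncased adapters in this list; only the adapter name and, for multiple-choice or tagging adapters, the shape of the inputs and outputs change.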
AdapterHub/bert-base-uncased-pf-swag
https://huggingface.co/AdapterHub/bert-base-uncased-pf-swag
An adapter for the bert-base-uncased model that was trained on the swag dataset and includes a prediction head for multiple choice. This adapter was created for usage with the adapter-transformers library. Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. First install adapter-transformers; the adapter can then be loaded and activated. The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer; in particular, it contains the training configurations for all tasks. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection".
AdapterHub/bert-base-uncased-pf-trec
https://huggingface.co/AdapterHub/bert-base-uncased-pf-trec
An adapter for the bert-base-uncased model that was trained on the trec dataset and includes a prediction head for classification. This adapter was created for usage with the adapter-transformers library. Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. First install adapter-transformers; the adapter can then be loaded and activated. The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer; in particular, it contains the training configurations for all tasks. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection".
AdapterHub/bert-base-uncased-pf-ud_deprel
https://huggingface.co/AdapterHub/bert-base-uncased-pf-ud_deprel
An adapter for the bert-base-uncased model that was trained on the deprel/ud_ewt dataset and includes a prediction head for tagging. This adapter was created for usage with the adapter-transformers library. Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. First install adapter-transformers; the adapter can then be loaded and activated. The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer; in particular, it contains the training configurations for all tasks. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection".
AdapterHub/bert-base-uncased-pf-ud_en_ewt
https://huggingface.co/AdapterHub/bert-base-uncased-pf-ud_en_ewt
An adapter for the bert-base-uncased model that was trained on the dp/ud_ewt dataset and includes a prediction head for dependency parsing. This adapter was created for usage with the adapter-transformers library. Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. First install adapter-transformers; the adapter can then be loaded and activated. This adapter was trained using adapter-transformers' example script for dependency parsing; see https://github.com/Adapter-Hub/adapter-transformers/tree/master/examples/dependency-parsing. Scores achieved by dependency parsing adapters on the test set of UD English EWT after training are reported on the model page.
AdapterHub/bert-base-uncased-pf-ud_pos
https://huggingface.co/AdapterHub/bert-base-uncased-pf-ud_pos
An adapter for the bert-base-uncased model that was trained on the pos/ud_ewt dataset and includes a prediction head for tagging. This adapter was created for usage with the adapter-transformers library. Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. First install adapter-transformers; the adapter can then be loaded and activated. The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer; in particular, it contains the training configurations for all tasks. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection".
AdapterHub/bert-base-uncased-pf-wic
https://huggingface.co/AdapterHub/bert-base-uncased-pf-wic
An adapter for the bert-base-uncased model that was trained on the wordsence/wic dataset and includes a prediction head for classification. This adapter was created for usage with the adapter-transformers library. Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. First install adapter-transformers; the adapter can then be loaded and activated. The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer; in particular, it contains the training configurations for all tasks. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection".
AdapterHub/bert-base-uncased-pf-wikihop
https://huggingface.co/AdapterHub/bert-base-uncased-pf-wikihop
An adapter for the bert-base-uncased model that was trained on the qa/wikihop dataset and includes a prediction head for question answering. This adapter was created for usage with the adapter-transformers library. Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. First install adapter-transformers; the adapter can then be loaded and activated. The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer; in particular, it contains the training configurations for all tasks. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection".
AdapterHub/bert-base-uncased-pf-winogrande
https://huggingface.co/AdapterHub/bert-base-uncased-pf-winogrande
An adapter for the bert-base-uncased model that was trained on the comsense/winogrande dataset and includes a prediction head for multiple choice. This adapter was created for usage with the adapter-transformers library. Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. First install adapter-transformers; the adapter can then be loaded and activated. The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer; in particular, it contains the training configurations for all tasks. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection".
AdapterHub/bert-base-uncased-pf-wnut_17
https://huggingface.co/AdapterHub/bert-base-uncased-pf-wnut_17
An adapter for the bert-base-uncased model that was trained on the wnut_17 dataset and includes a prediction head for tagging. This adapter was created for usage with the adapter-transformers library. Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. First install adapter-transformers; the adapter can then be loaded and activated. The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer; in particular, it contains the training configurations for all tasks. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection".
AdapterHub/bert-base-uncased-pf-yelp_polarity
https://huggingface.co/AdapterHub/bert-base-uncased-pf-yelp_polarity
An adapter for the bert-base-uncased model that was trained on the yelp_polarity dataset and includes a prediction head for classification. This adapter was created for usage with the adapter-transformers library. Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. First install adapter-transformers; the adapter can then be loaded and activated. The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer; in particular, it contains the training configurations for all tasks. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection".
AdapterHub/bioASQyesno
https://huggingface.co/AdapterHub/bioASQyesno
An adapter for the facebook/bart-base model that was trained on the qa/bioasq dataset. This adapter was created for usage with the adapter-transformers library. Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. First install adapter-transformers; the adapter can then be loaded and activated as shown in the sketch below. The adapter was trained for 15 epochs with early stopping, a learning rate of 1e-4, and a batch size of 4 on the yes/no questions of the bioASQ 8b dataset, and it achieved 75% accuracy on the bioASQ 8b test set.
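For the BART-based adapters the loading pattern is the same; only the base checkpoint and the adapter name change. A minimal sketch, assuming the same AutoAdapterModel API as above:

```python
from transformers import AutoAdapterModel

# facebook/bart-base is the checkpoint this adapter was trained on.
model = AutoAdapterModel.from_pretrained("facebook/bart-base")

# Download the bioASQ yes/no adapter from the Hugging Face Hub and activate it.
adapter_name = model.load_adapter("AdapterHub/bioASQyesno", source="hf")
model.active_adapters = adapter_name
```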
AdapterHub/narrativeqa
https://huggingface.co/AdapterHub/narrativeqa
An adapter for the facebook/bart-base model that was trained on the qa/narrativeqa dataset. This adapter was created for usage with the adapter-transformers library. Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. First install adapter-transformers; the adapter can then be loaded and activated.
AdapterHub/roberta-base-pf-anli_r3
https://huggingface.co/AdapterHub/roberta-base-pf-anli_r3
An adapter for the roberta-base model that was trained on the anli dataset and includes a prediction head for classification. This adapter was created for usage with the adapter-transformers library. Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. First install adapter-transformers; the adapter can then be loaded and activated. The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer; in particular, it contains the training configurations for all tasks. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection".
AdapterHub/roberta-base-pf-art
https://huggingface.co/AdapterHub/roberta-base-pf-art
An adapter for the roberta-base model that was trained on the art dataset and includes a prediction head for multiple choice. This adapter was created for usage with the adapter-transformers library. Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. First install adapter-transformers; the adapter can then be loaded and activated. The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer; in particular, it contains the training configurations for all tasks. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection".
AdapterHub/roberta-base-pf-boolq
https://huggingface.co/AdapterHub/roberta-base-pf-boolq
An adapter for the roberta-base model that was trained on the qa/boolq dataset and includes a prediction head for classification. This adapter was created for usage with the adapter-transformers library. Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. First install adapter-transformers; the adapter can then be loaded and activated. The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer; in particular, it contains the training configurations for all tasks. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection".
AdapterHub/roberta-base-pf-cola
https://huggingface.co/AdapterHub/roberta-base-pf-cola
An adapter for the roberta-base model that was trained on the lingaccept/cola dataset and includes a prediction head for classification. This adapter was created for usage with the adapter-transformers library. Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. First install adapter-transformers; the adapter can then be loaded and activated. The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer; in particular, it contains the training configurations for all tasks. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection".
AdapterHub/roberta-base-pf-commonsense_qa
https://huggingface.co/AdapterHub/roberta-base-pf-commonsense_qa
An adapter for the roberta-base model that was trained on the comsense/csqa dataset and includes a prediction head for multiple choice. This adapter was created for usage with the adapter-transformers library. Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. First install adapter-transformers; the adapter can then be loaded and activated. The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer; in particular, it contains the training configurations for all tasks. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection".
AdapterHub/roberta-base-pf-comqa
https://huggingface.co/AdapterHub/roberta-base-pf-comqa
An adapter for the roberta-base model that was trained on the com_qa dataset and includes a prediction head for question answering. This adapter was created for usage with the adapter-transformers library. Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. First install adapter-transformers; the adapter can then be loaded and activated. The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer; in particular, it contains the training configurations for all tasks. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection".
AdapterHub/roberta-base-pf-conll2000
https://huggingface.co/AdapterHub/roberta-base-pf-conll2000
An adapter for the roberta-base model that was trained on the chunk/conll2000 dataset and includes a prediction head for tagging. This adapter was created for usage with the adapter-transformers library. Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. First install adapter-transformers; the adapter can then be loaded and activated. The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer; in particular, it contains the training configurations for all tasks. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection".
AdapterHub/roberta-base-pf-conll2003
https://huggingface.co/AdapterHub/roberta-base-pf-conll2003
An adapter for the roberta-base model that was trained on the ner/conll2003 dataset and includes a prediction head for tagging. This adapter was created for usage with the adapter-transformers library. Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. First install adapter-transformers; the adapter can then be loaded and activated. The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer; in particular, it contains the training configurations for all tasks. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection".
AdapterHub/roberta-base-pf-conll2003_pos
https://huggingface.co/AdapterHub/roberta-base-pf-conll2003_pos
An adapter for the roberta-base model that was trained on the pos/conll2003 dataset and includes a prediction head for tagging. This adapter was created for usage with the adapter-transformers library. Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. First install adapter-transformers; the adapter can then be loaded and activated. The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer; in particular, it contains the training configurations for all tasks. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection".
AdapterHub/roberta-base-pf-copa
https://huggingface.co/AdapterHub/roberta-base-pf-copa
An adapter for the roberta-base model that was trained on the comsense/copa dataset and includes a prediction head for multiple choice. This adapter was created for usage with the adapter-transformers library. Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. First install adapter-transformers; the adapter can then be loaded and activated. The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer; in particular, it contains the training configurations for all tasks. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection".
AdapterHub/roberta-base-pf-cosmos_qa
https://huggingface.co/AdapterHub/roberta-base-pf-cosmos_qa
An adapter for the roberta-base model that was trained on the comsense/cosmosqa dataset and includes a prediction head for multiple choice. This adapter was created for usage with the adapter-transformers library. Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. First install adapter-transformers; the adapter can then be loaded and activated. The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer; in particular, it contains the training configurations for all tasks. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection".
AdapterHub/roberta-base-pf-cq
https://huggingface.co/AdapterHub/roberta-base-pf-cq
An adapter for the roberta-base model that was trained on the qa/cq dataset and includes a prediction head for question answering. This adapter was created for usage with the adapter-transformers library. Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. First install adapter-transformers; the adapter can then be loaded and activated. The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer; in particular, it contains the training configurations for all tasks. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection".
AdapterHub/roberta-base-pf-drop
https://huggingface.co/AdapterHub/roberta-base-pf-drop
An adapter for the roberta-base model that was trained on the drop dataset and includes a prediction head for question answering. This adapter was created for usage with the adapter-transformers library. Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. First install adapter-transformers; the adapter can then be loaded and activated. The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer; in particular, it contains the training configurations for all tasks. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection".
AdapterHub/roberta-base-pf-duorc_p
https://huggingface.co/AdapterHub/roberta-base-pf-duorc_p
An adapter for the roberta-base model that was trained on the duorc dataset and includes a prediction head for question answering. This adapter was created for usage with the adapter-transformers library. Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. First install adapter-transformers; the adapter can then be loaded and activated. The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer; in particular, it contains the training configurations for all tasks. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection".
AdapterHub/roberta-base-pf-duorc_s
https://huggingface.co/AdapterHub/roberta-base-pf-duorc_s
An adapter for the roberta-base model that was trained on the duorc dataset and includes a prediction head for question answering. This adapter was created for usage with the adapter-transformers library. Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. First install adapter-transformers; the adapter can then be loaded and activated. The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer; in particular, it contains the training configurations for all tasks. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection".
AdapterHub/roberta-base-pf-emo
https://huggingface.co/AdapterHub/roberta-base-pf-emo
An adapter for the roberta-base model that was trained on the emo dataset and includes a prediction head for classification. This adapter was created for usage with the adapter-transformers library. Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. First install adapter-transformers; the adapter can then be loaded and activated. The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer; in particular, it contains the training configurations for all tasks. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection".
AdapterHub/roberta-base-pf-emotion
https://huggingface.co/AdapterHub/roberta-base-pf-emotion
An adapter for the roberta-base model that was trained on the emotion dataset and includes a prediction head for classification. This adapter was created for usage with the adapter-transformers library. Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. First install adapter-transformers; the adapter can then be loaded and activated. The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer; in particular, it contains the training configurations for all tasks. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection".
AdapterHub/roberta-base-pf-fce_error_detection
https://huggingface.co/AdapterHub/roberta-base-pf-fce_error_detection
An adapter for the roberta-base model that was trained on the ged/fce dataset and includes a prediction head for tagging. This adapter was created for usage with the adapter-transformers library. Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. First install adapter-transformers; the adapter can then be loaded and activated. The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer; in particular, it contains the training configurations for all tasks. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection".
AdapterHub/roberta-base-pf-hellaswag
https://huggingface.co/AdapterHub/roberta-base-pf-hellaswag
An adapter for the roberta-base model that was trained on the comsense/hellaswag dataset and includes a prediction head for multiple choice. This adapter was created for usage with the adapter-transformers library. Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. First install adapter-transformers; the adapter can then be loaded and activated. The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer; in particular, it contains the training configurations for all tasks. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection".
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : AdapterHub/roberta-base-pf-hellaswag ### Model URL : https://huggingface.co/AdapterHub/roberta-base-pf-hellaswag ### Model Description : An adapter for the roberta-base model that was trained on the comsense/hellaswag dataset and includes a prediction head for multiple choice. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
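Multiple-choice adapters like this one score a context against several candidate endings. The sketch below pairs the context with each ending and lets the head regroup the batch into per-choice scores; the exact input layout the head expects can vary between library versions, so treat this as illustrative only:

```python
import torch
from transformers import AutoModelWithHeads, RobertaTokenizer

tokenizer = RobertaTokenizer.from_pretrained("roberta-base")
model = AutoModelWithHeads.from_pretrained("roberta-base")
adapter_name = model.load_adapter("AdapterHub/roberta-base-pf-hellaswag", source="hf")
model.active_adapters = adapter_name

context = "A man is sitting on a roof. He"
endings = [
    "starts pulling up roofing on a roof.",
    "is ripping level tiles off.",
    "is holding a rubik's cube.",
    "starts pulling up roofing materials.",
]

# Encode the context once per candidate ending (illustrative example data)
inputs = tokenizer([context] * len(endings), endings, return_tensors="pt", padding=True)
with torch.no_grad():
    logits = model(**inputs).logits               # expected shape: (1, num_endings)
print("predicted ending:", endings[logits.argmax(dim=-1).item()])
```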
AdapterHub/roberta-base-pf-hotpotqa
https://huggingface.co/AdapterHub/roberta-base-pf-hotpotqa
An adapter for the roberta-base model that was trained on the hotpot_qa dataset and includes a prediction head for question answering. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : AdapterHub/roberta-base-pf-hotpotqa ### Model URL : https://huggingface.co/AdapterHub/roberta-base-pf-hotpotqa ### Model Description : An adapter for the roberta-base model that was trained on the hotpot_qa dataset and includes a prediction head for question answering. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
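Question-answering adapters such as this one add an extractive span head. A hedged sketch (the example question and context are illustrative; the start_logits/end_logits output fields are assumed to follow the usual transformers QA convention):

```python
import torch
from transformers import AutoModelWithHeads, RobertaTokenizerFast

tokenizer = RobertaTokenizerFast.from_pretrained("roberta-base")
model = AutoModelWithHeads.from_pretrained("roberta-base")
adapter_name = model.load_adapter("AdapterHub/roberta-base-pf-hotpotqa", source="hf")
model.active_adapters = adapter_name

question = "Which magazine was started first, Arthur's Magazine or First for Women?"
context = ("Arthur's Magazine (1844-1846) was an American literary periodical. "
           "First for Women is a women's magazine launched in 1989.")

inputs = tokenizer(question, context, return_tensors="pt", truncation=True)
with torch.no_grad():
    outputs = model(**inputs)

# Pick the most likely answer span from the start/end logits
start = outputs.start_logits.argmax(dim=-1).item()
end = outputs.end_logits.argmax(dim=-1).item()
print(tokenizer.decode(inputs["input_ids"][0][start : end + 1]))
```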
AdapterHub/roberta-base-pf-imdb
https://huggingface.co/AdapterHub/roberta-base-pf-imdb
An adapter for the roberta-base model that was trained on the sentiment/imdb dataset and includes a prediction head for classification. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : AdapterHub/roberta-base-pf-imdb ### Model URL : https://huggingface.co/AdapterHub/roberta-base-pf-imdb ### Model Description : An adapter for the roberta-base model that was trained on the sentiment/imdb dataset and includes a prediction head for classification. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
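For sequence-classification adapters like this sentiment one, inference reduces to a softmax over the head's logits. Another hedged sketch under the same assumptions (the review text is made up; label names are stored with the head's own config and are not hard-coded here):

```python
import torch
from transformers import AutoModelWithHeads, RobertaTokenizer

tokenizer = RobertaTokenizer.from_pretrained("roberta-base")
model = AutoModelWithHeads.from_pretrained("roberta-base")
adapter_name = model.load_adapter("AdapterHub/roberta-base-pf-imdb", source="hf")
model.active_adapters = adapter_name

review = "A surprisingly tender film with terrific performances throughout."
inputs = tokenizer(review, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
probs = torch.softmax(logits, dim=-1).squeeze(0).tolist()
print(probs)  # class probabilities; the id-to-label mapping is saved with the prediction head
```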
AdapterHub/roberta-base-pf-mit_movie_trivia
https://huggingface.co/AdapterHub/roberta-base-pf-mit_movie_trivia
An adapter for the roberta-base model that was trained on the ner/mit_movie_trivia dataset and includes a prediction head for tagging. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : AdapterHub/roberta-base-pf-mit_movie_trivia ### Model URL : https://huggingface.co/AdapterHub/roberta-base-pf-mit_movie_trivia ### Model Description : An adapter for the roberta-base model that was trained on the ner/mit_movie_trivia dataset and includes a prediction head for tagging. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
AdapterHub/roberta-base-pf-mnli
https://huggingface.co/AdapterHub/roberta-base-pf-mnli
An adapter for the roberta-base model that was trained on the nli/multinli dataset and includes a prediction head for classification. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : AdapterHub/roberta-base-pf-mnli ### Model URL : https://huggingface.co/AdapterHub/roberta-base-pf-mnli ### Model Description : An adapter for the roberta-base model that was trained on the nli/multinli dataset and includes a prediction head for classification. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
AdapterHub/roberta-base-pf-mrpc
https://huggingface.co/AdapterHub/roberta-base-pf-mrpc
An adapter for the roberta-base model that was trained on the sts/mrpc dataset and includes a prediction head for classification. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : AdapterHub/roberta-base-pf-mrpc ### Model URL : https://huggingface.co/AdapterHub/roberta-base-pf-mrpc ### Model Description : An adapter for the roberta-base model that was trained on the sts/mrpc dataset and includes a prediction head for classification. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
AdapterHub/roberta-base-pf-multirc
https://huggingface.co/AdapterHub/roberta-base-pf-multirc
An adapter for the roberta-base model that was trained on the rc/multirc dataset and includes a prediction head for classification. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : AdapterHub/roberta-base-pf-multirc ### Model URL : https://huggingface.co/AdapterHub/roberta-base-pf-multirc ### Model Description : An adapter for the roberta-base model that was trained on the rc/multirc dataset and includes a prediction head for classification. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
AdapterHub/roberta-base-pf-newsqa
https://huggingface.co/AdapterHub/roberta-base-pf-newsqa
An adapter for the roberta-base model that was trained on the newsqa dataset and includes a prediction head for question answering. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : AdapterHub/roberta-base-pf-newsqa ### Model URL : https://huggingface.co/AdapterHub/roberta-base-pf-newsqa ### Model Description : An adapter for the roberta-base model that was trained on the newsqa dataset and includes a prediction head for question answering. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
AdapterHub/roberta-base-pf-pmb_sem_tagging
https://huggingface.co/AdapterHub/roberta-base-pf-pmb_sem_tagging
An adapter for the roberta-base model that was trained on the semtag/pmb dataset and includes a prediction head for tagging. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : AdapterHub/roberta-base-pf-pmb_sem_tagging ### Model URL : https://huggingface.co/AdapterHub/roberta-base-pf-pmb_sem_tagging ### Model Description : An adapter for the roberta-base model that was trained on the semtag/pmb dataset and includes a prediction head for tagging. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
AdapterHub/roberta-base-pf-qnli
https://huggingface.co/AdapterHub/roberta-base-pf-qnli
An adapter for the roberta-base model that was trained on the nli/qnli dataset and includes a prediction head for classification. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : AdapterHub/roberta-base-pf-qnli ### Model URL : https://huggingface.co/AdapterHub/roberta-base-pf-qnli ### Model Description : An adapter for the roberta-base model that was trained on the nli/qnli dataset and includes a prediction head for classification. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
AdapterHub/roberta-base-pf-qqp
https://huggingface.co/AdapterHub/roberta-base-pf-qqp
An adapter for the roberta-base model that was trained on the sts/qqp dataset and includes a prediction head for classification. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : AdapterHub/roberta-base-pf-qqp ### Model URL : https://huggingface.co/AdapterHub/roberta-base-pf-qqp ### Model Description : An adapter for the roberta-base model that was trained on the sts/qqp dataset and includes a prediction head for classification. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
AdapterHub/roberta-base-pf-quail
https://huggingface.co/AdapterHub/roberta-base-pf-quail
An adapter for the roberta-base model that was trained on the quail dataset and includes a prediction head for multiple choice. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : AdapterHub/roberta-base-pf-quail ### Model URL : https://huggingface.co/AdapterHub/roberta-base-pf-quail ### Model Description : An adapter for the roberta-base model that was trained on the quail dataset and includes a prediction head for multiple choice. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
AdapterHub/roberta-base-pf-quartz
https://huggingface.co/AdapterHub/roberta-base-pf-quartz
An adapter for the roberta-base model that was trained on the quartz dataset and includes a prediction head for multiple choice. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : AdapterHub/roberta-base-pf-quartz ### Model URL : https://huggingface.co/AdapterHub/roberta-base-pf-quartz ### Model Description : An adapter for the roberta-base model that was trained on the quartz dataset and includes a prediction head for multiple choice. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
AdapterHub/roberta-base-pf-quoref
https://huggingface.co/AdapterHub/roberta-base-pf-quoref
An adapter for the roberta-base model that was trained on the quoref dataset and includes a prediction head for question answering. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : AdapterHub/roberta-base-pf-quoref ### Model URL : https://huggingface.co/AdapterHub/roberta-base-pf-quoref ### Model Description : An adapter for the roberta-base model that was trained on the quoref dataset and includes a prediction head for question answering. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
AdapterHub/roberta-base-pf-race
https://huggingface.co/AdapterHub/roberta-base-pf-race
An adapter for the roberta-base model that was trained on the rc/race dataset and includes a prediction head for multiple choice. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : AdapterHub/roberta-base-pf-race ### Model URL : https://huggingface.co/AdapterHub/roberta-base-pf-race ### Model Description : An adapter for the roberta-base model that was trained on the rc/race dataset and includes a prediction head for multiple choice. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
AdapterHub/roberta-base-pf-record
https://huggingface.co/AdapterHub/roberta-base-pf-record
An adapter for the roberta-base model that was trained on the rc/record dataset and includes a prediction head for classification. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : AdapterHub/roberta-base-pf-record ### Model URL : https://huggingface.co/AdapterHub/roberta-base-pf-record ### Model Description : An adapter for the roberta-base model that was trained on the rc/record dataset and includes a prediction head for classification. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
AdapterHub/roberta-base-pf-rotten_tomatoes
https://huggingface.co/AdapterHub/roberta-base-pf-rotten_tomatoes
An adapter for the roberta-base model that was trained on the sentiment/rotten_tomatoes dataset and includes a prediction head for classification. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : AdapterHub/roberta-base-pf-rotten_tomatoes ### Model URL : https://huggingface.co/AdapterHub/roberta-base-pf-rotten_tomatoes ### Model Description : An adapter for the roberta-base model that was trained on the sentiment/rotten_tomatoes dataset and includes a prediction head for classification. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
AdapterHub/roberta-base-pf-rte
https://huggingface.co/AdapterHub/roberta-base-pf-rte
An adapter for the roberta-base model that was trained on the nli/rte dataset and includes a prediction head for classification. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : AdapterHub/roberta-base-pf-rte ### Model URL : https://huggingface.co/AdapterHub/roberta-base-pf-rte ### Model Description : An adapter for the roberta-base model that was trained on the nli/rte dataset and includes a prediction head for classification. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
AdapterHub/roberta-base-pf-scicite
https://huggingface.co/AdapterHub/roberta-base-pf-scicite
An adapter for the roberta-base model that was trained on the scicite dataset and includes a prediction head for classification. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : AdapterHub/roberta-base-pf-scicite ### Model URL : https://huggingface.co/AdapterHub/roberta-base-pf-scicite ### Model Description : An adapter for the roberta-base model that was trained on the scicite dataset and includes a prediction head for classification. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
AdapterHub/roberta-base-pf-scitail
https://huggingface.co/AdapterHub/roberta-base-pf-scitail
An adapter for the roberta-base model that was trained on the nli/scitail dataset and includes a prediction head for classification. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : AdapterHub/roberta-base-pf-scitail ### Model URL : https://huggingface.co/AdapterHub/roberta-base-pf-scitail ### Model Description : An adapter for the roberta-base model that was trained on the nli/scitail dataset and includes a prediction head for classification. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
AdapterHub/roberta-base-pf-sick
https://huggingface.co/AdapterHub/roberta-base-pf-sick
An adapter for the roberta-base model that was trained on the nli/sick dataset and includes a prediction head for classification. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : AdapterHub/roberta-base-pf-sick ### Model URL : https://huggingface.co/AdapterHub/roberta-base-pf-sick ### Model Description : An adapter for the roberta-base model that was trained on the nli/sick dataset and includes a prediction head for classification. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
AdapterHub/roberta-base-pf-snli
https://huggingface.co/AdapterHub/roberta-base-pf-snli
An adapter for the roberta-base model that was trained on the snli dataset and includes a prediction head for classification. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : AdapterHub/roberta-base-pf-snli ### Model URL : https://huggingface.co/AdapterHub/roberta-base-pf-snli ### Model Description : An adapter for the roberta-base model that was trained on the snli dataset and includes a prediction head for classification. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
AdapterHub/roberta-base-pf-social_i_qa
https://huggingface.co/AdapterHub/roberta-base-pf-social_i_qa
An adapter for the roberta-base model that was trained on the social_i_qa dataset and includes a prediction head for multiple choice. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : AdapterHub/roberta-base-pf-social_i_qa ### Model URL : https://huggingface.co/AdapterHub/roberta-base-pf-social_i_qa ### Model Description : An adapter for the roberta-base model that was trained on the social_i_qa dataset and includes a prediction head for multiple choice. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
AdapterHub/roberta-base-pf-squad
https://huggingface.co/AdapterHub/roberta-base-pf-squad
An adapter for the roberta-base model that was trained on the qa/squad1 dataset and includes a prediction head for question answering. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : AdapterHub/roberta-base-pf-squad ### Model URL : https://huggingface.co/AdapterHub/roberta-base-pf-squad ### Model Description : An adapter for the roberta-base model that was trained on the qa/squad1 dataset and includes a prediction head for question answering. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
AdapterHub/roberta-base-pf-squad_v2
https://huggingface.co/AdapterHub/roberta-base-pf-squad_v2
An adapter for the roberta-base model that was trained on the qa/squad2 dataset and includes a prediction head for question answering. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : AdapterHub/roberta-base-pf-squad_v2 ### Model URL : https://huggingface.co/AdapterHub/roberta-base-pf-squad_v2 ### Model Description : An adapter for the roberta-base model that was trained on the qa/squad2 dataset and includes a prediction head for question answering. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
AdapterHub/roberta-base-pf-sst2
https://huggingface.co/AdapterHub/roberta-base-pf-sst2
An adapter for the roberta-base model that was trained on the sentiment/sst-2 dataset and includes a prediction head for classification. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : AdapterHub/roberta-base-pf-sst2 ### Model URL : https://huggingface.co/AdapterHub/roberta-base-pf-sst2 ### Model Description : An adapter for the roberta-base model that was trained on the sentiment/sst-2 dataset and includes a prediction head for classification. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
AdapterHub/roberta-base-pf-stsb
https://huggingface.co/AdapterHub/roberta-base-pf-stsb
An adapter for the roberta-base model that was trained on the sts/sts-b dataset and includes a prediction head for classification. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : AdapterHub/roberta-base-pf-stsb ### Model URL : https://huggingface.co/AdapterHub/roberta-base-pf-stsb ### Model Description : An adapter for the roberta-base model that was trained on the sts/sts-b dataset and includes a prediction head for classification. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
AdapterHub/roberta-base-pf-swag
https://huggingface.co/AdapterHub/roberta-base-pf-swag
An adapter for the roberta-base model that was trained on the swag dataset and includes a prediction head for multiple choice. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : AdapterHub/roberta-base-pf-swag ### Model URL : https://huggingface.co/AdapterHub/roberta-base-pf-swag ### Model Description : An adapter for the roberta-base model that was trained on the swag dataset and includes a prediction head for multiple choice. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
AdapterHub/roberta-base-pf-trec
https://huggingface.co/AdapterHub/roberta-base-pf-trec
An adapter for the roberta-base model that was trained on the trec dataset and includes a prediction head for classification. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : AdapterHub/roberta-base-pf-trec ### Model URL : https://huggingface.co/AdapterHub/roberta-base-pf-trec ### Model Description : An adapter for the roberta-base model that was trained on the trec dataset and includes a prediction head for classification. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
AdapterHub/roberta-base-pf-ud_deprel
https://huggingface.co/AdapterHub/roberta-base-pf-ud_deprel
An adapter for the roberta-base model that was trained on the deprel/ud_ewt dataset and includes a prediction head for tagging. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : AdapterHub/roberta-base-pf-ud_deprel ### Model URL : https://huggingface.co/AdapterHub/roberta-base-pf-ud_deprel ### Model Description : An adapter for the roberta-base model that was trained on the deprel/ud_ewt dataset and includes a prediction head for tagging. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
AdapterHub/roberta-base-pf-ud_en_ewt
https://huggingface.co/AdapterHub/roberta-base-pf-ud_en_ewt
An adapter for the roberta-base model that was trained on the dp/ud_ewt dataset and includes a prediction head for dependency parsing. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: This adapter was trained using adapter-transformers' example script for dependency parsing. See https://github.com/Adapter-Hub/adapter-transformers/tree/master/examples/dependency-parsing. Scores achieved by dependency parsing adapters on the test set of UD English EWT after training:
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : AdapterHub/roberta-base-pf-ud_en_ewt ### Model URL : https://huggingface.co/AdapterHub/roberta-base-pf-ud_en_ewt ### Model Description : An adapter for the roberta-base model that was trained on the dp/ud_ewt dataset and includes a prediction head for dependency parsing. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: This adapter was trained using adapter-transformers' example script for dependency parsing. See https://github.com/Adapter-Hub/adapter-transformers/tree/master/examples/dependency-parsing. Scores achieved by dependency parsing adapters on the test set of UD English EWT after training:
AdapterHub/roberta-base-pf-ud_pos
https://huggingface.co/AdapterHub/roberta-base-pf-ud_pos
An adapter for the roberta-base model that was trained on the pos/ud_ewt dataset and includes a prediction head for tagging. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : AdapterHub/roberta-base-pf-ud_pos ### Model URL : https://huggingface.co/AdapterHub/roberta-base-pf-ud_pos ### Model Description : An adapter for the roberta-base model that was trained on the pos/ud_ewt dataset and includes a prediction head for tagging. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
AdapterHub/roberta-base-pf-wic
https://huggingface.co/AdapterHub/roberta-base-pf-wic
An adapter for the roberta-base model that was trained on the wordsence/wic dataset and includes a prediction head for classification. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : AdapterHub/roberta-base-pf-wic ### Model URL : https://huggingface.co/AdapterHub/roberta-base-pf-wic ### Model Description : An adapter for the roberta-base model that was trained on the wordsence/wic dataset and includes a prediction head for classification. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
AdapterHub/roberta-base-pf-wikihop
https://huggingface.co/AdapterHub/roberta-base-pf-wikihop
An adapter for the roberta-base model that was trained on the qa/wikihop dataset and includes a prediction head for question answering. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : AdapterHub/roberta-base-pf-wikihop ### Model URL : https://huggingface.co/AdapterHub/roberta-base-pf-wikihop ### Model Description : An adapter for the roberta-base model that was trained on the qa/wikihop dataset and includes a prediction head for question answering. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
AdapterHub/roberta-base-pf-winogrande
https://huggingface.co/AdapterHub/roberta-base-pf-winogrande
An adapter for the roberta-base model that was trained on the comsense/winogrande dataset and includes a prediction head for multiple choice. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : AdapterHub/roberta-base-pf-winogrande ### Model URL : https://huggingface.co/AdapterHub/roberta-base-pf-winogrande ### Model Description : An adapter for the roberta-base model that was trained on the comsense/winogrande dataset and includes a prediction head for multiple choice. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
AdapterHub/roberta-base-pf-wnut_17
https://huggingface.co/AdapterHub/roberta-base-pf-wnut_17
An adapter for the roberta-base model that was trained on the wnut_17 dataset and includes a prediction head for tagging. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : AdapterHub/roberta-base-pf-wnut_17 ### Model URL : https://huggingface.co/AdapterHub/roberta-base-pf-wnut_17 ### Model Description : An adapter for the roberta-base model that was trained on the wnut_17 dataset and includes a prediction head for tagging. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
AdapterHub/roberta-base-pf-yelp_polarity
https://huggingface.co/AdapterHub/roberta-base-pf-yelp_polarity
An adapter for the roberta-base model that was trained on the yelp_polarity dataset and includes a prediction head for classification. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : AdapterHub/roberta-base-pf-yelp_polarity ### Model URL : https://huggingface.co/AdapterHub/roberta-base-pf-yelp_polarity ### Model Description : An adapter for the roberta-base model that was trained on the yelp_polarity dataset and includes a prediction head for classification. This adapter was created for usage with the adapter-transformers library. First, install adapter-transformers: Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More Now, the adapter can be loaded and activated like this: The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found here. Refer to the paper for more information on results. If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
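For the classification-head adapters in this list (for example the yelp_polarity one directly above), inference after loading follows the standard forward-pass pattern. A hedged sketch; the binary label order is an assumption, not taken from the card:

```python
import torch
from transformers import AutoModelWithHeads, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModelWithHeads.from_pretrained("roberta-base")
adapter_name = model.load_adapter("AdapterHub/roberta-base-pf-yelp_polarity", source="hf")
model.active_adapters = adapter_name

# Score a review with the adapter's classification head.
inputs = tokenizer("The food was great and the staff were friendly.", return_tensors="pt")
with torch.no_grad():
    predicted = model(**inputs).logits.argmax(dim=-1).item()
print(predicted)  # assumed mapping: 0 = negative, 1 = positive
```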
Adarsh123/distilbert-base-uncased-finetuned-ner
https://huggingface.co/Adarsh123/distilbert-base-uncased-finetuned-ner
No model card
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : Adarsh123/distilbert-base-uncased-finetuned-ner ### Model URL : https://huggingface.co/Adarsh123/distilbert-base-uncased-finetuned-ner ### Model Description : No model card
Addixz/Sanyx
https://huggingface.co/Addixz/Sanyx
No model card
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : Addixz/Sanyx ### Model URL : https://huggingface.co/Addixz/Sanyx ### Model Description : No model card
Adharsh2608/DialoGPT-small-harrypotter
https://huggingface.co/Adharsh2608/DialoGPT-small-harrypotter
No model card
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : Adharsh2608/DialoGPT-small-harrypotter ### Model URL : https://huggingface.co/Adharsh2608/DialoGPT-small-harrypotter ### Model Description : No model card
AdharshJolly/HarryPotterBot-Model
https://huggingface.co/AdharshJolly/HarryPotterBot-Model
null
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : AdharshJolly/HarryPotterBot-Model ### Model URL : https://huggingface.co/AdharshJolly/HarryPotterBot-Model ### Model Description :
Adi2K/Priv-Consent
https://huggingface.co/Adi2K/Priv-Consent
You can use cURL to access this model: Or Python API:
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : Adi2K/Priv-Consent ### Model URL : https://huggingface.co/Adi2K/Priv-Consent ### Model Description : You can use cURL to access this model: Or Python API:
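The actual cURL and Python snippets were not captured in the crawl. A hedged sketch of the usual Hosted Inference API call for a text-classification model like this one; the endpoint shape is the standard Hugging Face pattern, and whether this particular model is enabled for it is an assumption:

```python
import requests

API_URL = "https://api-inference.huggingface.co/models/Adi2K/Priv-Consent"
headers = {"Authorization": "Bearer hf_xxx"}  # replace with your own access token

# Equivalent cURL:
#   curl -X POST https://api-inference.huggingface.co/models/Adi2K/Priv-Consent \
#        -H "Authorization: Bearer hf_xxx" -d '{"inputs": "We may share your data with partners."}'
response = requests.post(API_URL, headers=headers,
                         json={"inputs": "We may share your data with partners."})
print(response.json())
```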
AdiShenoy0807/DialoGPT-medium-joshua
https://huggingface.co/AdiShenoy0807/DialoGPT-medium-joshua
No model card
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : AdiShenoy0807/DialoGPT-medium-joshua ### Model URL : https://huggingface.co/AdiShenoy0807/DialoGPT-medium-joshua ### Model Description : No model card
Adielcane/Adiel
https://huggingface.co/Adielcane/Adiel
No model card
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : Adielcane/Adiel ### Model URL : https://huggingface.co/Adielcane/Adiel ### Model Description : No model card
Adielcane/Adielcane
https://huggingface.co/Adielcane/Adielcane
No model card
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : Adielcane/Adielcane ### Model URL : https://huggingface.co/Adielcane/Adielcane ### Model Description : No model card
Adil617/wav2vec2-base-timit-demo-colab
https://huggingface.co/Adil617/wav2vec2-base-timit-demo-colab
This model is a fine-tuned version of facebook/wav2vec2-base on the None dataset. It achieves the following results on the evaluation set: More information needed More information needed More information needed The following hyperparameters were used during training:
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : Adil617/wav2vec2-base-timit-demo-colab ### Model URL : https://huggingface.co/Adil617/wav2vec2-base-timit-demo-colab ### Model Description : This model is a fine-tuned version of facebook/wav2vec2-base on the None dataset. It achieves the following results on the evaluation set: More information needed More information needed More information needed The following hyperparameters were used during training:
Adinda/Adinda
https://huggingface.co/Adinda/Adinda
null
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : Adinda/Adinda ### Model URL : https://huggingface.co/Adinda/Adinda ### Model Description :
Adityanawal/testmodel_1
https://huggingface.co/Adityanawal/testmodel_1
No model card
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : Adityanawal/testmodel_1 ### Model URL : https://huggingface.co/Adityanawal/testmodel_1 ### Model Description : No model card
Adnan/UrduNewsHeadlines
https://huggingface.co/Adnan/UrduNewsHeadlines
No model card
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : Adnan/UrduNewsHeadlines ### Model URL : https://huggingface.co/Adnan/UrduNewsHeadlines ### Model Description : No model card
AdrianGzz/DialoGPT-small-harrypotter
https://huggingface.co/AdrianGzz/DialoGPT-small-harrypotter
null
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : AdrianGzz/DialoGPT-small-harrypotter ### Model URL : https://huggingface.co/AdrianGzz/DialoGPT-small-harrypotter ### Model Description :
Adrianaforididk/Jinx
https://huggingface.co/Adrianaforididk/Jinx
No model card
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : Adrianaforididk/Jinx ### Model URL : https://huggingface.co/Adrianaforididk/Jinx ### Model Description : No model card
Advertisement/FischlUWU
https://huggingface.co/Advertisement/FischlUWU
No model card
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : Advertisement/FischlUWU ### Model URL : https://huggingface.co/Advertisement/FischlUWU ### Model Description : No model card
Aero/Tsubomi-Haruno
https://huggingface.co/Aero/Tsubomi-Haruno
null
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : Aero/Tsubomi-Haruno ### Model URL : https://huggingface.co/Aero/Tsubomi-Haruno ### Model Description :
Aeroxas/Botroxas-small
https://huggingface.co/Aeroxas/Botroxas-small
No model card
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : Aeroxas/Botroxas-small ### Model URL : https://huggingface.co/Aeroxas/Botroxas-small ### Model Description : No model card
Aeskybunnie/Me
https://huggingface.co/Aeskybunnie/Me
No model card
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : Aeskybunnie/Me ### Model URL : https://huggingface.co/Aeskybunnie/Me ### Model Description : No model card
AetherIT/DialoGPT-small-Hal
https://huggingface.co/AetherIT/DialoGPT-small-Hal
#HAL
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : AetherIT/DialoGPT-small-Hal ### Model URL : https://huggingface.co/AetherIT/DialoGPT-small-Hal ### Model Description : #HAL
AethiQs-Max/AethiQs_GemBERT_bertje_50k
https://huggingface.co/AethiQs-Max/AethiQs_GemBERT_bertje_50k
No model card
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : AethiQs-Max/AethiQs_GemBERT_bertje_50k ### Model URL : https://huggingface.co/AethiQs-Max/AethiQs_GemBERT_bertje_50k ### Model Description : No model card
AethiQs-Max/aethiqs-base_bertje-data_rotterdam-epochs_10
https://huggingface.co/AethiQs-Max/aethiqs-base_bertje-data_rotterdam-epochs_10
No model card
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : AethiQs-Max/aethiqs-base_bertje-data_rotterdam-epochs_10 ### Model URL : https://huggingface.co/AethiQs-Max/aethiqs-base_bertje-data_rotterdam-epochs_10 ### Model Description : No model card
AethiQs-Max/aethiqs-base_bertje-data_rotterdam-epochs_30-epoch_30
https://huggingface.co/AethiQs-Max/aethiqs-base_bertje-data_rotterdam-epochs_30-epoch_30
No model card
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : AethiQs-Max/aethiqs-base_bertje-data_rotterdam-epochs_30-epoch_30 ### Model URL : https://huggingface.co/AethiQs-Max/aethiqs-base_bertje-data_rotterdam-epochs_30-epoch_30 ### Model Description : No model card
AethiQs-Max/cross_encoder
https://huggingface.co/AethiQs-Max/cross_encoder
No model card
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : AethiQs-Max/cross_encoder ### Model URL : https://huggingface.co/AethiQs-Max/cross_encoder ### Model Description : No model card
AethiQs-Max/s3-v1-20_epochs
https://huggingface.co/AethiQs-Max/s3-v1-20_epochs
No model card
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : AethiQs-Max/s3-v1-20_epochs ### Model URL : https://huggingface.co/AethiQs-Max/s3-v1-20_epochs ### Model Description : No model card
Aftabhussain/Tomato_Leaf_Classifier
https://huggingface.co/Aftabhussain/Tomato_Leaf_Classifier
Autogenerated by HuggingPics🤗🖼️ Create your own image classifier for anything by running the demo on Google Colab. Report any issues with the demo at the github repo.
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : Aftabhussain/Tomato_Leaf_Classifier ### Model URL : https://huggingface.co/Aftabhussain/Tomato_Leaf_Classifier ### Model Description : Autogenerated by HuggingPics🤗🖼️ Create your own image classifier for anything by running the demo on Google Colab. Report any issues with the demo at the github repo.
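HuggingPics checkpoints are ViT-based image classifiers, so the generic image-classification pipeline should apply; a minimal sketch under that assumption (the local file name is a placeholder):

```python
from transformers import pipeline

# HuggingPics models are fine-tuned ViT classifiers, so the standard pipeline works.
classifier = pipeline("image-classification", model="Aftabhussain/Tomato_Leaf_Classifier")
print(classifier("leaf.jpg"))  # path to a local tomato-leaf photo (placeholder)
```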
Ahda/M
https://huggingface.co/Ahda/M
No model card
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : Ahda/M ### Model URL : https://huggingface.co/Ahda/M ### Model Description : No model card
Ahmad/parsT5-base
https://huggingface.co/Ahmad/parsT5-base
A monolingual T5 model for Persian trained on the OSCAR 21.09 (https://oscar-corpus.com/) corpus with a self-supervised method. A 35 GB deduplicated version of the Persian data was used for pre-training the model. It's similar to the English T5 model, but just for Persian. You may need to fine-tune it on your specific task. Example code: Steps: 725000 Accuracy: 0.66 To train the model further, please refer to its GitHub repository at: https://github.com/puraminy/parsT5
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : Ahmad/parsT5-base ### Model URL : https://huggingface.co/Ahmad/parsT5-base ### Model Description : A monolingual T5 model for Persian trained on the OSCAR 21.09 (https://oscar-corpus.com/) corpus with a self-supervised method. A 35 GB deduplicated version of the Persian data was used for pre-training the model. It's similar to the English T5 model, but just for Persian. You may need to fine-tune it on your specific task. Example code: Steps: 725000 Accuracy: 0.66 To train the model further, please refer to its GitHub repository at: https://github.com/puraminy/parsT5
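The "Example code:" referenced in the card was not captured in the crawl. A minimal sketch of loading the checkpoint, assuming the tokenizer and weights load through the standard Hugging Face T5 classes:

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("Ahmad/parsT5-base")
model = T5ForConditionalGeneration.from_pretrained("Ahmad/parsT5-base")

# The checkpoint is pre-trained only (span corruption), so fine-tune it on your task
# before expecting task-specific output; this just shows that loading and generation run.
inputs = tokenizer("یک جملهٔ فارسی", return_tensors="pt")  # any Persian sentence
outputs = model.generate(**inputs, max_length=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```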
Ahmad/parsT5
https://huggingface.co/Ahmad/parsT5
A checkpoint for training a Persian T5 model. This repository can be cloned and pre-training can be resumed. This model uses Flax and is intended for training. For more information and to get the training code, please refer to: https://github.com/puraminy/parsT5
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : Ahmad/parsT5 ### Model URL : https://huggingface.co/Ahmad/parsT5 ### Model Description : A checkpoint for training a Persian T5 model. This repository can be cloned and pre-training can be resumed. This model uses Flax and is intended for training. For more information and to get the training code, please refer to: https://github.com/puraminy/parsT5
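A hedged sketch of picking this checkpoint back up for further pre-training, assuming the repository stores standard Flax T5 weights; the actual training loop lives in the linked GitHub repository:

```python
from transformers import FlaxT5ForConditionalGeneration

# Resume from the published weights; the optimizer state is not part of the checkpoint
# (assumption), so the pre-training script would re-initialize its optimizer around these params.
model = FlaxT5ForConditionalGeneration.from_pretrained("Ahmad/parsT5")
```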
Ahmadatiya97/Alannah
https://huggingface.co/Ahmadatiya97/Alannah
No model card
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : Ahmadatiya97/Alannah ### Model URL : https://huggingface.co/Ahmadatiya97/Alannah ### Model Description : No model card
Ahmadvakili/A
https://huggingface.co/Ahmadvakili/A
No model card
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : Ahmadvakili/A ### Model URL : https://huggingface.co/Ahmadvakili/A ### Model Description : No model card
Ahmed59/Demo-Team-5-SIAD
https://huggingface.co/Ahmed59/Demo-Team-5-SIAD
No model card
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : Ahmed59/Demo-Team-5-SIAD ### Model URL : https://huggingface.co/Ahmed59/Demo-Team-5-SIAD ### Model Description : No model card
AhmedBou/TuniBert
https://huggingface.co/AhmedBou/TuniBert
This is a fine-tuned BERT model on Tunisian dialect text (Used dataset: AhmedBou/Tunisian-Dialect-Corpus), ready for sentiment analysis and classification tasks. LABEL_1: Positive LABEL_2: Negative LABEL_0: Neutral This work is an integral component of my Master's degree thesis and represents the culmination of extensive research and labor. If you wish to utilize the Tunisian-Dialect-Corpus or the TuniBert model, kindly refer to the directory provided. [huggingface.co/AhmedBou][github.com/BoulahiaAhmed]
Indicators looking for configurations to recommend AI models for configuring AI agents ### Model Name : AhmedBou/TuniBert ### Model URL : https://huggingface.co/AhmedBou/TuniBert ### Model Description : This is a fine-tuned BERT model on Tunisian dialect text (Used dataset: AhmedBou/Tunisian-Dialect-Corpus), ready for sentiment analysis and classification tasks. LABEL_1: Positive LABEL_2: Negative LABEL_0: Neutral This work is an integral component of my Master's degree thesis and represents the culmination of extensive research and labor. If you wish to utilize the Tunisian-Dialect-Corpus or the TuniBert model, kindly refer to the directory provided. [huggingface.co/AhmedBou][github.com/BoulahiaAhmed]
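The card gives the label mapping but no usage snippet. A minimal sketch, assuming the checkpoint is a standard sequence-classification model that works with the text-classification pipeline:

```python
from transformers import pipeline

classifier = pipeline("text-classification", model="AhmedBou/TuniBert")

# Label mapping from the card: LABEL_1 = Positive, LABEL_2 = Negative, LABEL_0 = Neutral.
print(classifier("sample Tunisian-dialect sentence goes here"))  # placeholder input
```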