
# UoM&MMU at TSAR-2022 Shared Task - Prompt Learning for Lexical Simplification: prompt-ls-es-1

We present PromptLS, a method for fine-tuning large pre-trained masked language models to perform Lexical Simplification. This model is part of a series of models presented at the TSAR-2022 Shared Task by the University of Manchester and Manchester Metropolitan University (UoM&MMU) team for English, Spanish, and Portuguese. You can find more details about the project in our GitHub repository.

## Models

Our models were fine-tuned using prompt-learning for Lexical Simplification. These are the available models you can use (current model page in bold):

| Model Name | Run # | Language | Setting |
|---|---|---|---|
| prompt-ls-en-1 | 1 | English | fine-tune |
| prompt-ls-en-2 | 2 | English | fine-tune |
| roberta-large | 3 | English | zero-shot |
| **prompt-ls-es-1** | 1 | Spanish | fine-tune |
| prompt-ls-es-2 | 2 | Spanish | fine-tune |
| prompt-ls-es-3 | 3 | Spanish | fine-tune |
| prompt-ls-pt-1 | 1 | Portuguese | fine-tune |
| prompt-ls-pt-2 | 2 | Portuguese | fine-tune |
| prompt-ls-pt-3 | 3 | Portuguese | fine-tune |

For the zero-shot setting, we used the original models with no further training. Links to these models are also included in the table above.
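As a minimal sketch of how a prompt-based lexical simplifier can be queried with a fine-tuned masked language model like this one: a prompt (e.g. "sinónimo fácil", one of the Spanish prompts from our results) and a mask slot are appended to the original sentence, and the model's fill-mask predictions are taken as substitution candidates. The exact template below is illustrative and may differ from the one used in the paper; `build_prompt` is a hypothetical helper.

```python
def build_prompt(sentence: str,
                 complex_word: str,
                 prompt_words: str = "sinónimo fácil",
                 mask_token: str = "<mask>") -> str:
    """Append prompt words and a mask slot for the complex word.

    Illustrative template only; the paper's exact format may differ.
    `mask_token` depends on the tokenizer (e.g. "<mask>" for
    RoBERTa-style models such as BERTIN, "[MASK]" for BERT-style).
    """
    return f"{sentence} {prompt_words} de {complex_word}: {mask_token}"


# With the checkpoint downloaded, candidates can then be ranked via the
# Hugging Face fill-mask pipeline (hypothetical usage; substitute the full
# Hub repository ID of this model):
#
#   from transformers import pipeline
#   simplifier = pipeline("fill-mask", model="<org>/prompt-ls-es-1")
#   text = build_prompt("Se trata de un edificio vetusto.", "vetusto")
#   candidates = simplifier(text, top_k=10)
#   print([c["token_str"] for c in candidates])
```

The top-k predictions for the mask position serve as simplification candidates, which can then be filtered (e.g. removing the complex word itself) before ranking.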

## Results

We include the official results from the competition test set as a reference. However, we encourage users to also check our results on the development set, which show improved performance for Spanish and Portuguese. You can find more details in our paper.

| Language | # | Model | Setting | Prompt1 | Prompt2 | w | k | Acc@1 | A@3 | M@3 | P@3 |
|---|---|---|---|---|---|---|---|---|---|---|---|
| English | 1 | RoBERTa-L | fine-tune | simple | word | 5 | 5 | 0.6353 | 0.5308 | 0.4244 | 0.8739 |
| English | 2 | mBERT | multilingual | easier | word | 10 | 10 | 0.4959 | 0.4235 | 0.3273 | 0.7560 |
| English | 3 | RoBERTa-L | zero-shot | easier | word | 5 | - | 0.2654 | 0.2680 | 0.1820 | 0.4906 |
| Spanish | 1 | BERTIN | fine-tune | sinónimo | fácil | - | 3 | 0.3451 | 0.2907 | 0.2238 | 0.5543 |
| Spanish | 2 | BERTIN | fine-tune | palabra | simple | - | 10 | 0.3614 | 0.2907 | 0.2225 | 0.5380 |
| Spanish | 3 | BERTIN | fine-tune | sinónimo | fácil | 10 | 10 | 0.3668 | 0.2690 | 0.2128 | 0.5326 |
| Portuguese | 1 | BR_BERTo | fine-tune | palavra | simples | - | 8 | 0.1711 | 0.1096 | 0.1011 | 0.2486 |
| Portuguese | 2 | BR_BERTo | fine-tune | sinônimo | fácil | - | 10 | 0.1363 | 0.0962 | 0.0944 | 0.2379 |
| Portuguese | 3 | BR_BERTo | fine-tune | sinônimo | simples | 5 | 10 | 0.1577 | 0.1283 | 0.1071 | 0.2834 |

## Citation

If you use our results and scripts in your research, please cite our work: "UoM&MMU at TSAR-2022 Shared Task: Prompt Learning for Lexical Simplification".

```bibtex
@inproceedings{vasquez-rodriguez-etal-2022-prompt-ls,
    title = "UoM\&MMU at TSAR-2022 Shared Task: Prompt Learning for Lexical Simplification",
    author = "V{\'a}squez-Rodr{\'\i}guez, Laura  and
      Nguyen, Nhung T. H. and
      Shardlow, Matthew and
      Ananiadou, Sophia",
    booktitle = "Shared Task on Text Simplification, Accessibility, and Readability (TSAR-2022), EMNLP 2022",
    month = dec,
    year = "2022",
}
```