
Note

This model was trained with 30k+ ABSA samples; see ABSADatasets. The test sets were not included in training, so you can use this model for training and benchmarking on common ABSA datasets, e.g., the Laptop14 and Rest14 datasets (except for the Rest15 dataset!).

DeBERTa for aspect-based sentiment analysis

The deberta-v3-large-absa model is for aspect-based sentiment analysis (ABSA), trained on English datasets from ABSADatasets.

Training Model

This model was trained with the FAST-LCF-BERT architecture from PyABSA, using microsoft/deberta-v3-large as the backbone. To track state-of-the-art ABSA models, please see PyABSA.

Usage

from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("yangheng/deberta-v3-large-absa-v1.1")

model = AutoModelForSequenceClassification.from_pretrained("yangheng/deberta-v3-large-absa-v1.1")
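
A minimal inference sketch follows. The sentence-pair input format (sentence first, aspect term second) and the example sentence are assumptions based on common ABSA usage, not taken from this card; inspect model.config.id2label to confirm the label mapping.

import torch

text = "The food was great but the service was slow."  # hypothetical example sentence
aspect = "service"                                      # aspect term to classify

# Encode the sentence and the aspect as a sentence pair (assumed input format).
inputs = tokenizer(text, aspect, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Map the predicted class id to its label via the model config.
print(model.config.id2label[logits.argmax(dim=-1).item()])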

Example in PyABSA

An example of using FAST-LCF-BERT with PyABSA datasets is sketched below.
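
This is a minimal training sketch assuming the PyABSA v1.x functional API; module paths, class names, and option names may differ across PyABSA releases, so treat it as illustrative rather than definitive.

from pyabsa.functional import Trainer, APCConfigManager
from pyabsa.functional import APCModelList, ABSADatasetList

# Configure FAST-LCF-BERT with a DeBERTa-v3 backbone.
config = APCConfigManager.get_apc_config_english()
config.model = APCModelList.FAST_LCF_BERT
config.pretrained_bert = "microsoft/deberta-v3-large"

# Train on a dataset bundled with ABSADatasets; Laptop14 is just an example.
Trainer(
    config=config,
    dataset=ABSADatasetList.Laptop14,
    checkpoint_save_mode=1,  # save the fine-tuned checkpoint
    auto_device=True,        # use CUDA if available
)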

Datasets

This model was fine-tuned on 180k examples from the ABSA datasets below (including augmented data). Training dataset files:

loading: integrated_datasets/apc_datasets/SemEval/laptop14/Laptops_Train.xml.seg
loading: integrated_datasets/apc_datasets/SemEval/restaurant14/Restaurants_Train.xml.seg
loading: integrated_datasets/apc_datasets/SemEval/restaurant16/restaurant_train.raw
loading: integrated_datasets/apc_datasets/ACL_Twitter/acl-14-short-data/train.raw
loading: integrated_datasets/apc_datasets/MAMS/train.xml.dat
loading: integrated_datasets/apc_datasets/Television/Television_Train.xml.seg
loading: integrated_datasets/apc_datasets/TShirt/Menstshirt_Train.xml.seg
loading: integrated_datasets/apc_datasets/Yelp/yelp.train.txt

If you use this model in your research, please cite our paper:

@article{YangZMT21,
  author    = {Heng Yang and
               Biqing Zeng and
               Mayi Xu and
               Tianxing Wang},
  title     = {Back to Reality: Leveraging Pattern-driven Modeling to Enable Affordable
               Sentiment Dependency Learning},
  journal   = {CoRR},
  volume    = {abs/2110.08604},
  year      = {2021},
  url       = {https://arxiv.org/abs/2110.08604},
  eprinttype = {arXiv},
  eprint    = {2110.08604},
  timestamp = {Fri, 22 Oct 2021 13:33:09 +0200},
  biburl    = {https://dblp.org/rec/journals/corr/abs-2110-08604.bib},
  bibsource = {dblp computer science bibliography, https://dblp.org}
}