---
language:
  - en
base_model:
  - google-t5/t5-large
library_name: transformers
---

# Model: t5-standard-ABSA

**Task:** Aspect-Based Sentiment Analysis (ABSA), specifically Aspect Pair Sentiment Extraction

## Model Description

t5-standard-ABSA is a fine-tuned t5-large model designed to perform Aspect-Based Sentiment Analysis (ABSA), particularly for the task of Aspect Pair Sentiment Extraction.
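Below is a minimal inference sketch using the `transformers` library. The Hub repository id `trichter/t5-standard-ABSA`, the raw-review input format, and the "aspect: sentiment" output format are assumptions for illustration and are not confirmed by this card.

```python
# Minimal inference sketch (repo id, input format, and output format are assumptions).
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "trichter/t5-standard-ABSA"  # assumed Hub repository id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

review = "The new UI looks great, but the app crashes whenever I open the camera."
inputs = tokenizer(review, return_tensors="pt", truncation=True, max_length=512)

# Generate aspect-sentiment pairs as text, e.g. "UI: positive; stability: negative"
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```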

## Dataset

The dataset consists of customer reviews of mobile apps that were originally unannotated; they were scraped and collected by Martens et al. for their paper "On the Emotion of Users in App Reviews". The reviews were then annotated via the OpenAI API using gpt-3.5-turbo, with each review labeled for specific aspects (e.g., UI, functionality, performance) and the corresponding sentiment (positive, negative, or neutral).
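A rough sketch of this kind of GPT-3.5-turbo annotation is shown below. The prompt wording, output schema, and any batching or validation logic used for the actual dataset are not documented here, so treat them as placeholders.

```python
# Sketch of annotating a review with gpt-3.5-turbo via the OpenAI API.
# The prompt and the "aspect: sentiment" schema are placeholders, not the exact
# ones used to build this dataset.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def annotate(review: str) -> str:
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[
            {
                "role": "system",
                "content": (
                    "Extract aspect-sentiment pairs from the app review. "
                    "Answer as 'aspect: sentiment' pairs separated by semicolons, "
                    "using positive, negative, or neutral."
                ),
            },
            {"role": "user", "content": review},
        ],
        temperature=0,
    )
    return response.choices[0].message.content

print(annotate("Love the design, but it drains my battery."))
```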

## Training

Training was performed with Hugging Face's Trainer API in Google Colaboratory on a single L4 GPU with 22.5 GB of VRAM. Training took around 3 hours and cost about 30 compute units.
All code can be found in my GitHub repository.

## Hyperparameters

Some of the key hyperparameters used for fine-tuning (a minimal training sketch using these values follows the list):

- Batch Size: 8
- Gradient Accumulation Steps: 1
- Optimizer: AdamW
- Learning Rate: 1e-4
- Epochs: 5
- Max Sequence Length: 512
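The sketch below shows roughly how these hyperparameters map onto `Seq2SeqTrainingArguments` and `Seq2SeqTrainer`. Dataset loading, tokenization, and evaluation are omitted, and the argument names in the original training script may differ; see the GitHub repository for the actual code.

```python
# Sketch of the fine-tuning setup with the hyperparameters listed above.
# Dataset preparation is omitted; train_dataset/eval_dataset are placeholders.
from transformers import (
    AutoModelForSeq2SeqLM,
    AutoTokenizer,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

model_name = "google-t5/t5-large"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

args = Seq2SeqTrainingArguments(
    output_dir="t5-standard-ABSA",
    per_device_train_batch_size=8,   # Batch Size: 8
    gradient_accumulation_steps=1,   # Gradient Accumulation Steps: 1
    learning_rate=1e-4,              # Learning Rate: 1e-4
    num_train_epochs=5,              # Epochs: 5
    optim="adamw_torch",             # Optimizer: AdamW
)

# Replace with tokenized datasets (inputs and labels, max sequence length 512).
train_dataset = eval_dataset = None

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
# trainer.train()  # uncomment once real datasets are supplied
```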