# hyy33 at WASSA 2024 Track 2
This repository presents our scripts and models for Track 2 of the WASSA 2024 Shared Task at ACL 2024.
Pre-trained BERT and DeBERTa models are fine-tuned with the CombinedLoss and FGM adversarial training for Emotion and Empathy Prediction from Conversation Turns.
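The sketch below shows the standard FGM (Fast Gradient Method) adversarial-training recipe this setup relies on: perturb the word-embedding weights along the normalized gradient, run a second backward pass on the perturbed input, then restore the embeddings. It is a minimal PyTorch illustration, not the repository's exact code; `model`, `combined_loss`, `loader`, and `optimizer` are placeholders.

```python
import torch

class FGM:
    """Fast Gradient Method: add an epsilon-scaled, gradient-direction
    perturbation to the embedding weights, then restore them afterwards."""

    def __init__(self, model, epsilon=1.0, emb_name="word_embeddings"):
        self.model = model
        self.epsilon = epsilon
        self.emb_name = emb_name  # substring identifying embedding parameters
        self.backup = {}

    def attack(self):
        for name, param in self.model.named_parameters():
            if param.requires_grad and self.emb_name in name and param.grad is not None:
                self.backup[name] = param.data.clone()
                norm = torch.norm(param.grad)
                if norm != 0 and not torch.isnan(norm):
                    param.data.add_(self.epsilon * param.grad / norm)

    def restore(self):
        for name, param in self.model.named_parameters():
            if name in self.backup:
                param.data = self.backup[name]
        self.backup = {}

# Schematic training step (placeholders: model, combined_loss, loader, optimizer).
# fgm = FGM(model)
# for batch in loader:
#     loss = combined_loss(model(**batch), batch["labels"])
#     loss.backward()            # gradients on the clean example
#     fgm.attack()               # perturb embeddings adversarially
#     loss_adv = combined_loss(model(**batch), batch["labels"])
#     loss_adv.backward()        # accumulate gradients on the perturbed example
#     fgm.restore()              # restore original embedding weights
#     optimizer.step()
#     optimizer.zero_grad()
```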
## Publication
WASSA 2024 Shared Task paper (ACL 2024 workshop):
hyy33 at WASSA 2024 Empathy and Personality Shared Task: Using the CombinedLoss and FGM for Enhancing BERT-based Models in Emotion and Empathy Prediction from Conversation Turns
## Scripts
Scripts for fine-tuning BERT and DeBERTa in downstream classification and regression tasks.
[https://github.com/hyy-33/hyy33-WASSA-2024-Track-2/tree/main](https://github.com/hyy-33/hyy33-WASSA-2024-Track-2/tree/main)
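For the regression targets evaluated by Pearson correlation, one illustrative way to combine objectives is a weighted sum of MSE and a correlation-based term. The snippet below is a hypothetical sketch of that idea only; the actual CombinedLoss definition is given in the paper.

```python
import torch
import torch.nn as nn

class CombinedRegressionLoss(nn.Module):
    """Hypothetical combined objective: alpha * MSE + (1 - alpha) * (1 - Pearson r).
    Illustrative only; see the publication for the paper's CombinedLoss."""

    def __init__(self, alpha=0.5, eps=1e-8):
        super().__init__()
        self.alpha = alpha
        self.eps = eps
        self.mse = nn.MSELoss()

    def forward(self, preds, targets):
        preds, targets = preds.view(-1), targets.view(-1)
        mse = self.mse(preds, targets)
        p = preds - preds.mean()
        t = targets - targets.mean()
        pearson = (p * t).sum() / (p.norm() * t.norm() + self.eps)
        return self.alpha * mse + (1 - self.alpha) * (1 - pearson)
```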
## Models
The fine-tuned models we submitted for the final Track 2 results. On the test set, they achieved Pearson correlations of 0.581 for Emotion, 0.644 for Emotional Polarity, and 0.544 for Empathy, for an average of 0.590, which ranked 4th among all teams.