# SWOT Analysis Model based on DistilBERT
This repository hosts a fine-tuned version of `distilbert-base-uncased`, trained to classify SWOT elements (Strengths, Weaknesses, Opportunities, Threats) in Amazon product reviews of smartphones. The model acts as a "synthetic expert": its training annotations combine GPT-4-generated labels with human labeling.
## Model Training and Data
- Base Model: `distilbert-base-uncased`
- Dataset: 9,545 Amazon product reviews.
- Annotations:
  - GPT-4-generated labels for 9,045 reviews.
  - Human-labeled data for 500 reviews as a baseline.
- Task: Multi-label classification of SWOT elements.
## How to Use
This model can be loaded directly via the Hugging Face Transformers library:

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Load the fine-tuned model and its tokenizer from the Hub
model = AutoModelForSequenceClassification.from_pretrained('jcaponigro/SWOT_Classifier')
tokenizer = AutoTokenizer.from_pretrained('jcaponigro/SWOT_Classifier')

# Run inference on a single review
text = "Your text for SWOT analysis."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)  # output.logits holds one raw score per SWOT label
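Because this is a multi-label task, the raw logits are typically decoded with an element-wise sigmoid and a threshold rather than a softmax/argmax. The sketch below shows this decoding step; the label names and their order are assumptions for illustration — consult the model's `config.id2label` for the actual mapping.

```python
import torch

# Hypothetical label order for illustration -- check the model's
# config.id2label attribute for the real index-to-label mapping.
labels = ["Strength", "Weakness", "Opportunity", "Threat"]

def decode_swot(logits, threshold=0.5):
    """Map multi-label logits to SWOT label names via sigmoid + threshold."""
    probs = torch.sigmoid(logits)
    return [labels[i] for i, p in enumerate(probs.squeeze(0)) if p >= threshold]

# Placeholder logits standing in for output.logits from the model above
example_logits = torch.tensor([[2.3, -1.1, 0.8, -3.0]])
print(decode_swot(example_logits))  # only "Strength" and "Opportunity" clear 0.5
```

A fixed 0.5 threshold is a common default; in practice it can be tuned per label on a validation set.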