---
license: mit
base_model: cardiffnlp/twitter-roberta-base-sentiment-latest
language:
- en
library_name: transformers
tags:
- Roberta
- Sentiment Analysis
widget:
- text: This product is really great!
- text: This product is really bad!
---
# Fine-tuned RoBERTa for Sentiment Analysis on Reviews
This is a fine-tuned version of [cardiffnlp/twitter-roberta-base-sentiment-latest](https://huggingface.co/cardiffnlp/twitter-roberta-base-sentiment-latest), trained on the [Amazon Reviews dataset](https://www.kaggle.com/datasets/bittlingmayer/amazonreviews) for sentiment analysis.
## Model Details
- **Model Name:** `AnkitAI/reviews-roberta-base-sentiment-analysis`
- **Base Model:** `cardiffnlp/twitter-roberta-base-sentiment-latest`
- **Dataset:** [Amazon Reviews](https://www.kaggle.com/datasets/bittlingmayer/amazonreviews)
- **Fine-tuning:** Fine-tuned for sentiment analysis with a classification head for binary sentiment classification (positive and negative).
## Training
The model was trained with the following parameters:
- **Learning Rate:** 2e-5
- **Batch Size:** 16
- **Weight Decay:** 0.01
- **Evaluation Strategy:** Epoch
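The original training script was not published, but the hyperparameters above can be sketched as a `transformers.TrainingArguments` configuration. The output directory, epoch count, and dataset variables below are illustrative assumptions, not values from the actual run:

```python
from transformers import Trainer, TrainingArguments

# Sketch only: output_dir and num_train_epochs are assumptions,
# not reported by the model card.
training_args = TrainingArguments(
    output_dir="./results",            # assumed path
    learning_rate=2e-5,                # from the card
    per_device_train_batch_size=16,    # from the card
    weight_decay=0.01,                 # from the card
    evaluation_strategy="epoch",       # from the card
    num_train_epochs=3,                # assumed
)

# trainer = Trainer(
#     model=model,
#     args=training_args,
#     train_dataset=train_dataset,   # hypothetical dataset objects
#     eval_dataset=eval_dataset,
# )
# trainer.train()
```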
### Training Details
- **Eval Loss:** 0.1049
- **Eval Runtime:** 3177.538 seconds
- **Eval Samples/Second:** 226.591
- **Eval Steps/Second:** 7.081
- **Train Runtime:** 110070.6349 seconds
- **Train Samples/Second:** 78.495
- **Train Steps/Second:** 2.453
- **Train Loss:** 0.0858
- **Eval Accuracy:** 97.19%
- **Eval Precision:** 97.9%
- **Eval Recall:** 97.18%
- **Eval F1 Score:** 97.19%
## Usage
You can use this model directly with the Hugging Face `transformers` library:
```python
import torch
from transformers import RobertaForSequenceClassification, RobertaTokenizer

model_name = "AnkitAI/reviews-roberta-base-sentiment-analysis"
model = RobertaForSequenceClassification.from_pretrained(model_name)
tokenizer = RobertaTokenizer.from_pretrained(model_name)

# Example usage
inputs = tokenizer("This product is great!", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Take the highest-scoring class: 1 for positive, 0 for negative
predicted_class = outputs.logits.argmax(dim=-1).item()
```
## License
This model is licensed under the [MIT License](LICENSE).