distilbert-finetuned-imdb
This model is a fine-tuned version of distilbert-base-uncased on the imdb dataset. It achieves the following results on the evaluation set:
- Loss: 0.2742
- Accuracy: 0.9321
Model description
More information needed
Intended uses & limitations
The model is fine-tuned for sentiment analysis: given a movie review, it classifies the text as 'positive' or 'negative'.
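A minimal usage sketch, assuming the repository id apenev/distilbert-finetuned-imdb and the standard transformers text-classification pipeline; the label names returned depend on the checkpoint's id2label mapping:

```python
from transformers import pipeline

# Load the fine-tuned checkpoint as a text-classification pipeline.
classifier = pipeline(
    "text-classification",
    model="apenev/distilbert-finetuned-imdb",
)

# Classify a single review; the pipeline returns a label and a confidence score.
print(classifier("A surprisingly moving film with terrific performances."))
# e.g. [{'label': 'LABEL_1', 'score': 0.99}]  -- label names depend on the model config
```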
Training and evaluation data
The model is fine-tuned on the IMDB dataset, which consists of 25,000 training records and 25,000 test records. It is trained on the full training split and evaluated on the full test split.
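A minimal sketch of loading these splits with the datasets library, assuming the standard imdb dataset id on the Hub:

```python
from datasets import load_dataset

# The IMDB dataset provides 25,000 labelled reviews each for train and test.
imdb = load_dataset("imdb")
print(imdb["train"].num_rows, imdb["test"].num_rows)  # 25000 25000
```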
Training procedure
Training hyperparameters
The following hyperparameters were used during training (a sketch of how they map onto a training script follows the list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
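A hedged sketch of how these hyperparameters could map onto a transformers Trainer run; the exact training script is not part of this card, so the tokenization and Trainer wiring below are assumptions:

```python
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    DataCollatorWithPadding,
    Trainer,
    TrainingArguments,
)

model_name = "distilbert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Tokenize the IMDB splits (truncation only; dynamic padding is handled by the collator).
imdb = load_dataset("imdb")
tokenized = imdb.map(lambda batch: tokenizer(batch["text"], truncation=True), batched=True)

# Hyperparameters from the list above; the Adam betas/epsilon and the linear
# scheduler are the transformers defaults, so they need no explicit arguments.
training_args = TrainingArguments(
    output_dir="distilbert-finetuned-imdb",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    num_train_epochs=2,
    seed=42,
    lr_scheduler_type="linear",
    evaluation_strategy="epoch",  # assumption: evaluate once per epoch, matching the results table
)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["test"],
    tokenizer=tokenizer,
    data_collator=DataCollatorWithPadding(tokenizer),
)
trainer.train()
```

With a batch size of 8 on 25,000 training examples, one epoch is 3,125 steps, which is consistent with the step counts in the results table below.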
Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|---------------|-------|------|-----------------|----------|
| 0.2686        | 1.0   | 3125 | 0.2484          | 0.9223   |
| 0.1714        | 2.0   | 6250 | 0.2742          | 0.9321   |
Framework versions
- Transformers 4.35.2
- Pytorch 2.1.0+cu118
- Datasets 2.15.0
- Tokenizers 0.15.0