---
language: en
datasets:
- yelp_polarity
---
# RoBERTa-base-finetuned-yelp-polarity
This is a [RoBERTa-base](https://huggingface.co/roberta-base) checkpoint fine-tuned for binary sentiment classification on the [Yelp polarity](https://huggingface.co/nlp/viewer/?dataset=yelp_polarity) dataset.
It achieves **98.08%** accuracy on the test set.
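## How to use
A minimal sketch of loading the checkpoint through the `transformers` pipeline API; the hub repo name used below is an assumption and may need to be adjusted to the actual model ID:
```python
from transformers import pipeline

# Repo name is an assumption; replace with the actual hub ID if it differs.
classifier = pipeline(
    "sentiment-analysis",
    model="VictorSanh/roberta-base-finetuned-yelp-polarity",
)

result = classifier("The food was delicious and the service was great!")
print(result)  # a list with the predicted polarity label and its score
```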
## Hyper-parameters
We used the following hyper-parameters to train the model on one GPU:
```python
num_train_epochs = 2.0
learning_rate = 1e-05
weight_decay = 0.0
adam_epsilon = 1e-08
max_grad_norm = 1.0
per_device_train_batch_size = 32
gradient_accumulation_steps = 1
warmup_steps = 3500
seed = 42
```
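For context, here is a minimal sketch of how these values plug into `transformers.TrainingArguments` and `Trainer`. The original training script is not included in this card, so the dataset loading and tokenization steps below are assumptions, not the exact procedure used:
```python
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# Start from the pretrained RoBERTa-base checkpoint with a 2-class head.
tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "roberta-base", num_labels=2
)

# Yelp polarity: binary sentiment labels over Yelp reviews.
dataset = load_dataset("yelp_polarity")

def tokenize(batch):
    # max_length=512 is an assumption; the card does not state the sequence length.
    return tokenizer(batch["text"], truncation=True, max_length=512)

dataset = dataset.map(tokenize, batched=True)

# Hyper-parameters copied from the block above.
training_args = TrainingArguments(
    output_dir="roberta-base-finetuned-yelp-polarity",
    num_train_epochs=2.0,
    learning_rate=1e-05,
    weight_decay=0.0,
    adam_epsilon=1e-08,
    max_grad_norm=1.0,
    per_device_train_batch_size=32,
    gradient_accumulation_steps=1,
    warmup_steps=3500,
    seed=42,
)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=dataset["train"],
    eval_dataset=dataset["test"],
    tokenizer=tokenizer,  # enables dynamic padding via the default data collator
)
trainer.train()
```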