---
license: apache-2.0
datasets:
- sst2
language:
- en
metrics:
- accuracy
library_name: transformers
pipeline_tag: text-classification
widget:
- text: "this film 's relationship to actual tension is the same as what christmas-tree flocking in a spray can is to actual snow : a poor -- if durable -- imitation ."
- text: "director rob marshall went out gunning to make a great one ."
---
# bert-base-uncased-finetuned-sst2-v2
"bert-base-uncased" finetuned on SST-2.
This model pertains to the "Try it out!" exercise in section 4 of chapter 3 of the Hugging Face "NLP Course" (https://huggingface.co/learn/nlp-course/chapter3/4).
It was trained using a custom PyTorch loop without Hugging Face Accelerate.
Code: https://github.com/sambitmukherjee/hf-nlp-course-exercises/blob/main/chapter3/section4.ipynb
Experiment tracking: https://wandb.ai/sadhaklal/bert-base-uncased-finetuned-sst2-v2
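The loop broadly follows the "A full training" pattern from that chapter. The sketch below is illustrative only: the hyperparameters (batch size, learning rate, number of epochs) and preprocessing choices are assumptions, not necessarily the values used; see the linked notebook for the exact code.

```python
import torch
from torch.utils.data import DataLoader
from datasets import load_dataset
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          DataCollatorWithPadding, get_scheduler)

checkpoint = "bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)

# Tokenize GLUE SST-2 and keep only the columns the model expects.
raw_datasets = load_dataset("glue", "sst2")
tokenized = raw_datasets.map(lambda b: tokenizer(b["sentence"], truncation=True), batched=True)
tokenized = tokenized.remove_columns(["sentence", "idx"]).rename_column("label", "labels")
tokenized.set_format("torch")

data_collator = DataCollatorWithPadding(tokenizer=tokenizer)
train_dataloader = DataLoader(tokenized["train"], shuffle=True, batch_size=16,  # assumed batch size
                              collate_fn=data_collator)

model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model.to(device)

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)  # assumed learning rate
num_epochs = 3                                              # assumed epoch count
num_training_steps = num_epochs * len(train_dataloader)
lr_scheduler = get_scheduler("linear", optimizer=optimizer,
                             num_warmup_steps=0, num_training_steps=num_training_steps)

# Plain PyTorch training loop (no Accelerate): forward, backward, step, zero grad.
model.train()
for epoch in range(num_epochs):
    for batch in train_dataloader:
        batch = {k: v.to(device) for k, v in batch.items()}
        outputs = model(**batch)   # the model returns the loss when "labels" are present
        outputs.loss.backward()
        optimizer.step()
        lr_scheduler.step()
        optimizer.zero_grad()
```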
## Usage
```python
from transformers import pipeline
classifier = pipeline("text-classification", model="sadhaklal/bert-base-uncased-finetuned-sst2-v2")
print(classifier("uneasy mishmash of styles and genres ."))
print(classifier("by the end of no such thing the audience , like beatrice , has a watchful affection for the monster ."))
```
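Or, without the pipeline helper, as a sketch for finer control over batching and devices (whether `id2label` in the config contains human-readable names depends on how the checkpoint was saved):

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "sadhaklal/bert-base-uncased-finetuned-sst2-v2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

sentences = [
    "uneasy mishmash of styles and genres .",
    "by the end of no such thing the audience , like beatrice , has a watchful affection for the monster .",
]
inputs = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
predicted_ids = logits.argmax(dim=-1)
print([model.config.id2label[i.item()] for i in predicted_ids])
```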
## Metric
Accuracy on the `validation` split of SST-2: 0.9278
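A minimal sketch for reproducing this number with the `evaluate` library (the batch size is an assumption; the exact evaluation code is in the linked notebook):

```python
import torch
import evaluate
from datasets import load_dataset
from torch.utils.data import DataLoader
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          DataCollatorWithPadding)

model_id = "sadhaklal/bert-base-uncased-finetuned-sst2-v2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model.to(device)
model.eval()

# Preprocess the SST-2 validation split the same way as during training.
val = load_dataset("glue", "sst2", split="validation")
val = val.map(lambda b: tokenizer(b["sentence"], truncation=True), batched=True)
val = val.remove_columns(["sentence", "idx"]).rename_column("label", "labels")
val.set_format("torch")
loader = DataLoader(val, batch_size=32, collate_fn=DataCollatorWithPadding(tokenizer=tokenizer))

metric = evaluate.load("accuracy")
with torch.no_grad():
    for batch in loader:
        batch = {k: v.to(device) for k, v in batch.items()}
        predictions = model(**batch).logits.argmax(dim=-1)
        metric.add_batch(predictions=predictions, references=batch["labels"])
print(metric.compute())  # should come out close to {'accuracy': 0.9278}
```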