Contributed by Nate Raw (nateraw)


Model description

bert-base-uncased fine-tuned on the AG News dataset using PyTorch Lightning. Training used a sequence length of 128, a learning rate of 2e-5, a batch size of 32, 4 T4 GPUs, and 4 epochs. The code can be found here
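As a minimal usage sketch (not the card's own example): the checkpoint could be loaded with the transformers pipeline API. The model id below is a hypothetical guess and the helper name is an assumption; only the four AG News class names are established facts about the dataset.

```python
# The four topic classes of the AG News dataset.
AG_NEWS_LABELS = ["World", "Sports", "Business", "Sci/Tech"]


def classify(texts):
    """Classify news text with the fine-tuned checkpoint.

    Requires the `transformers` package and network access. The model id
    below is a hypothetical placeholder, not confirmed by this card.
    """
    from transformers import pipeline

    clf = pipeline(
        "text-classification",
        model="nateraw/bert-base-uncased-ag-news",  # hypothetical model id
    )
    return clf(texts)
```

Calling `classify(["Stocks rallied after the earnings report."])` would return a list of dicts with `label` and `score` keys, as the transformers text-classification pipeline does.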

Limitations and bias

  • Not the best model...

Training data

The data came from Hugging Face's datasets package. The data can be viewed in the nlp viewer.
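A short sketch of pulling the same data with the datasets package (the split sizes in the comment are the standard AG News release, stated here as background, not taken from this card):

```python
# Standard AG News split sizes: 120,000 train / 7,600 test examples.
AG_NEWS_SPLITS = {"train": 120_000, "test": 7_600}


def load_ag_news():
    """Fetch AG News via Hugging Face `datasets` (requires network access)."""
    from datasets import load_dataset

    return load_dataset("ag_news")
```

Each example carries a `text` field and an integer `label` in 0-3 mapping to World, Sports, Business, and Sci/Tech.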

Training procedure


Eval results