
bert-base-uncased-ag-news

Model description

bert-base-uncased fine-tuned on the AG News dataset using PyTorch Lightning. The model was trained with a maximum sequence length of 128, a learning rate of 2e-5, a batch size of 32, and 4 epochs on 4 T4 GPUs. The code can be found here.
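
As a minimal usage sketch (not part of the original card), the model can be loaded through the transformers pipeline API; the example headline below is purely illustrative.

```python
# Hedged usage sketch: classify a news headline with the fine-tuned model.
from transformers import pipeline

classifier = pipeline("text-classification", model="nateraw/bert-base-uncased-ag-news")

# Returns a list with one dict per input, each containing a predicted AG News
# topic label and a confidence score.
print(classifier("Stocks rallied on Wall Street after the earnings report."))
```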

Limitations and bias

  • Not the best model...

Training data

The data came from Hugging Face's datasets library. The dataset can be viewed on the nlp viewer.
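
Loading AG News with the datasets library might look like the following sketch (the dataset id "ag_news" on the Hub is assumed):

```python
# Hedged sketch: load the AG News dataset used for fine-tuning.
from datasets import load_dataset

dataset = load_dataset("ag_news")
print(dataset)               # DatasetDict with "train" and "test" splits
print(dataset["train"][0])   # {"text": ..., "label": ...}
```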

Training procedure

...
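
The original training script is not reproduced here. As a rough sketch only, the following PyTorch Lightning code illustrates a fine-tuning loop consistent with the hyperparameters in the model description (sequence length 128, learning rate 2e-5, batch size 32, 4 epochs on 4 T4 GPUs); the class name AGNewsClassifier and every structural choice are assumptions, not the author's code.

```python
# Illustrative fine-tuning sketch (not the original training code).
import pytorch_lightning as pl
import torch
from datasets import load_dataset
from torch.utils.data import DataLoader
from transformers import AutoModelForSequenceClassification, AutoTokenizer


class AGNewsClassifier(pl.LightningModule):  # hypothetical class name
    def __init__(self, model_name="bert-base-uncased", num_labels=4, lr=2e-5):
        super().__init__()
        self.save_hyperparameters()
        self.model = AutoModelForSequenceClassification.from_pretrained(
            model_name, num_labels=num_labels
        )

    def training_step(self, batch, batch_idx):
        outputs = self.model(**batch)
        self.log("train_loss", outputs.loss)
        return outputs.loss

    def configure_optimizers(self):
        return torch.optim.AdamW(self.parameters(), lr=self.hparams.lr)


tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

# Tokenize to a fixed length of 128, matching the model description.
train_data = load_dataset("ag_news", split="train")
train_data = train_data.map(
    lambda batch: tokenizer(
        batch["text"], truncation=True, padding="max_length", max_length=128
    ),
    batched=True,
)
train_data = train_data.rename_column("label", "labels")
train_data.set_format(type="torch", columns=["input_ids", "attention_mask", "labels"])

train_loader = DataLoader(train_data, batch_size=32, shuffle=True)

# The original run used 4 T4 GPUs; adjust `devices` for your hardware.
trainer = pl.Trainer(max_epochs=4, accelerator="gpu", devices=4)
trainer.fit(AGNewsClassifier(), train_loader)
```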

Eval results

...
