
finetuned-distilbert-news-article-categorization

This model is a fine-tuned version of distilbert-base-uncased on the news_article_categorization dataset. It achieves the following results on the evaluation set:

  • Loss: 0.1548
  • F1 score (weighted): 0.96
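
The model can be loaded through the standard transformers text-classification pipeline. A minimal usage sketch (the example article is a placeholder; the predicted label names come from the model's own config):

```python
from transformers import pipeline

# Load the fine-tuned DistilBERT classifier from the Hugging Face Hub.
classifier = pipeline(
    "text-classification",
    model="valurank/finetuned-distilbert-news-article-categorization",
)

# Classify a news article; the predicted label is taken from the model's id2label mapping.
article = "The central bank raised interest rates by a quarter point on Wednesday."
print(classifier(article))
# -> [{'label': ..., 'score': ...}]
```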

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

The model was trained on a subset of the news_article_categorization dataset and validated on the remaining held-out portion of the data.
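
A sketch of how such a split could be produced with the datasets library; the dataset identifier and the split fraction are assumptions, not values stated on this card:

```python
from datasets import load_dataset

# Hypothetical: load the dataset by the name given above; the exact Hub identifier may differ.
dataset = load_dataset("news_article_categorization", split="train")

# Hold out 20% for validation (the actual split ratio is not stated on this card).
splits = dataset.train_test_split(test_size=0.2, seed=17)
train_ds, eval_ds = splits["train"], splits["test"]
```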

Training procedure

More information needed

Training hyperparameters

The following hyperparameters were used during training (a sketch mapping them to transformers.TrainingArguments follows the list):

  • learning_rate: 1e-5
  • train_batch_size: 3
  • eval_batch_size: 3
  • seed: 17
  • optimizer: AdamW with learning_rate=1e-05 and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 0
  • num_epochs: 2
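
A minimal sketch of how these values map onto transformers.TrainingArguments; the output_dir is a placeholder and the original training script is not published with this card:

```python
from transformers import TrainingArguments

# Hypothetical mapping of the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="finetuned-distilbert-news-article-categorization",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=3,
    per_device_eval_batch_size=3,
    seed=17,
    adam_epsilon=1e-8,            # AdamW epsilon from the card
    lr_scheduler_type="linear",
    warmup_steps=0,
    num_train_epochs=2,
)
```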

Training results

Training Loss   Epoch   Validation Loss   F1 score (weighted)
0.6359          1.0     0.1739            0.9619
0.1548          2.0     0.1898            0.9648
