
DistilRoBERTa fine-tuned for news classification

This model is based on the distilroberta-base pretrained weights, with a classification head fine-tuned to classify news articles into three categories: bad, medium, and good.
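
If the model is published on the Hugging Face Hub, it can be used with the standard transformers text-classification pipeline. The sketch below uses a placeholder repository id (your-org/distilroberta-news-small) and an illustrative output; replace both with the actual checkpoint and its label names.

```python
from transformers import pipeline

# Placeholder repository id; swap in this model's actual Hub id.
classifier = pipeline(
    "text-classification",
    model="your-org/distilroberta-news-small",
)

print(classifier("Markets rallied after the surprise rate cut."))
# e.g. [{'label': 'good', 'score': 0.91}]  (label names depend on the model config)
```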

Training data

The model was fine-tuned on news-small, a 300-article news dataset manually annotated by Alex.

Inputs

Like its base model, this model accepts inputs of at most 512 tokens; longer articles must be truncated.
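
Truncation can be applied explicitly at tokenization time. The sketch below assumes the fine-tuned model keeps the distilroberta-base tokenizer (typical for a fine-tune of that checkpoint) and is only illustrative.

```python
from transformers import AutoTokenizer

# Assumes the fine-tuned model shares the distilroberta-base tokenizer.
tokenizer = AutoTokenizer.from_pretrained("distilroberta-base")

long_article = "Breaking news. " * 2000  # deliberately longer than 512 tokens
encoded = tokenizer(
    long_article,
    truncation=True,   # discard tokens beyond max_length
    max_length=512,    # the model's maximum input length
    return_tensors="pt",
)
print(encoded["input_ids"].shape)  # torch.Size([1, 512])
```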
