DistilBERT (uncased) for FakeNews Classification

This is a text-classification model built by fine-tuning the DistilBERT base (uncased) model. It was trained on the fake-and-real-news-dataset for five epochs.

NOTE: This model is just a proof-of-concept (POC) built for a fellowship I was applying for.

Intended uses & limitations

Note that this model is primarily aimed at classifying a news article as either "Fake" or "Real".

How to use

Check this notebook on Kaggle.
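
Alternatively, here is a minimal sketch of loading the model with the Hugging Face transformers pipeline. It assumes the model is hosted on the Hub as anwarvic/distilbert-base-uncased-for-fakenews and that the output labels map to "Fake"/"Real"; the example article and output are illustrative only.

```python
# Minimal usage sketch (not from the original card).
from transformers import pipeline

# Load the fine-tuned DistilBERT classifier from the Hub.
classifier = pipeline(
    "text-classification",
    model="anwarvic/distilbert-base-uncased-for-fakenews",
)

# Classify a sample article (hypothetical text).
article = "Scientists announce a breakthrough in renewable energy storage."
print(classifier(article))
# Illustrative output: [{'label': 'Fake', 'score': 0.97}]
```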

