
Model description:

DistilBERT is trained with knowledge distillation during the pre-training phase, which reduces the size of a BERT model by 40% while retaining 97% of its language-understanding capabilities. It is smaller and faster than BERT.

distilbert-base-uncased was fine-tuned on the fake-news dataset with the following hyperparameters:

 learning rate: 5e-5,
 batch size: 32,
 number of training epochs: 2,
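The fine-tuning recipe above can be sketched with the Hugging Face `transformers` Trainer API. This is a minimal sketch, not the card author's exact code: the CSV filenames (`Fake.csv`, `True.csv`), the `text` column, the `max_length`, and the train/test split are assumptions about the Kaggle dataset's layout; only the model name and the three hyperparameters come from the card.

```python
# Hyperparameters as listed on the model card.
HYPERPARAMS = {
    "learning_rate": 5e-5,
    "per_device_train_batch_size": 32,
    "num_train_epochs": 2,
}


def finetune():
    """Sketch of fine-tuning distilbert-base-uncased for fake-news classification."""
    import pandas as pd
    from datasets import Dataset
    from transformers import (
        AutoModelForSequenceClassification,
        AutoTokenizer,
        Trainer,
        TrainingArguments,
    )

    # Assumed layout: the Kaggle dataset ships as two CSVs, one per class.
    fake = pd.read_csv("Fake.csv").assign(label=0)
    real = pd.read_csv("True.csv").assign(label=1)
    df = pd.concat([fake, real])[["text", "label"]]
    ds = Dataset.from_pandas(df).train_test_split(test_size=0.2, seed=42)

    tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
    tokenized = ds.map(
        lambda batch: tokenizer(batch["text"], truncation=True, max_length=512),
        batched=True,
    )

    model = AutoModelForSequenceClassification.from_pretrained(
        "distilbert-base-uncased", num_labels=2
    )
    args = TrainingArguments(output_dir="distilbert-fakenews", **HYPERPARAMS)
    trainer = Trainer(
        model=model,
        args=args,
        train_dataset=tokenized["train"],
        eval_dataset=tokenized["test"],
        tokenizer=tokenizer,
    )
    trainer.train()
```

Calling `finetune()` downloads the base checkpoint and runs two epochs over the roughly 45k-article dataset, so a GPU is strongly recommended.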

Full code available at DistilBert-FakeNews

Dataset available at Fake News dataset
