---
license: apache-2.0
base_model: distilbert/distilbert-base-uncased
tags:
  - generated_from_trainer
metrics:
  - accuracy
model-index:
  - name: Fake-News-Detector
    results: []
widget:
  - text: >-
      In a shocking turn of events, reports have surfaced suggesting that a
      clandestine meeting of world leaders took place on Mars to discuss plans
      for the colonization of the Red Planet. According to anonymous sources
      within the highest echelons of government, the summit was organized by a
      coalition of space agencies and private corporations aiming to expedite
      humanity's expansion beyond Earth. The meeting purportedly took place in a
      hidden underground facility on Mars, accessible only to a select few
      individuals privy to the ambitious project.
    example_title: Mars Meeting
  - text: >-
      In a groundbreaking revelation that has sent shockwaves through the
      scientific community, Dr. Rachel Bennett, a renowned researcher at the
      prestigious Cambridge Institute of Biotechnology, claims to have unlocked
      the elusive secret to eternal youth. According to Dr. Bennett, years of
      tireless research have culminated in the discovery of a revolutionary
      anti-aging compound derived from a rare Amazonian plant known only to
      indigenous tribes. Initial trials on laboratory mice have yielded
      astonishing results, with subjects exhibiting signs of reversed aging and
      enhanced vitality.
    example_title: Dr. Bennett
  - text: Apples are orange
    example_title: Oranges are Apples
  - text: Donald Trump is the 45th president of the United States.
    example_title: True News
datasets:
  - AlexanderHolmes0/true-fake-news
language:
  - en
pipeline_tag: text-classification
---

# Fake-News-Detector

This model is a fine-tuned version of [distilbert/distilbert-base-uncased](https://huggingface.co/distilbert/distilbert-base-uncased) on the [true-fake-news](https://huggingface.co/datasets/AlexanderHolmes0/true-fake-news) dataset. It achieves the following results on the evaluation set (a quick inference example follows the metrics):

- Loss: 0.0096
- Accuracy: 0.9976
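
For quick experimentation, the model can be loaded with the `transformers` `pipeline` API. A minimal sketch follows; the repository id `AlexanderHolmes0/fake-news-detector` and the returned label names are assumptions based on this card, so adjust them to match the actual Hub path and model config:

```python
from transformers import pipeline

# Assumed Hub repo id; replace with the actual path if it differs.
classifier = pipeline(
    "text-classification",
    model="AlexanderHolmes0/fake-news-detector",
)

headline = (
    "In a shocking turn of events, reports have surfaced suggesting that a "
    "clandestine meeting of world leaders took place on Mars."
)

# Returns a list of dicts such as [{'label': ..., 'score': ...}];
# the exact label names depend on the model's config.
print(classifier(headline))
```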

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a minimal training sketch based on them follows the list):

- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
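
The sketch below mirrors these hyperparameters with the Hugging Face `Trainer`; it is not the exact script used for this model. The dataset column names (`text`, `label`), the split names, and the `compute_metrics` helper built on the `evaluate` library are assumptions:

```python
import numpy as np
import evaluate
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# Column and split names are assumptions about the dataset schema.
dataset = load_dataset("AlexanderHolmes0/true-fake-news")
tokenizer = AutoTokenizer.from_pretrained("distilbert/distilbert-base-uncased")


def tokenize(batch):
    return tokenizer(batch["text"], truncation=True)


tokenized = dataset.map(tokenize, batched=True)

model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert/distilbert-base-uncased", num_labels=2
)

accuracy = evaluate.load("accuracy")


def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return accuracy.compute(predictions=predictions, references=labels)


# Mirrors the hyperparameters listed above; evaluating every 100 steps
# matches the cadence of the training results table below.
training_args = TrainingArguments(
    output_dir="fake-news-detector",
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=1,
    evaluation_strategy="steps",
    eval_steps=100,
)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["test"],  # split name is an assumption
    tokenizer=tokenizer,
    compute_metrics=compute_metrics,
)
trainer.train()
```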

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.1809        | 0.09  | 100  | 0.0608          | 0.9840   |
| 0.0433        | 0.18  | 200  | 0.0222          | 0.9933   |
| 0.0248        | 0.27  | 300  | 0.0631          | 0.9834   |
| 0.0246        | 0.36  | 400  | 0.0363          | 0.9903   |
| 0.0223        | 0.45  | 500  | 0.0378          | 0.9906   |
| 0.0172        | 0.53  | 600  | 0.0129          | 0.9969   |
| 0.0133        | 0.62  | 700  | 0.0208          | 0.9947   |
| 0.0188        | 0.71  | 800  | 0.0118          | 0.9971   |
| 0.0134        | 0.8   | 900  | 0.0109          | 0.9971   |
| 0.0055        | 0.89  | 1000 | 0.0096          | 0.9976   |
| 0.0055        | 0.98  | 1100 | 0.0096          | 0.9976   |

### Framework versions

- Transformers 4.38.2
- Pytorch 2.2.0+cu121
- Datasets 2.18.0
- Tokenizers 0.15.1
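
A quick way to check whether a local environment matches these versions:

```python
import datasets
import tokenizers
import torch
import transformers

# Print installed versions to compare against the list above.
for name, module in [
    ("Transformers", transformers),
    ("Pytorch", torch),
    ("Datasets", datasets),
    ("Tokenizers", tokenizers),
]:
    print(f"{name}: {module.__version__}")
```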