
Model Details

This model is a binary classification model fine-tuned on the Fake and Real News Dataset using the BERT (bert-base-uncased) architecture. Its task is to classify news articles as real or fake, making it suitable for fake news detection. BERT (Bidirectional Encoder Representations from Transformers) is a transformer-based model known for its effectiveness in natural language processing tasks.

The model takes the title of a news article as input and classifies it as Reliable or Unreliable.

Bias: The model may inherit biases present in the training data, so potential biases in its predictions should be taken into account.

Code Implementation

# Load model directly
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("Arjun24420/BERT-FakeOrReal-BinaryClassification")
model = AutoModelForSequenceClassification.from_pretrained("Arjun24420/BERT-FakeOrReal-BinaryClassification")
model.eval()

# Define class labels mapping
class_mapping = {
    1: 'Reliable',
    0: 'Unreliable',
}

def predict(text):
    # Tokenize the input text
    inputs = tokenizer(text, padding=True, truncation=True,
                       max_length=512, return_tensors="pt")

    # Get model output (logits) without tracking gradients
    with torch.no_grad():
        outputs = model(**inputs)

    # Convert logits to probabilities and map each class index to its label
    probs = outputs.logits.softmax(dim=1)
    class_probabilities = {class_mapping[i]: probs[0, i].item()
                           for i in range(probs.shape[1])}

    return class_probabilities
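
A minimal usage sketch of the function above (the headline is a made-up example for illustration, not taken from the dataset):

# Example call; the headline is a hypothetical input
class_probabilities = predict("Government announces new economic relief package")
print(class_probabilities)
# e.g. {'Unreliable': 0.12, 'Reliable': 0.88} (values are illustrative)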
Model size: 109M parameters (F32, safetensors format).