
Description

This model is part of the NLP assignment for the Fatima Fellowship.

This model is a fine-tuned version of bert-base-uncased on the Fake News Dataset.

It achieves the following results on the evaluation set:

  • Accuracy: 0.995
  • Precision: 0.995
  • Recall: 0.995
  • F1 score: 0.995

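The evaluation script is not included in this card; the snippet below is a minimal sketch of how such metrics are typically computed with scikit-learn, using made-up labels and predictions purely for illustration.

from sklearn.metrics import accuracy_score, precision_recall_fscore_support

# Hypothetical gold labels and model predictions (0 = fake news, 1 = real news)
y_true = [0, 1, 1, 0, 1]
y_pred = [0, 1, 1, 0, 0]

accuracy = accuracy_score(y_true, y_pred)
precision, recall, f1, _ = precision_recall_fscore_support(y_true, y_pred, average="weighted")
print(f"accuracy={accuracy:.3f} precision={precision:.3f} recall={recall:.3f} f1={f1:.3f}")
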
Labels

Fake news: 0

Real news: 1

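If you need this mapping in code, it can be written as plain Python dictionaries (a small sketch; the variable names are mine, not part of the original card):

# Label mapping as stated in this card
id2label = {0: "Fake news", 1: "Real news"}
label2id = {label: idx for idx, label in id2label.items()}
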
Using this model in your code

To use this model, first download it from the Hugging Face Hub:


import torch.nn as nn
import transformers
from transformers import AutoConfig, AutoModel, AutoTokenizer

MODEL_NAME = "bert-base-uncased"  # base checkpoint used for the config and tokenizer

class Fake_Real_Model_Arch_test(transformers.PreTrainedModel):
    def __init__(self, bert):
        super().__init__(config=AutoConfig.from_pretrained(MODEL_NAME))

        self.bert = bert
        num_classes = 2       # number of target classes (fake / real)
        embedding_dim = 768   # hidden size of bert-base-uncased
        self.fc1 = nn.Linear(embedding_dim, num_classes)
        self.softmax = nn.Softmax(dim=1)

    def forward(self, text_id, text_mask):
        outputs = self.bert(text_id, attention_mask=text_mask)
        pooled = outputs[1]  # pooled [CLS] representation
        logits = self.fc1(pooled)
        return self.softmax(logits)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = Fake_Real_Model_Arch_test(AutoModel.from_pretrained("rematchka/Bert_fake_news_detection"))
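
Once the model and tokenizer are loaded, inference could look like the minimal sketch below. The example headline is made up, and the sketch assumes the fine-tuned classification head has been restored into the wrapper above (this card does not show that step); the id2label dictionary repeats the mapping from the Labels section.

import torch

id2label = {0: "Fake news", 1: "Real news"}  # label mapping from the Labels section

text = "Scientists confirm the moon is made of cheese."  # hypothetical example headline
encoded = tokenizer(text, return_tensors="pt", truncation=True, padding=True)

model.eval()
with torch.no_grad():
    probs = model(encoded["input_ids"], encoded["attention_mask"])  # softmax probabilities, shape (1, 2)

pred = probs.argmax(dim=1).item()
print(id2label[pred], float(probs[0, pred]))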