## Model Details
This model is a binary classification model fine-tuned on the Fake and Real News Dataset using the BERT (bert-base-cased) architecture. Its task is to classify news articles as reliable or unreliable, making it suitable for fake news detection. BERT (Bidirectional Encoder Representations from Transformers) is a transformer-based model known for its effectiveness on natural language processing tasks.

The model takes the title of a news article together with the full article text, concatenated with simple marker strings, and classifies the result as Reliable or Unreliable:
```python
text_input = "<title>" + headline + "<content>" + article + "<end>"
```
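For concreteness, with made-up placeholder values the assembled input looks like this (the headline and article strings below are illustrative only):
```python
headline = "Scientists discover water on Mars"
article = "In a press briefing today, researchers announced that..."

text_input = "<title>" + headline + "<content>" + article + "<end>"
# -> '<title>Scientists discover water on Mars<content>In a press briefing today, researchers announced that...<end>'
```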
**Bias:** The model may inherit biases present in its training data, so predictions should be interpreted with that in mind.

## Code Implementation
```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Load the tokenizer and model from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained(
    "Arjun24420/FakeNews-BERT-base-cased")
model = AutoModelForSequenceClassification.from_pretrained(
    "Arjun24420/FakeNews-BERT-base-cased")

# Define class labels mapping (must exist before predict() is called)
class_mapping = {
    1: 'Reliable',
    0: 'Unreliable',
}

def predict(text):
    # Tokenize the input text, truncating to BERT's 512-token limit
    inputs = tokenizer(text, padding=True, truncation=True,
                       max_length=512, return_tensors="pt")

    # Get model output (logits) without tracking gradients
    with torch.no_grad():
        outputs = model(**inputs)

    # Convert logits to probabilities and map each class index to its label
    probs = outputs.logits.softmax(dim=1)
    class_probabilities = {class_mapping[i]: probs[0, i].item()
                           for i in range(probs.shape[1])}

    return class_probabilities

# `headline` and `article` hold the raw title and body text of the article
text = "<title>" + headline + "<content>" + article + "<end>"
predict(text)
```
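The snippet above runs on CPU. If a CUDA GPU is available, inference can be moved to it; the sketch below assumes PyTorch with CUDA, and `predict_on_device` is a hypothetical helper name introduced here for illustration.
```python
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model.to(device)  # move the model's weights to the GPU (or keep on CPU)

def predict_on_device(text):
    inputs = tokenizer(text, padding=True, truncation=True,
                       max_length=512, return_tensors="pt")
    # Move every input tensor to the same device as the model
    inputs = {k: v.to(device) for k, v in inputs.items()}
    with torch.no_grad():
        outputs = model(**inputs)
    probs = outputs.logits.softmax(dim=1)
    return {class_mapping[i]: probs[0, i].item()
            for i in range(probs.shape[1])}
```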