---
language: en
---

# bert-base-cased for Advertisement Classification

This is a bert-base-cased model trained on a binary dataset prepared for advertisement classification. The model is suitable for English text.

Labels: 0 -> non-advertisement; 1 -> advertisement

## Example of classification

```python
from transformers import AutoModelForSequenceClassification
from transformers import AutoTokenizer
import numpy as np
from scipy.special import softmax

# Load this model and its tokenizer (replace the placeholder with this repo's id on the Hub)
model_name = "path-or-repo-id-of-this-model"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name).to('cuda')

text = 'Young Brad Pitt early in his career McDonalds Commercial'
encoded_input = tokenizer(text, return_tensors='pt').to('cuda')
output = model(**encoded_input)
scores = output[0][0].detach().to('cpu').numpy()
scores = softmax(scores)
prediction_class = np.argmax(scores)
print(prediction_class)
```

Output:
```
1
```
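
The numeric prediction can also be mapped back to a readable label. Below is a minimal sketch that continues from the example above (it reuses the `scores` and `prediction_class` variables) and assumes the 0/1 label convention documented in this card; the `id2label` dictionary is defined here for illustration, not taken from the model config.

```python
# Map the 0/1 convention documented above to readable label names (assumed mapping)
id2label = {0: "non-advertisement", 1: "advertisement"}

label = id2label[int(prediction_class)]
confidence = scores[prediction_class]
print(f"{label} (confidence: {confidence:.3f})")
```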