---
inference: false
language: pt
datasets:
- ruanchaves/hatebr
---

# BERTimbau base for Offensive Language Detection

This is the [neuralmind/bert-base-portuguese-cased](https://huggingface.co/neuralmind/bert-base-portuguese-cased) model fine-tuned for Offensive Language Detection with the [HateBR](https://huggingface.co/datasets/ruanchaves/hatebr) dataset. This model is suitable for Portuguese.

- Git Repo: [Evaluation of Portuguese Language Models](https://github.com/ruanchaves/eplm).
- Demo: [Hugging Face Space: Offensive Language Detection](https://ruanchaves-portuguese-offensive-language-de-d4d0507.hf.space)

### **Labels**:

* 0 : The text is not offensive.
* 1 : The text is offensive.

## Full classification example

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer, AutoConfig
import numpy as np
import torch
from scipy.special import softmax

model_name = "ruanchaves/bert-base-portuguese-cased-hatebr"
s1 = "Quem não deve não teme!!"

# Load the fine-tuned model, its tokenizer, and its config (for the id2label mapping)
model = AutoModelForSequenceClassification.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)
config = AutoConfig.from_pretrained(model_name)

# Tokenize the input sentence and run a forward pass without gradient tracking
model_input = tokenizer([s1], padding=True, return_tensors="pt")
with torch.no_grad():
    output = model(**model_input)

# Convert the logits to probabilities and print the labels ranked by score
scores = softmax(output.logits[0].numpy())
ranking = np.argsort(scores)[::-1]
for i in range(scores.shape[0]):
    l = config.id2label[ranking[i]]
    s = scores[ranking[i]]
    print(f"{i+1}) Label: {l} Score: {np.round(float(s), 4)}")
```

## Citation

Our research is ongoing, and we are currently working on describing our experiments in a paper, which will be published soon. In the meantime, if you would like to cite our work or models before the publication of the paper, please cite our [GitHub repository](https://github.com/ruanchaves/eplm):

```
@software{Chaves_Rodrigues_eplm_2023,
  author = {Chaves Rodrigues, Ruan and Tanti, Marc and Agerri, Rodrigo},
  doi = {10.5281/zenodo.7781848},
  month = {3},
  title = {{Evaluation of Portuguese Language Models}},
  url = {https://github.com/ruanchaves/eplm},
  version = {1.0.0},
  year = {2023}
}
```
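
## Quick usage with the pipeline API

For a quicker start, the same checkpoint can also be loaded through the `transformers` `pipeline` helper, which bundles tokenization, inference, and label mapping into a single call. A minimal sketch (the exact label strings in the output depend on the model's `id2label` configuration):

```python
from transformers import pipeline

# Load the fine-tuned checkpoint into a ready-made text classification pipeline
classifier = pipeline("text-classification", model="ruanchaves/bert-base-portuguese-cased-hatebr")

# Returns a list with the top label and its probability for each input string
print(classifier("Quem não deve não teme!!"))
# e.g. [{'label': ..., 'score': ...}]  (label names come from the model config)
```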