API endpoint

```shell
curl -X POST \
    -H "Authorization: Bearer YOUR_ORG_OR_USER_API_TOKEN" \
    -H "Content-Type: application/json" \
    -d '"json encoded string"' \
    https://api-inference.huggingface.co/models/Hate-speech-CNERG/dehatebert-mono-spanish
```
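The same request can be made from Python. A minimal sketch, assuming the `requests` library is installed; `YOUR_ORG_OR_USER_API_TOKEN` is a placeholder, not a real token:

```python
import requests

# Inference API endpoint for this model
API_URL = "https://api-inference.huggingface.co/models/Hate-speech-CNERG/dehatebert-mono-spanish"
# Placeholder token: substitute your own API token
headers = {"Authorization": "Bearer YOUR_ORG_OR_USER_API_TOKEN"}

def query(payload):
    """POST a JSON-encoded input to the Inference API and return the parsed response."""
    response = requests.post(API_URL, headers=headers, json=payload)
    return response.json()

# Example call (requires a valid token and network access):
# result = query("Texto de ejemplo")
```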

Contributed by the Hate-ALERT group.

How to use this model directly from the 🤗/transformers library:

			
```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("Hate-speech-CNERG/dehatebert-mono-spanish")

model = AutoModelForSequenceClassification.from_pretrained("Hate-speech-CNERG/dehatebert-mono-spanish")
```
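The sequence-classification head returns raw logits; a softmax converts them into class probabilities. A minimal sketch in plain Python (the two-class logit values below are made up for illustration, and the label order is an assumption, so check `model.config.id2label` for the real mapping):

```python
import math

def softmax(logits):
    """Convert a list of raw logits into probabilities that sum to 1."""
    exps = [math.exp(x - max(logits)) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits from the classification head for one input sentence
logits = [1.2, -0.3]
probs = softmax(logits)
predicted = probs.index(max(probs))  # index of the most likely class
```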

This model is used for detecting hate speech in Spanish. The "mono" in the name refers to the monolingual setting, where the model is trained using only Spanish-language data. It is fine-tuned from the multilingual BERT model. The model was trained with different learning rates, and the best validation score achieved is 0.740287, for a learning rate of 3e-5. Training code can be found at this url.
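The learning-rate sweep described above amounts to keeping the run with the highest validation score. A toy sketch of that selection step (only the 0.740287 score at 3e-5 comes from this card; the other entries are hypothetical placeholders):

```python
# Validation score per learning rate; only the 3e-5 entry is from the model card,
# the others are hypothetical placeholders for illustration.
val_scores = {2e-5: 0.731, 3e-5: 0.740287, 5e-5: 0.722}

# Pick the learning rate whose run achieved the best validation score
best_lr = max(val_scores, key=val_scores.get)
best_score = val_scores[best_lr]
```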

For more details, see our paper:

Sai Saketh Aluru, Binny Mathew, Punyajoy Saha and Animesh Mukherjee. "Deep Learning Models for Multilingual Hate Speech Detection". Accepted at ECML-PKDD 2020.

Please cite our paper in any published work that uses any of these resources.

```bibtex
@article{aluru2020deep,
  title={Deep Learning Models for Multilingual Hate Speech Detection},
  author={Aluru, Sai Saket and Mathew, Binny and Saha, Punyajoy and Mukherjee, Animesh},
  journal={arXiv preprint arXiv:2004.06465},
  year={2020}
}
```