Multilingual Sentiment Analysis Model

This repository contains a sentiment analysis model fine-tuned from the BERT base multilingual model. It was trained on a dataset of texts labeled 'negative', 'neutral', or 'positive'.

Model Description

The model is built with the Hugging Face Transformers library on top of the 'bert-base-multilingual-uncased' pre-trained checkpoint and fine-tuned for sentiment analysis, so it classifies input texts into one of three sentiment categories: negative, neutral, or positive.

Usage

To use this model, you can load it from the Hugging Face Model Hub using the following code:

from transformers import AutoModelForSequenceClassification, AutoTokenizer, pipeline

# Load the fine-tuned model and its tokenizer from the Hugging Face Model Hub.
model_name = "Akazi/bert-base-multilingual-uncased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)

# Build a text-classification pipeline and classify a batch of texts.
classifier = pipeline("text-classification", model=model, tokenizer=tokenizer)
texts = ["This is a great product!", "I'm not sure about this."]
predictions = classifier(texts)
print(predictions)

This code loads the model and tokenizer from the Hugging Face Model Hub, creates a text classification pipeline, and uses it to classify the provided texts. The output is a list of dictionaries, each containing the predicted label and score for the corresponding input text.
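
The exact label strings depend on the id2label mapping stored in the model's configuration; if that mapping was left at its defaults during fine-tuning, the pipeline may return generic names such as LABEL_0, LABEL_1, and LABEL_2. The snippet below, with purely illustrative scores, shows the expected output shape and a hypothetical mapping back to readable labels, assuming the 0/1/2 order described under Training Data:

# Illustrative output shape (scores are made up, not real model outputs):
# [{'label': 'LABEL_2', 'score': 0.97}, {'label': 'LABEL_1', 'score': 0.61}]

# Hypothetical mapping, assuming 0 = negative, 1 = neutral, 2 = positive:
label_map = {"LABEL_0": "negative", "LABEL_1": "neutral", "LABEL_2": "positive"}
readable = [
    {"label": label_map.get(p["label"], p["label"]), "score": p["score"]}
    for p in predictions
]
print(readable)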

Model Performance

The model was evaluated on a held-out test dataset, and the following metrics were obtained:

  • Accuracy: 0.7061
  • Precision: 0.7038
  • Recall: 0.7040
  • F1-score: 0.7038

Please note that the performance may vary depending on the input data and domain.
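
For reference, here is a minimal sketch of how such metrics can be computed with scikit-learn. The y_true and y_pred arrays are hypothetical numeric labels (0/1/2), and the macro averaging is an assumption, since the card does not state which averaging was used:

from sklearn.metrics import accuracy_score, precision_recall_fscore_support

y_true = [0, 1, 2, 2, 1]   # hypothetical gold labels
y_pred = [0, 1, 2, 1, 1]   # hypothetical model predictions

accuracy = accuracy_score(y_true, y_pred)
precision, recall, f1, _ = precision_recall_fscore_support(
    y_true, y_pred, average="macro"
)
print(f"Accuracy: {accuracy:.4f}, Precision: {precision:.4f}, "
      f"Recall: {recall:.4f}, F1: {f1:.4f}")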

Training Data

The model was trained on a dataset consisting of text samples labeled as 'negative', 'neutral', or 'positive'. The dataset was preprocessed by mapping the textual labels to numeric values (0 for 'negative', 1 for 'neutral', and 2 for 'positive').
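
As an illustration of that preprocessing step, here is a minimal sketch using the Hugging Face datasets library. The example texts are hypothetical, since the actual training data is not published with this card:

from datasets import Dataset

# Hypothetical raw examples standing in for the real training data.
raw = Dataset.from_dict({
    "text": ["Terrible service.", "It was okay.", "Absolutely loved it!"],
    "label": ["negative", "neutral", "positive"],
})

# Map textual labels to the numeric values used during training.
label2id = {"negative": 0, "neutral": 1, "positive": 2}
dataset = raw.map(lambda example: {"label": label2id[example["label"]]})
print(dataset[0])  # {'text': 'Terrible service.', 'label': 0}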

License

This model is licensed under the MIT License.

Credits

This model was developed by Abdullah Kazi using the Hugging Face Transformers library and the BERT base multilingual model.