
Chakshu/conversation_terminator_classifier

This model is a fine-tuned version of google/mobilebert-uncased on an unknown dataset. It achieves the following results at the end of training:

  • Train Loss: 0.0364
  • Train Binary Accuracy: 0.9915
  • Epoch: 8

Example Usage

from transformers import BertTokenizer, TFBertForSequenceClassification
import tensorflow as tf

model_name = 'Chakshu/conversation_terminator_classifier'

tokenizer = BertTokenizer.from_pretrained(model_name)
model = TFBertForSequenceClassification.from_pretrained(model_name)

# Tokenize the last user message and run it through the classifier
inputs = tokenizer("I will talk to you later", return_tensors="tf", padding=True)
outputs = model(inputs.input_ids, inputs.attention_mask)

# The model emits a single logit; sigmoid maps it to the probability that the conversation has ended
probabilities = tf.nn.sigmoid(outputs.logits)

# Round the probability to the nearest integer to get the class prediction (1 = ended, 0 = not ended)
predicted_class = tf.round(probabilities)
print("The last message by the user indicates that the conversation has",
      "'ENDED'" if int(predicted_class.numpy()) == 1 else "'NOT ENDED'")

Model description

Classifies whether the user's most recent message signals that they are ending the conversation or want to continue it.

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • optimizer: {'name': 'Adam', 'weight_decay': None, 'clipnorm': None, 'global_clipnorm': None, 'clipvalue': None, 'use_ema': False, 'ema_momentum': 0.99, 'ema_overwrite_frequency': None, 'jit_compile': False, 'is_legacy_optimizer': False, 'learning_rate': 2e-05, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
  • training_precision: float32
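
For reference, a minimal sketch of how the optimizer settings above map onto tf.keras. The loss and metric choices (binary cross-entropy over the sigmoid logit, binary accuracy) are assumptions inferred from the reported Train Loss and Train Binary Accuracy, not taken from the original training script:

import tensorflow as tf
from transformers import TFBertForSequenceClassification

# Assumes the model is loaded as in the usage example above
model = TFBertForSequenceClassification.from_pretrained('Chakshu/conversation_terminator_classifier')

# Adam with the hyperparameters listed above
optimizer = tf.keras.optimizers.Adam(
    learning_rate=2e-5,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-7,
    amsgrad=False,
)

# Assumed compile call: binary cross-entropy on the raw logit,
# matching the reported binary-accuracy metric
model.compile(
    optimizer=optimizer,
    loss=tf.keras.losses.BinaryCrossentropy(from_logits=True),
    metrics=[tf.keras.metrics.BinaryAccuracy()],
)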

Training results

| Train Loss | Train Binary Accuracy | Epoch |
|:----------:|:---------------------:|:-----:|
| 0.2552     | 0.9444                | 0     |
| 0.1295     | 0.9872                | 1     |
| 0.0707     | 0.9872                | 2     |
| 0.0859     | 0.9829                | 3     |
| 0.0484     | 0.9872                | 4     |
| 0.0363     | 0.9957                | 5     |
| 0.0209     | 1.0                   | 6     |
| 0.0268     | 0.9957                | 7     |
| 0.0364     | 0.9915                | 8     |

Framework versions

  • Transformers 4.28.0
  • TensorFlow 2.12.0
  • Datasets 2.12.0
  • Tokenizers 0.13.3
