# Model Card

## Bank ACTION Classifier - DistilBERT

Developed by: Richard Chai, https://www.linkedin.com/in/richardchai/
This model has been fine-tuned for Bank User Action/Intent Identification. Currently, it identifies the following actions: ['access', 'activate', 'apply', 'block', 'cancel', 'close', 'deposit', 'dispute', 'earn', 'exchange', 'find', 'inquire', 'link', 'open', 'pay', 'receive', 'redeem', 'refund', 'renew', 'report', 'reset', 'retrieve', 'schedule', 'select', 'transfer', 'unblock', 'unknown', 'unlink', 'update', 'verify', 'withdraw']
## Model Details
- Model type: Transformer-based (DistilBERT)
- Dataset: a bank user action/intent dataset (not specified in this card)
- Fine-tuning: The model was fine-tuned for X epochs using a learning rate of Y on a dataset with Z samples.
## Usage
You can use this model to classify the action/intent behind a banking-related query as follows:
```python
import torch
from transformers import pipeline

# Use the GPU if one is available, otherwise fall back to CPU
device = 0 if torch.cuda.is_available() else -1

# Load the fine-tuned checkpoint from the Hugging Face Hub
model_checkpt = "richardchai/plp_action_clr_distilbert"
clf = pipeline("text-classification", model=model_checkpt, device=device)

# Pass a list of strings to classify several queries in one call
result = clf([
    "please tell me more about your fixed deposit.",
    "I want to deposit money into my savings account.",
])
print(result)  # a list of {'label': ..., 'score': ...} dicts, one per input
```
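If you prefer to work with the tokenizer and model objects directly (for example, to control batching yourself), the minimal sketch below loads the same checkpoint with the generic Auto classes. It assumes the checkpoint `richardchai/plp_action_clr_distilbert` ships the standard sequence-classification head and an `id2label` mapping for the action names listed above.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_checkpt = "richardchai/plp_action_clr_distilbert"  # checkpoint from this card
tokenizer = AutoTokenizer.from_pretrained(model_checkpt)
model = AutoModelForSequenceClassification.from_pretrained(model_checkpt)
model.eval()

texts = [
    "please tell me more about your fixed deposit.",
    "I want to deposit money into my savings account.",
]

# Tokenize the batch and run a forward pass without gradient tracking
inputs = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Map the highest-scoring logit to its action name via the config's id2label
pred_ids = logits.argmax(dim=-1)
for text, pred_id in zip(texts, pred_ids):
    print(text, "->", model.config.id2label[pred_id.item()])
```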