Tags: Text Classification · Transformers · TensorBoard · Safetensors · distilbert · Generated from Trainer · Eval Results (legacy) · text-embeddings-inference
Instructions to use aisuko/ft-distilbert-base-uncased with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use aisuko/ft-distilbert-base-uncased with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="aisuko/ft-distilbert-base-uncased")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("aisuko/ft-distilbert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained("aisuko/ft-distilbert-base-uncased")
```

- Notebooks
- Google Colab
- Kaggle
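Once the pipeline is created, it can be called directly on a string or a list of strings and returns predicted labels with confidence scores. A minimal sketch is below; the example sentences are illustrative, and the actual label names (e.g. `LABEL_0`, `POSITIVE`) depend on how the model was fine-tuned, so they are not guaranteed here.

```python
from transformers import pipeline

# Load the fine-tuned classifier from the Hugging Face Hub
# (downloads the weights on first use).
pipe = pipeline("text-classification", model="aisuko/ft-distilbert-base-uncased")

# Classify a batch of example sentences; each result is a dict
# with a "label" and a "score" (softmax probability).
texts = ["I loved this movie.", "The service was terrible."]
results = pipe(texts)
for text, result in zip(texts, results):
    print(f"{text!r} -> {result['label']} ({result['score']:.3f})")
```

Passing `top_k=None` to the pipeline call returns the scores for all labels instead of only the highest-scoring one.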