Instructions for using textattack/distilbert-base-uncased-MRPC with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use textattack/distilbert-base-uncased-MRPC with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="textattack/distilbert-base-uncased-MRPC")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("textattack/distilbert-base-uncased-MRPC")
model = AutoModelForSequenceClassification.from_pretrained("textattack/distilbert-base-uncased-MRPC")
```
- Notebooks
- Google Colab
- Kaggle
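
Since this checkpoint is fine-tuned on MRPC (a sentence-pair paraphrase task), inference means encoding two sentences together and classifying the pair. The sketch below shows this end to end; the example sentences are made up, and the label convention (0 = not equivalent, 1 = equivalent) is assumed from the standard GLUE MRPC setup rather than stated on this page.

```python
# Minimal paraphrase-detection sketch for textattack/distilbert-base-uncased-MRPC.
# Assumption: labels follow the GLUE MRPC convention (0 = not equivalent, 1 = equivalent).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

name = "textattack/distilbert-base-uncased-MRPC"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name)

s1 = "The company said quarterly profit rose 10 percent."
s2 = "Quarterly profit increased by 10 percent, the company said."

# MRPC is a sentence-pair task: pass both sentences so they are encoded jointly.
inputs = tokenizer(s1, s2, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, 2) — one score per class

pred = logits.argmax(dim=-1).item()
print(pred)
```

The same pair can also be scored through the pipeline by passing a dict with `text` and `text_pair` keys, which handles the joint encoding for you.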