Instructions to use Sayan01/mrpc-Bert with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use Sayan01/mrpc-Bert with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="Sayan01/mrpc-Bert")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("Sayan01/mrpc-Bert")
model = AutoModelForSequenceClassification.from_pretrained("Sayan01/mrpc-Bert")
```

- Notebooks
- Google Colab
- Kaggle
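When loaded directly, `AutoModelForSequenceClassification` returns raw logits; a softmax turns them into class probabilities. A minimal sketch with dummy logits standing in for `model(**inputs).logits` on one sentence pair (treating the head as binary, not-paraphrase vs. paraphrase, is an assumption about this MRPC checkpoint):

```python
import torch

# Dummy logits standing in for model(**inputs).logits on one sentence pair.
# Assumed label order for this MRPC head: 0 = not paraphrase, 1 = paraphrase.
logits = torch.tensor([[-1.2, 2.3]])

# Softmax over the class dimension gives probabilities that sum to 1.
probs = torch.softmax(logits, dim=-1)
pred = int(probs.argmax(dim=-1))
print(pred)  # 1
```

The pipeline helper performs this post-processing for you and returns a label/score dict instead of raw logits.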
Training in progress, epoch 9
logs/events.out.tfevents.1692605297.48f07d246700 CHANGED

```diff
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:becb8e7b9b21dbbcd8c6f45d3f244819bc2cfc5fdc5240fae4ceb1a6da9d6991
+size 8298
```
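Each side of the diff above is a git-lfs pointer file: three `key value` lines that stand in for the large binary payload, which is stored separately. A minimal sketch of reading one such pointer (the text is taken from the new version above):

```python
# A git-lfs pointer file replaces the large binary with a short manifest.
pointer = (
    "version https://git-lfs.github.com/spec/v1\n"
    "oid sha256:becb8e7b9b21dbbcd8c6f45d3f244819bc2cfc5fdc5240fae4ceb1a6da9d6991\n"
    "size 8298\n"
)

# Each line is a "key value" pair; split on the first space only.
fields = dict(line.split(" ", 1) for line in pointer.strip().splitlines())
print(fields["size"])  # 8298
```

Because only this small manifest is versioned in git, a commit diff on a checkpoint shows the oid (content hash) and size changing rather than the binary itself.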
pytorch_model.bin CHANGED

```diff
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:b5ade26c03228cf0bedb0c88d20624fe500f65aab7729ed4ba528d6623db0b8c
 size 438003505
```