Instructions for using theta/deeper with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use theta/deeper with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="theta/deeper")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("theta/deeper")
model = AutoModelForSequenceClassification.from_pretrained("theta/deeper")
```

- Notebooks
- Google Colab
- Kaggle
Training in progress, step 260
pytorch_model.bin CHANGED

```diff
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:c4b1ee792d7cbccb3308e5ebd4b6a2ed175a868665add6903b08540157f83c98
 size 409146485
```
runs/Jan31_12-29-03_836080a9568c/events.out.tfevents.1675168148.836080a9568c.1543.6 CHANGED

```diff
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:7ff5e71b886b444087ee828dec15018e607476dd4803fac3481b78e86a9fc9ea
+size 9442
```
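The changed files in this commit are Git LFS pointer files: three-line stubs recording the spec version, the SHA-256 object ID, and the byte size of the real binary stored in LFS. A minimal sketch of reading such a pointer (the `parse_lfs_pointer` helper is hypothetical, for illustration only; it uses the pointer contents shown in the diff above):

```python
def parse_lfs_pointer(text):
    """Parse a Git LFS pointer file into a dict of its key/value lines."""
    fields = {}
    for line in text.strip().splitlines():
        # Each line is "<key> <value>", e.g. "size 409146485".
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

# Pointer contents taken from the pytorch_model.bin diff above.
pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:c4b1ee792d7cbccb3308e5ebd4b6a2ed175a868665add6903b08540157f83c98
size 409146485
"""

info = parse_lfs_pointer(pointer)
print(info["size"])  # → 409146485
print(info["oid"])   # → sha256:c4b1ee79…
```

This is why the diff shows only an `oid` and `size` changing: the repository tracks the pointer, while the 409 MB checkpoint itself lives in LFS storage.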