Text Classification
Transformers
Safetensors
English
emcoder
feature-extraction
emotion-recognition
bayesian-deep-learning
mc-dropout
uncertainty-quantification
multi-label-classification
custom_code
Eval Results (legacy)
Instructions for using yezdata/EmCoder with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use yezdata/EmCoder with Transformers:
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="yezdata/EmCoder", trust_remote_code=True)

# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("yezdata/EmCoder", trust_remote_code=True, dtype="auto")

- Notebooks
- Google Colab
- Kaggle
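The repository tags advertise MC-dropout uncertainty quantification for multi-label emotion recognition, so a natural extension of the loading snippet above is to keep the dropout layers active at inference time and aggregate several stochastic forward passes. The sketch below is an illustration only: the example sentence, the number of samples, and the assumption that the remote-code model exposes per-label logits via `.logits` are not part of the published interface.

```python
# Hedged MC-dropout sketch; the custom model's exact output format is an assumption.
import torch
from transformers import AutoModel, AutoTokenizer

model_id = "yezdata/EmCoder"
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModel.from_pretrained(model_id, trust_remote_code=True, dtype="auto")

# MC dropout: put the model in eval mode, then switch only the dropout
# modules back to train mode so they keep sampling masks at inference.
model.eval()
for module in model.modules():
    if isinstance(module, torch.nn.Dropout):
        module.train()

inputs = tokenizer("I can't believe you remembered my birthday!", return_tensors="pt")

# Several stochastic forward passes; each pass samples a different dropout mask.
num_samples = 20
probs = []
with torch.no_grad():
    for _ in range(num_samples):
        logits = model(**inputs).logits          # assumes the custom head returns .logits
        probs.append(torch.sigmoid(logits))      # multi-label: independent sigmoid per emotion

probs = torch.stack(probs)        # (num_samples, 1, num_labels)
mean_probs = probs.mean(dim=0)    # predictive mean per label
uncertainty = probs.std(dim=0)    # per-label spread across dropout samples

print("mean probabilities:", mean_probs.squeeze(0))
print("per-label std (uncertainty):", uncertainty.squeeze(0))
```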
Update train_config.json
- train_config.json +11 -9
train_config.json
CHANGED
@@ -1,10 +1,12 @@
(the nine removed lines are truncated in this view; only the added lines are recoverable)
+{
+    "bayesian_train": true,
+    "loss_weights": "log",
+    "tokenized_ds_dir": "data/goemotions_v1_seq512",
+    "encoder_lr": 0.00001,
+    "head_lr": 0.0005,
+    "lr_warmup": 0.05,
+    "weight_decay": 0.01,
+    "batch_size": 32,
+    "gradient_accumulation_steps": 8,
+    "num_epochs": 10
 }
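For context, a rough reading of how these values could drive a fine-tuning run is sketched below. This is not the repository's training code: the attribute names `model.encoder` and `model.classifier`, the treatment of `lr_warmup` as a fraction of total optimizer steps, and the assumption that the custom model returns a `.loss` are guesses made for illustration; `"loss_weights": "log"` presumably selects log-scaled class weights for the imbalanced emotion labels and is not shown here.

```python
# Hypothetical training sketch driven by train_config.json; names marked
# "assumed" are not taken from the repository.
import json

from datasets import load_from_disk
from torch.optim import AdamW
from torch.utils.data import DataLoader
from transformers import AutoModel, get_linear_schedule_with_warmup

with open("train_config.json") as f:
    cfg = json.load(f)

# Pre-tokenized GoEmotions data referenced by "tokenized_ds_dir"
# (assumed to be a saved DatasetDict with a "train" split).
train_ds = load_from_disk(cfg["tokenized_ds_dir"])["train"]
train_ds.set_format("torch")
train_loader = DataLoader(train_ds, batch_size=cfg["batch_size"], shuffle=True)

model = AutoModel.from_pretrained("yezdata/EmCoder", trust_remote_code=True)

# Discriminative learning rates: small for the pretrained encoder, larger for
# the freshly initialised classification head (attribute names are assumed).
optimizer = AdamW(
    [
        {"params": model.encoder.parameters(), "lr": cfg["encoder_lr"]},
        {"params": model.classifier.parameters(), "lr": cfg["head_lr"]},
    ],
    weight_decay=cfg["weight_decay"],
)

# "lr_warmup" read as a fraction of total optimizer steps.
steps_per_epoch = len(train_loader) // cfg["gradient_accumulation_steps"]
total_steps = steps_per_epoch * cfg["num_epochs"]
scheduler = get_linear_schedule_with_warmup(
    optimizer,
    num_warmup_steps=int(cfg["lr_warmup"] * total_steps),
    num_training_steps=total_steps,
)

model.train()
for epoch in range(cfg["num_epochs"]):
    for step, batch in enumerate(train_loader):
        # Assumes the remote-code model computes a multi-label loss internally.
        loss = model(**batch).loss / cfg["gradient_accumulation_steps"]
        loss.backward()
        if (step + 1) % cfg["gradient_accumulation_steps"] == 0:
            optimizer.step()
            scheduler.step()
            optimizer.zero_grad()
```

With `batch_size: 32` and `gradient_accumulation_steps: 8`, the effective batch size in this reading would be 256 examples per optimizer step.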