gosshh/output
Text Classification · Transformers · TensorBoard · Safetensors · Habana · bert · Generated from Trainer · Inference Endpoints
License: apache-2.0
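The tags above describe a BERT-based text-classification checkpoint stored in Safetensors format. A minimal inference sketch, assuming the repo id gosshh/output and the standard transformers pipeline API (the label names come from the model's config.json):

```python
# Minimal sketch: load this checkpoint for text classification with transformers.
# Assumes the repo id "gosshh/output" and that config.json defines the label mapping.
from transformers import pipeline

classifier = pipeline("text-classification", model="gosshh/output")
print(classifier("An example sentence to classify."))
```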
output · 1 contributor · History: 16 commits
Latest commit: gosshh · Training in progress, step 3500 · b57095a (verified) · 4 months ago
runs/ · Training in progress, step 500 · 4 months ago
.gitattributes · 1.52 kB · initial commit · 4 months ago
README.md · 1.97 kB · Model save · 4 months ago
config.json · 859 Bytes · Training in progress, step 500 · 4 months ago
emissions.csv · 743 Bytes · Model save · 4 months ago
gaudi_config.json · 246 Bytes · Training in progress, step 500 · 4 months ago
model.safetensors · 438 MB · LFS · Training in progress, step 3500 · 4 months ago
special_tokens_map.json · 125 Bytes · Training in progress, step 500 · 4 months ago
spiece.model · 760 kB · LFS · Training in progress, step 500 · 4 months ago
tokenizer.json · 712 kB · Training in progress, step 500 · 4 months ago
tokenizer_config.json · 1.19 kB · Training in progress, step 500 · 4 months ago
training_args.bin · 4.79 kB · LFS · pickle · Training in progress, step 500 · 4 months ago
  Detected Pickle imports (9): "transformers.trainer_pt_utils.AcceleratorConfig", "torch.device", "optimum.habana.transformers.training_args.GaudiTrainingArguments", "transformers.trainer_utils.SchedulerType", "optimum.habana.accelerate.utils.dataclasses.GaudiDistributedType", "transformers.training_args.OptimizerNames", "transformers.trainer_utils.HubStrategy", "optimum.habana.accelerate.state.GaudiPartialState", "transformers.trainer_utils.IntervalStrategy" (see the loading sketch after this listing)
vocab.txt · 232 kB · Training in progress, step 500 · 4 months ago
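training_args.bin is a pickled training-arguments object rather than plain weights; the detected pickle imports above show it references GaudiTrainingArguments from optimum-habana, so that package must be importable to unpickle it. A minimal inspection sketch, assuming a local copy of the file and that you trust its source (unpickling can execute arbitrary code):

```python
# Minimal sketch: inspect the pickled training arguments saved by the Trainer.
# Requires optimum-habana installed, since the pickle references
# optimum.habana.transformers.training_args.GaudiTrainingArguments.
import torch

# weights_only=False is needed because this is an arbitrary pickled object,
# not a tensor state dict; only do this for files you trust.
args = torch.load("training_args.bin", map_location="cpu", weights_only=False)
print(args)
```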