golesheed/whisper-v2-7fold-1

Tags: Automatic Speech Recognition · Transformers · TensorBoard · Safetensors · Dutch · whisper · Generated from Trainer · Inference Endpoints
License: apache-2.0
whisper-v2-7fold-1: 1 contributor, 511 commits. This model has 1 file scanned as unsafe (training_args.bin).
Latest commit: fee78a3 (verified) by golesheed, "Training in progress, step 7770", 29 days ago.
File listing (name | size | last commit | age):

runs/ | | Training in progress, step 7770 | 29 days ago
.gitattributes | 1.52 kB | initial commit | about 2 months ago
README.md | 5.17 kB | Upload tokenizer | about 2 months ago
added_tokens.json | 34.6 kB | Upload tokenizer | about 2 months ago
config.json | 1.28 kB | Training in progress, step 15 | about 2 months ago
merges.txt | 494 kB | Upload tokenizer | about 2 months ago
model-00001-of-00002.safetensors (LFS) | 4.99 GB | Training in progress, step 7770 | 29 days ago
model-00002-of-00002.safetensors (LFS) | 1.18 GB | Training in progress, step 7770 | 29 days ago
model.safetensors.index.json | 112 kB | Training in progress, step 15 | about 2 months ago
normalizer.json | 52.7 kB | Upload tokenizer | about 2 months ago
preprocessor_config.json | 339 Bytes | Training in progress, step 15 | about 2 months ago
special_tokens_map.json | 2.19 kB | Upload tokenizer | about 2 months ago
tokenizer_config.json | 283 kB | Upload tokenizer | about 2 months ago
training_args.bin (LFS, flagged unsafe: pickle) | 5.56 kB | Training in progress, step 7770 | 29 days ago
vocab.json | 1.04 MB | Upload tokenizer | about 2 months ago

Detected pickle imports in training_args.bin (11):
"transformers.trainer_utils.IntervalStrategy",
"accelerate.utils.dataclasses.DistributedType",
"transformers.trainer_pt_utils.AcceleratorConfig",
"transformers.trainer_utils.SchedulerType",
"transformers.trainer_utils.SaveStrategy",
"transformers.training_args.OptimizerNames",
"accelerate.state.PartialState",
"transformers.training_args_seq2seq.Seq2SeqTrainingArguments",
"torch.device",
"__builtin__.getattr",
"transformers.trainer_utils.HubStrategy"
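The "unsafe: pickle" flag on training_args.bin comes from statically scanning the pickle's opcode stream for the classes it would import, without ever executing it. A minimal sketch of that kind of scan using Python's standard `pickletools` module (the function name and the toy `datetime` payload are illustrative, not Hugging Face's actual scanner, and the scan is approximate: it ignores memo reuse of strings):

```python
import datetime
import pickle
import pickletools


def detect_pickle_imports(data: bytes) -> set:
    """List the module.name globals a pickle would import, without loading it."""
    imports = set()
    strings = []  # string constants pushed so far (feeds STACK_GLOBAL)
    for opcode, arg, _pos in pickletools.genops(data):
        if opcode.name == "GLOBAL":
            # Protocols <= 3: one argument of the form "module name".
            module, name = arg.split(" ", 1)
            imports.add(f"{module}.{name}")
        elif opcode.name == "STACK_GLOBAL":
            # Protocols >= 4: module and name were pushed as two strings.
            module, name = strings[-2], strings[-1]
            imports.add(f"{module}.{name}")
        elif "UNICODE" in opcode.name or "STRING" in opcode.name:
            strings.append(arg)
    return imports


# Toy payload: a datetime.date pickles with a reference to its class.
found = detect_pickle_imports(pickle.dumps(datetime.date(2024, 1, 1)))
print(found)
```

Run against a real `training_args.bin` (a torch-saved `Seq2SeqTrainingArguments`), a scan like this is what surfaces entries such as `transformers.training_args_seq2seq.Seq2SeqTrainingArguments` and `torch.device` in the list above; any global outside an allow-list gets the file flagged.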
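The two model-0000x-of-00002.safetensors shards in the listing are tied together by model.safetensors.index.json, which maps every tensor name to the shard file that stores it. A sketch of reading such an index (the tensor names and `total_size` below are made up for illustration; the real index in this repo lists Whisper's actual parameter names):

```python
import json
from collections import defaultdict

# Toy stand-in for model.safetensors.index.json; entries are hypothetical.
index_text = json.dumps({
    "metadata": {"total_size": 6173310976},
    "weight_map": {
        "model.encoder.conv1.weight": "model-00001-of-00002.safetensors",
        "model.decoder.embed_tokens.weight": "model-00002-of-00002.safetensors",
        "proj_out.weight": "model-00002-of-00002.safetensors",
    },
})


def tensors_per_shard(text: str) -> dict:
    """Group tensor names by the shard file that holds them."""
    index = json.loads(text)
    shards = defaultdict(list)
    for tensor, shard in index["weight_map"].items():
        shards[shard].append(tensor)
    return dict(shards)


shards = tensors_per_shard(index_text)
```

A loader uses this map to open only the shards that contain the tensors it needs, which is why the 6 GB checkpoint can be split into a 4.99 GB and a 1.18 GB file without any glue logic in the model code.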