artyomboyko/whisper-small-fine_tuned-ru
Pipeline: Automatic Speech Recognition
Library: Transformers (PyTorch, Safetensors, TensorBoard)
Dataset: mozilla-foundation/common_voice_13_0
Tags: whisper, Generated from Trainer, Inference Endpoints
License: apache-2.0
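The tags above mark this as a Whisper automatic-speech-recognition checkpoint served through the `transformers` library. A minimal sketch (not taken from the model card) of transcribing audio with it; `transformers` with a torch backend is assumed to be installed, and the audio path is a placeholder:

```python
# Minimal sketch, assuming `transformers` (with a torch backend) is
# installed. The audio path used below is a placeholder, not a file
# from this repository.
MODEL_ID = "artyomboyko/whisper-small-fine_tuned-ru"


def transcribe(audio_path: str) -> str:
    # Imported lazily so the sketch can be read and imported even
    # without transformers installed.
    from transformers import pipeline

    # Downloads the checkpoint from the Hub on first use.
    asr = pipeline("automatic-speech-recognition", model=MODEL_ID)
    return asr(audio_path)["text"]
```

Calling `transcribe("speech.wav")` would fetch the ~967 MB weights on first use, so a long-lived process should build the pipeline once and reuse it.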
3 contributors, 48 commits. Latest commit: 5697a7f "Training in progress, step 1000" by artyomboyko, 10 months ago.
| File | Size | Last commit message | Age |
| --- | --- | --- | --- |
| runs/ | | Training in progress, step 1000 | 10 months ago |
| .gitattributes | 1.52 kB | initial commit | 10 months ago |
| .gitignore | 13 Bytes | Training in progress, step 500 | 10 months ago |
| README.md | 1.97 kB | Update README.md | 10 months ago |
| added_tokens.json | 2.08 kB | Upload tokenizer | 10 months ago |
| config.json | 1.31 kB | Training in progress, step 500 | 10 months ago |
| generation_config.json | 3.83 kB | Upload WhisperForConditionalGeneration | 10 months ago |
| merges.txt | 494 kB | Upload tokenizer | 10 months ago |
| model.safetensors | 967 MB (LFS) | Adding `safetensors` variant of this model (#1) | 10 months ago |
| normalizer.json | 52.7 kB | Upload tokenizer | 10 months ago |
| preprocessor_config.json | 339 Bytes | Upload processor | 10 months ago |
| pytorch_model.bin | 967 MB (LFS, pickle) | Training in progress, step 1000 | 10 months ago |
| special_tokens_map.json | 2.08 kB | Upload tokenizer | 10 months ago |
| tokenizer_config.json | 805 Bytes | Upload tokenizer | 10 months ago |
| training_args.bin | 4.09 kB (LFS, pickle) | Training in progress, step 500 | 10 months ago |
| vocab.json | 1.04 MB | Upload tokenizer | 10 months ago |

Detected pickle imports:

- pytorch_model.bin (3): `torch.FloatStorage`, `collections.OrderedDict`, `torch._utils._rebuild_tensor_v2`
- training_args.bin (8): `transformers.training_args_seq2seq.Seq2SeqTrainingArguments`, `transformers.trainer_utils.SchedulerType`, `torch.device`, `transformers.training_args.OptimizerNames`, `transformers.trainer_utils.IntervalStrategy`, `accelerate.state.PartialState`, `accelerate.utils.dataclasses.DistributedType`, `transformers.trainer_utils.HubStrategy`
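The "Detected pickle imports" lists above come from statically scanning the pickle opcode stream for the global references a file would import when unpickled; this is why pickle files are flagged while model.safetensors is not. A simplified, stdlib-only sketch of that kind of scan (a heuristic, not the Hub's actual scanner, which simulates the unpickler stack properly):

```python
import collections
import pickle
import pickletools


def detect_pickle_imports(data: bytes) -> set[str]:
    """Statically list the module.attr names a pickle would import,
    without ever unpickling it. Simplified heuristic for illustration."""
    imports = set()
    recent_strings = []  # crude stand-in for the unpickler's stack
    for op, arg, _pos in pickletools.genops(data):
        if op.name in ("GLOBAL", "INST"):
            # protocol <= 3: "module name" encoded in the opcode argument
            module, name = arg.split(" ", 1)
            imports.add(f"{module}.{name}")
        elif op.name in ("SHORT_BINUNICODE", "BINUNICODE", "UNICODE"):
            recent_strings.append(arg)
        elif op.name == "STACK_GLOBAL":
            # protocol 4+: module and name were pushed as the two most
            # recent string opcodes (heuristic, ignores memoization)
            imports.add(f"{recent_strings[-2]}.{recent_strings[-1]}")
    return imports


# protocol 2 uses the GLOBAL opcode, as older torch checkpoints do
data = pickle.dumps(collections.OrderedDict(a=1), protocol=2)
print(detect_pickle_imports(data))  # {'collections.OrderedDict'}
```

This is also the practical argument for preferring the model.safetensors variant above: safetensors files hold only tensor data plus a JSON header, so loading them cannot trigger arbitrary imports the way unpickling pytorch_model.bin can.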