dgkim0306/wav2vec2-large-xls-r-300m-korean-g
Automatic Speech Recognition · Transformers · TensorBoard · Safetensors · make_dataset · wav2vec2 · Generated from Trainer · Inference Endpoints · License: apache-2.0
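The tags above describe a wav2vec2 checkpoint served through the Transformers library for automatic speech recognition. A minimal sketch of loading it with the `pipeline` API follows; the repo id is taken from this page, while the audio file name is a hypothetical placeholder.

```python
# Minimal sketch, assuming the standard Transformers pipeline API; the repo id
# comes from this page, while "sample_korean.wav" is a hypothetical 16 kHz
# audio file (decoding a file path like this requires ffmpeg to be installed).
from transformers import pipeline

asr = pipeline(
    task="automatic-speech-recognition",
    model="dgkim0306/wav2vec2-large-xls-r-300m-korean-g",
)

result = asr("sample_korean.wav")
print(result["text"])  # decoded Korean transcript
```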
Commit History
Upload tokenizer · e5e10a2 (verified) · dgkim0306 committed on Jun 28
Upload tokenizer · cd5b315 (verified) · dgkim0306 committed on Jun 4
Training in progress, step 1000 · f684c1e (verified) · dgkim0306 committed on Jun 4
Training in progress, step 500 · 82ffe1c (verified) · dgkim0306 committed on Jun 4
Upload tokenizer · 42c19c6 (verified) · dgkim0306 committed on Jun 4
Upload tokenizer · 4e26720 (verified) · dgkim0306 committed on Jun 4
Upload tokenizer · 920ff86 (verified) · dgkim0306 committed on Jun 4
Upload tokenizer · 0f6be86 (verified) · dgkim0306 committed on Jun 4
Upload tokenizer · d034e75 (verified) · dgkim0306 committed on Jun 4
Upload tokenizer · 7ac2d56 (verified) · dgkim0306 committed on Jun 4
Upload tokenizer · 7c9a707 (verified) · dgkim0306 committed on Jun 4
Upload tokenizer · 14ad092 (verified) · dgkim0306 committed on Jun 4
Upload tokenizer · 28d2142 (verified) · dgkim0306 committed on Jun 4
End of training · 076b1b3 (verified) · dgkim0306 committed on Mar 15
Training in progress, step 2000 · e9d23ac (verified) · dgkim0306 committed on Mar 15
Training in progress, step 1500 · 089309a (verified) · dgkim0306 committed on Mar 15
Training in progress, step 1000 · b951d75 (verified) · dgkim0306 committed on Mar 15
Training in progress, step 500 · bf611ac (verified) · dgkim0306 committed on Mar 15
Upload tokenizer · f1ccdda (verified) · dgkim0306 committed on Mar 15
End of training · 0e79a28 (verified) · dgkim0306 committed on Mar 12
Upload tokenizer · e7341e5 (verified) · dgkim0306 committed on Mar 12
End of training · 6ecc9e4 (verified) · dgkim0306 committed on Mar 12
Upload tokenizer · 8b3a11d (verified) · dgkim0306 committed on Mar 12
End of training · 4a08e32 (verified) · dgkim0306 committed on Mar 12
Upload tokenizer · 4fdb5ed (verified) · dgkim0306 committed on Mar 12
Upload tokenizer · eb0d4ba (verified) · dgkim0306 committed on Mar 12
Upload tokenizer · 2e17ab9 (verified) · dgkim0306 committed on Mar 12
Upload tokenizer · 3c8db87 (verified) · dgkim0306 committed on Mar 12
Upload tokenizer · bf4452e (verified) · dgkim0306 committed on Mar 12
Upload tokenizer · a764e98 (verified) · dgkim0306 committed on Mar 12
Training in progress, step 2000 · 80dd0b0 (verified) · dgkim0306 committed on Mar 5
Training in progress, step 1500 · b16f5e2 (verified) · dgkim0306 committed on Mar 5
Training in progress, step 1000 · 8d2fe5c (verified) · dgkim0306 committed on Mar 5
Training in progress, step 500 · 0168a25 (verified) · dgkim0306 committed on Mar 5
Upload tokenizer · a16ac50 (verified) · dgkim0306 committed on Mar 5
Upload tokenizer · e389a41 (verified) · dgkim0306 committed on Mar 5
Upload tokenizer · 5ba82f3 (verified) · dgkim0306 committed on Mar 5
Upload tokenizer · bd42f34 (verified) · dgkim0306 committed on Mar 5
Upload tokenizer · f88e62e (verified) · dgkim0306 committed on Mar 5
Upload tokenizer · f8303ab (verified) · dgkim0306 committed on Mar 5
Upload tokenizer · 03d2ab2 (verified) · dgkim0306 committed on Mar 5
Upload tokenizer · d0dd0ff (verified) · dgkim0306 committed on Mar 5
Upload tokenizer · 315dfcb (verified) · dgkim0306 committed on Mar 4
Upload tokenizer · ee6b32e (verified) · dgkim0306 committed on Feb 29
Upload tokenizer · 5c6914b (verified) · dgkim0306 committed on Feb 29
Upload tokenizer · fa2f14b (verified) · dgkim0306 committed on Feb 29
Upload tokenizer · 7f248df (verified) · dgkim0306 committed on Feb 29
Upload tokenizer · ddb82a2 (verified) · dgkim0306 committed on Feb 29
Upload tokenizer · e01b584 (verified) · dgkim0306 committed on Feb 29
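The repeating "Training in progress, step 500/1000/1500/2000" and "End of training" messages are the commit pattern the Transformers Trainer produces when checkpoints are pushed to the Hub during training. Below is a minimal sketch of the kind of configuration that yields them; it is not the author's actual training script, and values not implied by the log (learning rate, epochs) are purely illustrative.

```python
# Hypothetical sketch: with push_to_hub=True, the Trainer pushes each saved
# checkpoint as a "Training in progress, step N" commit, and a final
# trainer.push_to_hub() defaults to the "End of training" commit message.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-large-xls-r-300m-korean-g",  # repo name from this page
    push_to_hub=True,        # push every saved checkpoint to the Hub
    save_strategy="steps",
    save_steps=500,          # -> commits at step 500, 1000, 1500, 2000
    logging_steps=500,       # assumed; matches the checkpoint interval
    learning_rate=3e-4,      # illustrative value only
    num_train_epochs=30,     # illustrative value only
)

# A Trainer(model=..., args=training_args, train_dataset=...) built on these
# arguments would generate the commit history listed above; the separate
# "Upload tokenizer" commits correspond to tokenizer/processor .push_to_hub() calls.
print(training_args.output_dir)
```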