Instructions to use phucd/ko-better-old with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use phucd/ko-better-old with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("automatic-speech-recognition", model="phucd/ko-better-old")
```

```python
# Load the model and processor directly
from transformers import AutoProcessor, AutoModelForSpeechSeq2Seq

processor = AutoProcessor.from_pretrained("phucd/ko-better-old")
model = AutoModelForSpeechSeq2Seq.from_pretrained("phucd/ko-better-old")
```

- Notebooks
- Google Colab
- Kaggle
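The automatic-speech-recognition pipeline accepts either a path to an audio file or an in-memory waveform passed as a dict with `array` and `sampling_rate` keys. A minimal sketch of building such an input (the one-second silent waveform here is only a placeholder for real recorded speech, and 16 kHz is assumed as the expected sampling rate, as is typical for speech-seq2seq models):

```python
import numpy as np

# Dict-style input the ASR pipeline accepts: a float32 waveform
# plus its sampling rate. The silent audio is a placeholder only.
audio = np.zeros(16000, dtype=np.float32)  # 1 s of silence at an assumed 16 kHz
sample = {"array": audio, "sampling_rate": 16000}

# With the pipeline created above, transcription would then be:
# result = pipe(sample)
# print(result["text"])
```

Passing the waveform directly avoids a round-trip through a temporary file when audio is already loaded or generated in memory.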