Instructions for using FiveC/Bart-Bahnar-base with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use FiveC/Bart-Bahnar-base with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("FiveC/Bart-Bahnar-base")
model = AutoModelForSeq2SeqLM.from_pretrained("FiveC/Bart-Bahnar-base")
```
- Notebooks
- Google Colab
- Kaggle
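
Once the tokenizer and model are loaded as shown above, text can be generated with the standard seq2seq pipeline. A minimal sketch follows; the input sentence is illustrative only, and the checkpoint is assumed to be a standard BART seq2seq model (the model card does not specify the exact task, e.g. translation or denoising for the Bahnar language):

```python
# Minimal inference sketch, assuming FiveC/Bart-Bahnar-base is a
# standard BART checkpoint loadable via AutoModelForSeq2SeqLM.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("FiveC/Bart-Bahnar-base")
model = AutoModelForSeq2SeqLM.from_pretrained("FiveC/Bart-Bahnar-base")

# Tokenize an example input (placeholder text, not from the model card).
inputs = tokenizer("Example input text", return_tensors="pt")

# Generate output ids and decode them back to a string.
output_ids = model.generate(**inputs, max_new_tokens=64)
text = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(text)
```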