
Generated with:

from transformers import AutoTokenizer

# AutoTokenizer defaults to use_fast=True, so loading converts the original
# SentencePiece tokenizer to the fast (Rust-backed) implementation.
tokenizer = AutoTokenizer.from_pretrained("pankajmathur/orca_mini_3b")
assert tokenizer.is_fast
# save_pretrained writes tokenizer.json alongside the other tokenizer files.
tokenizer.save_pretrained("...")
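
As a quick sanity check, the saved files can be reloaded and run on a sample string. This is a minimal sketch; the output directory name below is only a placeholder for whatever path was passed to save_pretrained.

from transformers import AutoTokenizer

# Reload from the directory the tokenizer files were written to (path is a placeholder).
tokenizer = AutoTokenizer.from_pretrained("./orca_mini_3b_fast_tokenizer")
assert tokenizer.is_fast  # tokenizer.json is present, so the fast tokenizer loads

# Round-trip a sample prompt to confirm encode/decode behave as expected.
ids = tokenizer("Hello, world!")["input_ids"]
print(tokenizer.decode(ids, skip_special_tokens=True))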
pankajmathur changed pull request status to merged

Thank you @ScottMueller, appreciated. Just merged it to main; let me know if there are any issues.
