How do I use this model in a local environment for inference? Where can I download the .bin model file?

#4
by Shubh0318 - opened

I did not find a .bin file for the model under "Files and versions". How can I use this model locally with my web app for inference? Any solution?
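
For reference, this is roughly what I am trying to run locally. A minimal sketch, assuming the model is a causal language model loadable with the transformers library (the repo id is a placeholder, not the actual repository); note that newer repos often ship weights as .safetensors instead of .bin, and `from_pretrained` handles both:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical repo id -- replace with the actual model repository name.
repo_id = "your-org/your-model"

# from_pretrained downloads and caches the weight files locally on first use;
# it works with both pytorch_model.bin and model.safetensors formats.
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

# Simple local inference check.
inputs = tokenizer("Hello, world!", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```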
