Do all the safetensors files need to be downloaded to use this model on Colab?

#14
by Kv-boii - opened

I'm looking for small-scale models to use, since the free plan on Google Colaboratory only provides 15 GB of GPU memory, which isn't enough to work on LLMs on a trial-and-error basis.

Yes. I think the model can fit in 4-bit for inference on Google Colab Pro+ (A100, 40 GB), but you'll have to wait for further developments in quantization before anything is usable on the free tier.
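
For anyone who wants to try the 4-bit route, here's a rough sketch using transformers + bitsandbytes. The model ID below is just a placeholder (swap in the actual repo), and whether it fits still depends on the model's size:

```python
# Minimal sketch of 4-bit inference loading with transformers + bitsandbytes.
# "your-org/your-model" is a placeholder, not this model's actual repo ID.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "your-org/your-model"  # replace with the real repo ID

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                     # quantize weights to 4-bit on load
    bnb_4bit_quant_type="nf4",             # NormalFloat4 quantization
    bnb_4bit_compute_dtype=torch.float16,  # keep matmuls in fp16
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",  # spread layers across available GPU/CPU memory
)

inputs = tokenizer("Hello, world!", return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```

Note that all the safetensors shards still get downloaded in full precision first; quantization happens at load time, so disk space and download time don't shrink, only GPU memory use does.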

Is there any free notebook service that provides more GPU memory than Google Colab's free tier?
