Apply for community grant: Academic project (gpu)

by miguelcarv - opened

This model was developed during my master's thesis and has only 2.6B parameters. Running it on CPU takes a very long time, and I would like to be able to use this Space at my dissertation defence.

I just need the smallest available GPU that runs native bfloat16, the L4, so I can generate fast responses over the next few weeks; unfortunately, I cannot afford it out of pocket at the moment.

Even the smaller T4 would be great, if it still gives a speedup over CPU.
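
For reference, a minimal sketch of how the model could be served in bfloat16 on an L4, falling back to float16 on GPUs like the T4 that lack native bfloat16 support; the model id below is a placeholder, and a standard transformers setup is assumed rather than the actual Space code.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder id; the actual 2.6B checkpoint from the thesis would go here.
MODEL_ID = "your-username/your-2.6b-model"

# The L4 (Ada Lovelace) supports bfloat16 natively; the T4 (Turing) does not,
# so fall back to float16 there, and to float32 on CPU.
if torch.cuda.is_available():
    dtype = torch.bfloat16 if torch.cuda.is_bf16_supported() else torch.float16
    device = "cuda"
else:
    dtype = torch.float32
    device = "cpu"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype=dtype).to(device)

inputs = tokenizer("Hello, how are you?", return_tensors="pt").to(device)
output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```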
