What is the required GPU size to run? Is a 4090 possible, and does it support Ollama?

#5
by sminbb - opened

What is the required GPU size to run?
Is a 4090 possible, and does it support Ollama?

A 4090 should be good enough. Yes, Ollama should work since these are GGUF files; however, you will have to import the GGUF into Ollama yourself.
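For reference, importing a local GGUF into Ollama is usually just a minimal Modelfile pointing at the file plus an `ollama create`. This is a rough sketch; the GGUF filename and the model name below are placeholders, not the actual files from this repo:

```sh
# Create a minimal Modelfile that points at the local GGUF (path is a placeholder)
echo 'FROM ./model-Q4_K_M.gguf' > Modelfile

# Register it under a local model name, then run it
ollama create my-model -f Modelfile
ollama run my-model
```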

Unsloth AI org

What is the required GPU size to run?
Is a 4090 possible, and does it support Ollama?

Yes, a 4090 is enough. You don't even need a GPU; a CPU with 48 GB of RAM will be enough.

At the moment Ollama does not support it as far as I'm aware, so you will need to use llama.cpp.
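A llama.cpp run generally looks something like the sketch below. The GGUF filename is a placeholder; adjust `-ngl` to however many layers fit in the 4090's 24 GB of VRAM, or drop it entirely for a CPU-only run:

```sh
# GPU: offload as many layers as fit into VRAM (-ngl 99 offloads everything that fits)
./llama-cli -m model-Q4_K_M.gguf -ngl 99 -c 4096 -p "Hello"

# CPU-only: omit -ngl; plan for roughly the RAM mentioned above (~48 GB)
./llama-cli -m model-Q4_K_M.gguf -c 4096 -p "Hello"
```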
