Noob's question

#1
by kekawia - opened

can i run this ?
-- RTX 3060 12GB VRAM / 32GB RAM --


@kekawia
yeah you can 100% run this, but probably not at the full 290k context. AWQ relies almost entirely on VRAM, so there isn't much room left for context (maybe 8k?). GGUF quants with llama.cpp should let you run much more context, since it can offload to the CPU and system RAM; you might reach around 60k context. Definitely much more than 8k either way.
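As a rough sketch of the GGUF/llama.cpp route (the model filename is a placeholder, and the layer count and context size are guesses you'd tune to your VRAM):

```shell
# Hypothetical GGUF filename; substitute the actual quant you download.
# -ngl: number of layers to offload to the GPU (lower it if you run out of VRAM)
# -c:   context size in tokens; the rest of the KV cache sits in system RAM
llama-cli -m model-Q4_K_M.gguf -ngl 24 -c 32768 -p "Hello"
```

Start with a smaller `-c`, watch VRAM/RAM usage, and raise it until you hit your limit.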
