LLM Model VRAM Calculator (Space, Running) • 416 likes
Calculate VRAM requirements for running large language models (a rough estimation sketch follows after this list)

cognitivecomputations/dolphin-2.7-mixtral-8x7b (Text Generation) • Updated Oct 30, 2024 • 1.67k downloads • 171 likes

cognitivecomputations/dolphin-2.5-mixtral-8x7b (Text Generation) • Updated May 21, 2024 • 11.3k downloads • 1.23k likes

AI Comic Factory (Space, Running on CPU Upgrade) • 9.86k likes
Create your own AI comic with a single prompt
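
As a rough illustration of the kind of estimate a VRAM calculator produces, here is a minimal sketch. The formula (weights plus a KV-cache term plus an overhead factor), the function name, and all constants are illustrative assumptions, not the actual implementation of the "LLM Model VRAM Calculator" Space.

```python
# Rough VRAM estimate for serving an LLM: model weights plus KV cache.
# The formula and constants are illustrative assumptions, not the
# method used by the "LLM Model VRAM Calculator" Space.

def estimate_vram_gb(
    n_params_b: float,       # model size in billions of parameters
    bytes_per_param: float,  # 2 for fp16/bf16, 1 for int8, 0.5 for 4-bit
    n_layers: int,
    hidden_size: int,
    context_len: int,
    batch_size: int = 1,
    kv_bytes: float = 2.0,   # fp16 KV cache
) -> float:
    weights = n_params_b * 1e9 * bytes_per_param
    # KV cache: 2 tensors (K and V) per layer, per token, per batch element.
    # This ignores grouped-query attention, so it overestimates for models
    # such as Mixtral that share KV heads.
    kv_cache = 2 * n_layers * hidden_size * context_len * batch_size * kv_bytes
    overhead = 1.2  # assumed ~20% for activations, CUDA context, fragmentation
    return (weights + kv_cache) * overhead / 1024**3

# Example: Mixtral-8x7B (~46.7B total parameters, 32 layers, hidden size 4096)
# quantized to 4-bit, with a 4k-token context.
print(f"{estimate_vram_gb(46.7, 0.5, n_layers=32, hidden_size=4096, context_len=4096):.1f} GiB")
```

Under these assumptions the example prints roughly 31 GiB; the real requirement depends on the quantization scheme, attention layout, and serving framework.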