Some of my own quants:

  • Emerhyst-20B_Q3_K_M.gguf
  • Emerhyst-20B_Q4_K_M.gguf
  • Emerhyst-20B_Q5_K_M.gguf

Source: Undi95

Source model: Undi95/Emerhyst-20B (a merge of multiple source models)

Format: GGUF
Model size: 20B params
Architecture: llama