🔮 Mixture of Experts Collection MoE models created with mergekit and LazyMergekit: https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb#scrollTo=d5mYzDo1q96y • 13 items • Updated May 27 • 21
👑 Monarch Collection Family of 7B models that combine excellent reasoning and conversational abilities. • 7 items • Updated May 27 • 9
Personal Favorites Collection Models I use often or particularly like. I recommend reading their model cards for more details. • 8 items • Updated May 21 • 43
Quantized Models (GGUF, IQ, Imatrix) Collection Various quantized models in the GGUF format. Models marked with a checkmark are personal favorites; an orange arrow means the upload is still in progress. • 81 items • Updated 29 days ago • 42
Transformers compatible Mamba Collection This release includes the `mamba` repositories compatible with the `transformers` library. • 5 items • Updated Mar 6 • 32
The Era of 1-bit LLMs: All Large Language Models are in 1.58 Bits Paper • 2402.17764 • Published Feb 27 • 581
DreamGen Opus V1: Story-writing & role-playing models Collection Uncensored models for steerable story-writing and role-playing. Prompting guide: https://dreamgen.com/docs/models/opus/v1 • 16 items • Updated Jun 19 • 8