🔮 Mixture of Experts Collection MoE models built with mergekit and LazyMergekit: https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb#scrollTo=d5mYzDo1q96y (a minimal config sketch appears after this list). • 13 items • Updated Aug 16
👑 Monarch Collection Family of 7B models that combine excellent reasoning and conversational abilities. • 7 items • Updated Aug 16
Personal Favorites Collection Models I use often or otherwise recommend; see their model cards for more details. • 9 items • Updated Aug 13
Quantized Models (GGUF, IQ, Imatrix) Collection Various quantizations of models in the GGUF format. Models marked with a checkmark are personal favorites; an orange arrow means the model is still being uploaded. See the GGUF download/run sketch after this list. • 89 items • Updated 26 days ago
Transformers-compatible Mamba Collection This release includes the `mamba` repositories compatible with the `transformers` library (a minimal loading sketch appears after this list). • 5 items • Updated Mar 6
The Era of 1-bit LLMs: All Large Language Models are in 1.58 Bits Paper • 2402.17764 • Published Feb 27 (a ternary-quantization sketch appears after this list)
DreamGen Opus V1: Story-writing & role-playing models Collection Uncensored models for steerable story-writing and role-playing. Prompting guide: https://dreamgen.com/docs/models/opus/v1 • 16 items • Updated Jun 19
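For the Mixture of Experts collection, here is a minimal sketch of what a mergekit/LazyMergekit MoE configuration can look like. The model names and prompts are placeholders, and the exact schema is defined by mergekit itself; see the linked Colab notebook for the real workflow.

```python
# Hypothetical mergekit-moe configuration, written out from Python.
# All model names and positive_prompts below are illustrative placeholders.
import yaml

config = {
    "base_model": "org/base-7b-model",        # placeholder base model
    "gate_mode": "hidden",                    # router initialization strategy
    "experts": [
        {"source_model": "org/chat-7b",
         "positive_prompts": ["chat", "assistant", "conversation"]},
        {"source_model": "org/code-7b",
         "positive_prompts": ["code", "python", "programming"]},
    ],
}

with open("moe_config.yaml", "w") as f:
    yaml.safe_dump(config, f, sort_keys=False)
# The resulting YAML file is then passed to mergekit's MoE entry point;
# the LazyMergekit notebook linked above shows the exact command and options.
```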
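For the quantized GGUF collection, this is a sketch of downloading one of the GGUF files and running it locally with llama-cpp-python. The repository and file names are placeholders; substitute a real entry from the collection.

```python
# Minimal sketch: fetch a GGUF quantization and run it locally.
# Repo and file names are placeholders; pick an actual item from the collection.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

model_path = hf_hub_download(
    repo_id="someuser/Some-Model-GGUF",      # placeholder repository
    filename="some-model.Q4_K_M.gguf",       # placeholder quantized file
)

llm = Llama(model_path=model_path, n_ctx=4096)
out = llm("Explain GGUF quantization in one sentence.", max_tokens=64)
print(out["choices"][0]["text"])
```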
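For the Transformers-compatible Mamba collection, a minimal sketch of loading one of the repositories through the standard `transformers` API. The repo id below is an assumption; use any item from the collection.

```python
# Minimal sketch: load a transformers-compatible Mamba checkpoint and generate text.
# The repo id is an assumption; substitute any repository from the collection.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "state-spaces/mamba-130m-hf"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, torch_dtype=torch.float32)

inputs = tokenizer("Mamba is a state-space model that", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```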
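For the 1.58-bit LLM paper, a small numerical sketch of the absmean ternary weight quantization it describes: weights are scaled by their mean absolute value, then rounded and clipped to {-1, 0, +1}. This follows my reading of the paper, not its reference implementation.

```python
# Sketch of absmean ternary quantization as described in the 1.58-bit paper:
# scale a weight matrix by its mean absolute value, then round and clip to {-1, 0, +1}.
import numpy as np

def quantize_ternary(w: np.ndarray, eps: float = 1e-8):
    gamma = np.abs(w).mean() + eps              # absmean scale
    w_q = np.clip(np.round(w / gamma), -1, 1)   # ternary weights
    return w_q.astype(np.int8), gamma           # keep the scale for dequantization

w = np.random.randn(4, 4).astype(np.float32)
w_q, gamma = quantize_ternary(w)
print(w_q)                                      # entries in {-1, 0, 1}
print(np.abs(w - w_q * gamma).mean())           # rough reconstruction error
```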