
EXL2 @ 5.00bpw
Wanted to try something a little different.
I haven't had much luck with smaller models, but as a Noromaid superfan, I did have success with Silicon Maid.
This is a new model I stumbled on via Reddit, but it was too new to have an EXL2.
So I made it my damn self.
I'm still tinkering, but have high hopes.
All credit to the original creators: this one has lots of good DNA.


DaringLotus-10.7B-v2

This is a DARE-TIES merge of https://huggingface.co/BlueNipples/SnowLotus-v2-10.7B and its parent models. It shares that model's good prose and relatively decent coherency, leaning a little more toward prose and a little less toward coherency. I like this model for generating great prose if I don't mind regenerating a bit. Like SnowLotus, it's a good model for RP, and I think both of these merges probably stand up with the best in their weight class (11-13B). Which you prefer might be a matter of context and preference, which is why I've uploaded both. Credit to Nyx and Sao10k for their model contributions (Frostmaid, FrostWind and SolarDoc), as well as Undi95 and Ikari for Noromaid, the developers of Mergekit, and whoever contributed the medical model used in the frankenmerge portion.

GGUF (Small selection of Imatrix and regular k-quants): https://huggingface.co/BlueNipples/DaringLotus-SnowLotus-10.7b-IQ-GGUF

Recipe

  • model: ./Frostmaid
    parameters:
      density: [0.45] # density gradient
      weight: 0.23
  • model: ./FrostMed
    parameters:
      density: [0.35] # density gradient
      weight: 0.18
  • model: ./SnowLotus-10.7B-v2
    parameters:
      density: [1] # density gradient
      weight: 1
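
If you want to reproduce or tweak the merge, here is a rough sketch of what the full Mergekit config might look like. This is an assumption pieced together from the recipe above: the model entries, densities, and weights are taken directly from it, and `merge_method: dare_ties` follows the DARE-TIES description, but the `base_model` choice, `dtype`, and local paths are guesses and may not match the original config.

```yaml
# Hypothetical reconstruction of the mergekit config -- not the original file.
# Densities and weights come from the recipe above; base_model and dtype are assumptions.
merge_method: dare_ties
base_model: ./SnowLotus-10.7B-v2   # assumed base; its weight 1 / density 1 suggests it anchors the merge
models:
  - model: ./Frostmaid
    parameters:
      density: [0.45]   # density gradient
      weight: 0.23
  - model: ./FrostMed
    parameters:
      density: [0.35]   # density gradient
      weight: 0.18
  - model: ./SnowLotus-10.7B-v2
    parameters:
      density: [1]      # density gradient
      weight: 1
dtype: float16
```

With Mergekit installed, a config like this would typically be run with `mergekit-yaml config.yml ./output-model`.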