20B

#1
by R136a1 - opened

I really like the Noromaid-20b-v0.1.1 and want to try the v0.2

NeverSleep org

We don't know if it's worth making a 20B for v0.2; it costs us money to make this stuff. We'll need more feedback on this version, and then we can decide whether a 20B for v0.2 is worth it.

What does it cost? A few hundred dollars? Ever thought about a Patreon?

NeverSleep org

What does it cost? A few hundred dollars? Ever thought about a Patreon?

We run on Ko-fi donations and our own money. Here is the cost for today's training, for example (we're doing an entirely new attempt on the 8x7B at the moment):

image.png

It goes fast. New versions of the 7B, 13B, and 8x7B coming soon if everything works out, stay tuned.
And maybe a 20B, if we have the time, haha

Oh, nice. I didn't notice that in your profile. I made a small donation to help you along.

Is there any interest, thought, or effort toward a 2x MoE? Does 2x make sense for you guys?

NeverSleep org

Is there any interest, thought, or effort toward a 2x MoE? Does 2x make sense for you guys?

We can only do MoE with a power-of-two number of experts, but not two itself.
So 4, 8, 16, 32...
That's how it works.

Sorry for the lame question, and thanks for the response.

NeverSleep org

Sorry for the lame question, and thanks for the response.

All good!

Once again, me... I hope I'm not annoying you with my lameness, but for example Maxime Labonne says that 2x mixes can be done, here with Phi: https://twitter.com/maximelabonne/status/1744867841436700850

and also Cros Nastasi didn't deny it here https://huggingface.co/dillfrescott/sonya-7b-x8-MoE/discussions/1

What am I missing? (I am betting on your patience with me, knowing it's not infinite...)

NeverSleep org

You aren't missing anything. Maybe it's possible now, but llama.cpp just blocked it before. We need to check.

Doesn't an MoE imply expert routing/balancing and mixing? With two experts, wouldn't you either have no balancing (it always chooses both experts to mix) or no mixing (if it could only route to one expert per token)? At that point, isn't a larger single model simpler, with the benefit of monolithic scale? I'm not qualified to judge that tweet, but I guess I don't understand what's going on either.
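The degenerate case described above can be seen in a minimal sketch of generic top-k MoE gating (this is an illustration of the standard technique, not NeverSleep's or any specific model's implementation; the `route` function and its logit values are made up for the example). With 2 experts, top-2 routing always selects both experts, so there is no routing decision left to balance; top-1 routing picks exactly one expert, so expert outputs are never mixed.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of router logits.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def route(logits, k):
    """Top-k gating: keep the k experts with the highest router logits
    and renormalize their softmax weights so they sum to 1."""
    ranked = sorted(range(len(logits)), key=lambda i: logits[i], reverse=True)
    chosen = ranked[:k]
    weights = softmax(logits)
    total = sum(weights[i] for i in chosen)
    return {i: weights[i] / total for i in chosen}

# 2 experts, top-2: every token is sent to both experts, so the
# "routing" degenerates into a fixed soft blend of the two.
print(sorted(route([0.3, -1.2], k=2)))  # always [0, 1]

# 2 experts, top-1: hard routing, each token goes to exactly one
# expert, and the experts' outputs are never mixed.
print(list(route([0.3, -1.2], k=1)))
```

With 4 or more experts and top-2 gating (the Mixtral-style setup), both properties hold at once: the router genuinely chooses which 2 of the experts fire per token, and the two chosen outputs are still mixed.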
