And I JUST started training over 20b models...

#3
by athirdpath - opened

Hey Undi, thanks so much for all you do; your work on U-Amethyst is what pushed me away from SD and toward working with LLMs.

Do you intend to release the mergekit recipe you used? I'm curious whether I could tease Iambe-RP apart and squeeze in the other 3B parameters from a boring STEM model. It might improve Iambe's problem-solving.
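(For reference, 20B frankenmerges of this kind are usually built with a mergekit "passthrough" config that stacks layer slices from two 13B-class donors. The sketch below is only an illustration with placeholder model names and layer ranges, not the actual recipe being asked about.)

```python
# Minimal sketch of a mergekit passthrough config for a layer-stacked frankenmerge.
# The model names and layer ranges are placeholders, not Undi's recipe.
import yaml

config = {
    "slices": [
        {"sources": [{"model": "org/rp-13b-donor", "layer_range": [0, 24]}]},
        {"sources": [{"model": "org/stem-13b-donor", "layer_range": [8, 32]}]},
        {"sources": [{"model": "org/rp-13b-donor", "layer_range": [24, 40]}]},
    ],
    "merge_method": "passthrough",  # concatenate the slices instead of averaging weights
    "dtype": "float16",
}

with open("frankenmerge.yml", "w") as f:
    yaml.safe_dump(config, f, sort_keys=False)

# Then, for example:  mergekit-yaml frankenmerge.yml ./merged-20b
```

With 40-layer 13B donors, slices like these stack to 64 layers, which is roughly where the ~20B parameter count of these merges comes from.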

Also, if you're willing to share but not publicly, I'm available on TheBloke's DC.

Wait, someone liked this and I thought it was new. I assume you dropped this line of research for a reason.

athirdpath changed discussion status to closed

Sure, DM me on Discord; I couldn't find your Discord name on TheBloke's DC.
Edit: It's not new, but it's usable haha. I use it sometimes; I fine-tuned the Noromaid dataset over it for personal use.

Good afternoon. You have very good models for RP. Can you make 20B models with 16k context?

I'll redirect you here: https://huggingface.co/Undi95/Emerhyst-20B/discussions/4#65732dbc68ea3e91dc2b7599
