---
tags:
- not-for-all-audiences
---

Eileithyia-7B is an unaligned, roleplay-oriented model created by merging [teknium/OpenHermes-2.5-Mistral-7B](https://huggingface.co/teknium/OpenHermes-2.5-Mistral-7B) with [a bespoke LoRA](https://huggingface.co/athirdpath/Eileithyia-7B-LORA) trained directly on OpenHermes.
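
For readers who want to reproduce this kind of merge, the sketch below shows the usual `peft` workflow for folding a LoRA adapter into its base model. The repository IDs come from the links above, but the actual merge settings used for Eileithyia-7B are not documented, so treat this as an illustration rather than the author's exact recipe.

```python
# Minimal sketch of a LoRA -> base-model merge with peft.
# Assumption: athirdpath/Eileithyia-7B-LORA loads as a standard peft
# adapter; the real merge settings for this model are unpublished.
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

BASE = "teknium/OpenHermes-2.5-Mistral-7B"

base_model = AutoModelForCausalLM.from_pretrained(BASE)
tokenizer = AutoTokenizer.from_pretrained(BASE)

# Load the adapter on top of the base weights, then fold ("merge") the
# low-rank deltas into the base so the result is a plain standalone model.
model = PeftModel.from_pretrained(base_model, "athirdpath/Eileithyia-7B-LORA")
merged = model.merge_and_unload()

merged.save_pretrained("Eileithyia-7B")
tokenizer.save_pretrained("Eileithyia-7B")
```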

Eileithyia, in keeping with the current trend, is named after a Greek goddess: in this case, the goddess of childbirth and pregnancy.

![image/png](https://i.ibb.co/zR1CX4G/ele.png)

The private ~400k-token dataset used to train the LoRA was Alpaca-formatted (see the example records after this list) and focused on five primary categories:

- Medical texts (on pregnancy, reproductive organs, and impregnation). These are formatted so the model, in character as a doctor, answers a patient's question in short-to-medium form.
- Excerpts from short stories and novellas (erotic, romantic, and platonic) centered around both realistic and fantastic pregnancy. These are sliced into ~2048-token chunks (a chunking sketch follows this list), and every long-form response is tied to the command “Enter narrator mode.” in the instruction field.
- A selection from [PIPPA](https://huggingface.co/datasets/PygmalionAI/PIPPA), gathered with a wide keyword search for related terms and then human-curated (...the things I’ve seen…). These are converted to Alpaca format with “Enter RP mode.” in every instruction field.
- ~42k tokens of GPT-4-generated data on pregnancy from various characters’ perspectives, focusing on different responses and stages. Also includes a synopsis for each week in various styles.
- ~18k tokens of GPT-4-generated data on non-maternal roleplay from various characters’ perspectives, focusing on different situations and emotions. Includes many multi-turn conversations.
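
For reference, here is roughly what Alpaca-formatted records with the two fixed commands might look like. These examples are invented for illustration; the dataset itself is private, so every field value shown is a placeholder.

```python
# Hypothetical Alpaca-style records mirroring the structure described above.
# All field contents are invented placeholders; only the two instruction
# strings ("Enter narrator mode." / "Enter RP mode.") come from the card.
narrator_record = {
    "instruction": "Enter narrator mode.",  # tied to the long-form story chunks
    "input": "",                            # optional context, unused here
    "output": "<one ~2048-token story chunk>",
}

rp_record = {
    "instruction": "Enter RP mode.",        # used for the curated PIPPA slice
    "input": "<scenario or persona context>",
    "output": "<the character's roleplay reply>",
}
```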
  
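And a sketch of the ~2048-token slicing mentioned in the second category, using the base model's tokenizer. Only the chunk size comes from the card; everything else (no overlap, greedy consecutive slices) is an assumption, since the real preprocessing pipeline is unpublished.

```python
# Hypothetical chunking pass: split a long story into consecutive slices of
# at most ~2048 tokens each. Overlap and boundary handling are assumptions.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("teknium/OpenHermes-2.5-Mistral-7B")

def chunk_text(text: str, max_tokens: int = 2048) -> list[str]:
    """Return consecutive decoded chunks of at most `max_tokens` tokens."""
    ids = tokenizer(text, add_special_tokens=False)["input_ids"]
    return [
        tokenizer.decode(ids[i : i + max_tokens])
        for i in range(0, len(ids), max_tokens)
    ]

# Each chunk would then become the `output` of a narrator_record above.
```
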
Testing is still in progress.