
Description

Do you miss the vibes of the early 2000s? Yearn for the nostalgia of internet religious arguments? Then this model is for you!

This was trained on a scrape of Yahoo! Answers from 2007 and received no filtering save for basic sanity checks.

This is not intended for serious use, but I think it's charming in a way.

Prompt format:

Pygmalion / Metharme

The prompt should end with "<|model|>", and generation begins on the same line directly after it, with no space. The following are all valid formats and can be extended to as many rounds as desired; a short usage sketch follows the examples.

<|system|>system message here<|user|>user message here<|model|>
<|system|>system message here<|user|>user message here<|model|>model message<|user|>user message here<|model|>
<|system|>system message here<|model|>
<|system|>system message here<|model|>model message<|user|>user message here<|model|>
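By way of illustration, here is a minimal sketch of building one of these prompts and generating with the Hugging Face transformers library. The repo id is the one this card belongs to; the example messages and sampling settings are assumptions, not recommendations from this card.

```python
# Minimal sketch of prompting this model with transformers.
# The example messages and generation settings are illustrative only.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Dans-DiscountModels/Dans-07YahooAnswers-7b"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Tags and messages are concatenated with no spaces; generation begins
# directly after the final <|model|>.
prompt = (
    "<|system|>You are a helpful Yahoo! Answers style assistant."
    "<|user|>how do i get rid of a computer virus?? plz help"
    "<|model|>"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```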

Some quick and dirty training details:

  • Built with Axolotl
  • Sequence length: 2048
  • Training time: 32 hours
  • Hardware: 1x RTX 4080
  • Training type: QLoRA
  • PEFT R/A (LoRA rank / alpha): 32/32 (a rough equivalent configuration is sketched below)
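
For anyone who wants to reproduce something similar without Axolotl, here is a rough, hypothetical equivalent of the setup above expressed with peft and bitsandbytes. Only the rank/alpha (32/32), the 2048 sequence length, and the use of QLoRA come from this card; the base model placeholder, dropout, and quantization details are assumptions.

```python
# Hypothetical rough equivalent of the QLoRA setup described above, using
# peft + bitsandbytes instead of Axolotl.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model

base_model_id = "<base-7b-model>"  # the base model is not named on this card

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                   # QLoRA: base weights quantized to 4 bits
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(base_model_id)
tokenizer.model_max_length = 2048        # sequence length stated on this card

model = AutoModelForCausalLM.from_pretrained(base_model_id, quantization_config=bnb_config)

lora_config = LoraConfig(
    r=32,               # PEFT R from the card
    lora_alpha=32,      # PEFT A from the card
    lora_dropout=0.05,  # assumption, not stated on the card
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
```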