---
license: apache-2.0
language:
  - en
datasets:
  - teknium/openhermes
  - jondurbin/airoboros-2.2
---

# HACM7-Mistral-7B

An attempt at a generalist model, using chunks spliced from the Airoboros and OpenHermes datasets, plus some of my own data which I have since discarded (chat data, basic stuff). I forgot to mention that I also added CollectiveCognition to the mix.

The base model is a gradient merge between OpenHermes and Airoboros 2.2. A LoRA was trained on the two spliced datasets plus my own, then reapplied on top of that merge (a dumb move, I know).

Tests: Uncensored, and not bad at RP; it felt decent. Is it actually good? Who knows, honestly. Try it yourself and see.
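If you want to try it, here is a minimal sketch using `transformers`. The repo id `Sao10K/HACM7-Mistral-7B` and the Alpaca-style prompt template are both assumptions (the card does not document the training prompt format), so adjust them to whatever the actual repo and format turn out to be.

```python
def build_prompt(instruction: str) -> str:
    # Hypothetical Alpaca-style template; the actual training format
    # is not documented on this card, so treat this as a guess.
    return f"### Instruction:\n{instruction}\n\n### Response:\n"


def generate(instruction: str, max_new_tokens: int = 128) -> str:
    """Load the model and generate a completion (downloads the full weights)."""
    # Imports deferred so build_prompt() works without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "Sao10K/HACM7-Mistral-7B"  # assumed repo id; not confirmed by this card
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    inputs = tokenizer(build_prompt(instruction), return_tensors="pt").to(model.device)
    outputs = model.generate(
        **inputs, max_new_tokens=max_new_tokens, do_sample=True, temperature=0.8
    )
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

Sampling settings here (`temperature=0.8`) are just a starting point; tune them for RP versus instruction-following use.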