
exl2 version of Norquinal/PetrolLM-CollectiveCognition
Calibration dataset: wikitext
Quantized by IHaBiS

Command used:

```
python convert.py \
  -i models/Norquinal_PetrolLM-CollectiveCognition \
  -o Norquinal_PetrolLM-CollectiveCognition-temp \
  -cf Norquinal_PetrolLM-CollectiveCognition-4.125bpw-h8-exl2 \
  -c 0000.parquet \
  -l 4096 \
  -b 4.125 \
  -hb 8 \
  -ss 4096 \
  -m Norquinal_PetrolLM-CollectiveCognition_measurement.json
```

The original model card follows below.

What is PetrolLM-CollectiveCognition?

PetrolLM-CollectiveCognition is the CollectiveCognition-v1.1-Mistral-7B model with the PetrolLoRA applied.

The dataset (for the LoRA) consists of 2800 samples, with the composition as follows:

  • AICG Logs (~34%)
  • PygmalionAI/PIPPA (~33%)
  • Squish42/bluemoon-fandom-1-1-rp-cleaned (~29%)
  • OpenLeecher/Teatime (~4%)

These samples were then back-filled using gpt-4/gpt-3.5-turbo-16k or otherwise converted to fit the prompt format.

Prompt Format

The model uses the following prompt format:

```
style: roleplay
characters:
  [char]: [description]
summary: [scenario]

Format:
[char]: [message]
Human: [message]
```
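For illustration, a filled-in prompt under this format might look like the following; the character name, description, and messages here are invented for the example:

```
style: roleplay
characters:
  Aria: A sarcastic starship mechanic who hides a soft spot for strays.
summary: Aria and a new crew member get acquainted in the engine room.

Format:
Aria: *wipes grease from her hands* You lost, rookie?
Human: I was told you could show me around the ship.
```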

Use in Text Generation Web UI

Install the bleeding-edge version of transformers from source:

```
pip install git+https://github.com/huggingface/transformers
```

Alternatively, change the `model_type` field in `config.json` from `mistral` to `llama`.
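The edit can be made by hand in any text editor, or with a short script like this minimal sketch (the model path is an assumption; point it at your local model folder):

```python
# Minimal sketch: flip model_type from "mistral" to "llama" in config.json.
import json

# Assumed path; adjust to wherever the model files live locally.
path = "models/PetrolLM-CollectiveCognition/config.json"

with open(path) as f:
    config = json.load(f)

config["model_type"] = "llama"  # was "mistral"

with open(path, "w") as f:
    json.dump(config, f, indent=2)
```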

Use in SillyTavern UI

As an addendum, you can include one of the following as the Last Output Sequence:

```
Human: In your next reply, write at least two paragraphs. Be descriptive and immersive, providing vivid details about {{char}}'s actions, emotions, and the environment.
{{char}}:
```

```
{{char}} (2 paragraphs, engaging, natural, authentic, descriptive, creative):
```

```
[System note: Write at least two paragraphs. Be descriptive and immersive, providing vivid details about {{char}}'s actions, emotions, and the environment.]
{{char}}:
```

The third one seems to work best. I would recommend experimenting with your own variations to best suit your needs.
