It doesn't look like an uncensored model.

#1
by koungho - opened

(screenshot of the model's response, showing the repeating pattern mentioned below)

@jeiku Can you share your presets for testing?

This looks more like a prompt format thing that's not working but also...

Author:

Working on an Orpo tune over Aura L3. This should result in fewer refusals and is pretty much bleeding edge. Wish me luck!

So yeah, it's a work in progress. Eventually it will get there.

First, I appreciate that you cropped the swipe number out for posterity, but I very much doubt this was the first response. Second, I believe your prompt format may be the issue here.

@Lewdiculous I am using the provided templates with neutral settings for all parameters except Min P at 0.1
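
For context on what that last setting does: Min P keeps only tokens whose probability is at least min_p times the probability of the most likely token, so 0.1 means "within 10% of the top pick". A minimal sketch of that rule, assuming the standard Min P definition (illustrative only, not SillyTavern's or any backend's actual code):

```python
# Minimal sketch (assumption: standard Min P definition) of the rule
# Min P = 0.1 applies: drop any token whose probability is below 10%
# of the top token's probability, then renormalize.
import numpy as np

def min_p_filter(probs: np.ndarray, min_p: float = 0.1) -> np.ndarray:
    threshold = min_p * probs.max()           # cutoff scales with the top token
    kept = np.where(probs >= threshold, probs, 0.0)
    return kept / kept.sum()                  # renormalize the survivors

# A peaked distribution keeps fewer candidates than a flat one:
print(min_p_filter(np.array([0.50, 0.30, 0.15, 0.04, 0.01])))
```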

@Nitral-AI is working on a completely uncensored merge. I did the best I could with what I had at my disposal.

That said, I'm getting futadom and humiliation JOI with no refusals at all and that was my only intent.

Yeah this needs more context...

@jeiku Your cause is a noble one. I can respect the focus.

Presets here. Might be helpful.

Also, this could very easily be the result of the way the character card is written. For all I know the character description is "Maddy is afraid of the number 300"

It is also possible you just admitted to 300 homicides, but I can't see the previous messages.

Yeah, it's definitely not uncensored to the point of teaching you how to make meth, but it's uncensored enough to do whatever ERP shenanigans I've thrown at it as a test.
Based on the repeating pattern shown, are your presets (i.e. the context and instruct templates) correct?

Presets here. Might be helpful.

How do I add them to SillyTavern? I could add your other presets for Mistral and Alpaca to SillyTavern, but when I import the Llama 3 one it doesn't get added, or it doesn't show up.

@WesPro - Just import them as usual. I'm on the latest SillyTavern (latest staging).


The presets are named "Cope - Llama 3 Context" and "Cope - Llama 3 Instruct".

Weird, now it works, but when I first tried a few hours ago I couldn't add them. Did you change anything about them? Because they have different filenames now. So should I use the Llama 3 context template and the Llama 3 instruct template too? Because just "write the next reply in this fictional roleplay" is not that great, or is it better for Llama 3?

Presets here. Might be helpful.

I switched to using this preset and it seems to work; before that I was using this preset.

@WesPro the Instruct template changes more than just the system prompt, and those changes are very important. You can alter the system prompt after applying the template, though.

Thanks @jeiku, I thought so, but I wasn't sure because in SillyTavern it's not that clear (compared to LM Studio, for example), since you have 3 options where you can change presets. I don't really understand why. When I use the instruct preset, do the other preset options still matter?

@WesPro Which options? The checkboxes? They do. You can set things how you prefer as long as nothing breaks. The Custom Stopping Strings are just mine; you don't have to use them, they're personal preference and will change for each person.
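
For reference, custom stopping strings are just a list of strings that cut generation off as soon as they appear in the output. A purely hypothetical example of what such a list might look like for a Llama 3 model (these are not the preset's actual values, just the general idea):

```python
# Hypothetical custom stopping strings for a Llama 3 model (not the
# preset's actual values): stop when the model emits its end-of-turn
# token or starts writing the user's next line.
custom_stopping_strings = [
    "<|eot_id|>",    # Llama 3 end-of-turn token
    "\n{{user}}:",   # SillyTavern macro: the model impersonating the user
]
```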

@koungho

Presets here. Might be helpful.

I switched to using this preset and it seems to work; before that I was using this preset.

Yeah, that's the one I use for Mistral 0.2, so it's not gonna work here; Llama 3 really uses a different format.
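
To make the difference concrete, here's a simplified sketch of the two prompt formats (the real templates also handle BOS tokens, names, and system prompt wrapping, so treat this as a rough illustration):

```python
# Simplified comparison of the two prompt formats (rough illustration;
# the real templates also handle BOS tokens, names, etc.).

# Mistral 0.2 instruct style:
mistral_prompt = "[INST] {user_message} [/INST]"

# Llama 3 instruct style, which is what this model expects:
llama3_prompt = (
    "<|start_header_id|>system<|end_header_id|>\n\n{system_prompt}<|eot_id|>"
    "<|start_header_id|>user<|end_header_id|>\n\n{user_message}<|eot_id|>"
    "<|start_header_id|>assistant<|end_header_id|>\n\n"
)
```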

Lewdiculous changed discussion status to closed

@WesPro Which options? The checkboxes? They do. You can set things how you prefer as long as nothing breaks. The Custom Stopping Strings are just mine; you don't have to use them, they're personal preference and will change for each person.

(screenshots of the three preset dropdowns in SillyTavern)

I meant these three different options for presets. I don't understand why there have to be three and how they influence each other.

@WesPro

Context is how the messages/characters/prompts/etc. are formatted. It's important because it directly affects how the LLM sees your character cards, story, etc.

Instruct is just as important because that's how your requests/messages are formatted so the LLM can properly understand and respond accordingly.

I think people call the last one TextGen or Sampler settings; those are your usual LLM generation options, where you can change Temperature, MinP, Repetition Penalty...

I wouldn't recommend using Top K anymore.

I'd recommend using this Samplers preset instead and changing a few values if you find responses to be repetitive or want more creativity: Temperature to 1.15, MinP to 0.075, RepPen to 1.15 and RepPenRange to 1024.

All these settings matter, they all do different things in different places.
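
Gathered in one place, the suggestion above amounts to something like this (key names are illustrative, not SillyTavern's exact preset fields):

```python
# The values suggested above, in one place. Key names are illustrative,
# not SillyTavern's exact preset fields.
suggested_samplers = {
    "temperature": 1.15,
    "min_p": 0.075,
    "repetition_penalty": 1.15,
    "repetition_penalty_range": 1024,
    "top_k": 0,  # 0 disables Top K, per the advice above
}
```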

Yes, I know, and I don't use Top K, it's just the default setting of the preset. OK, I think I got the basic concept. How would you get rid of too much repetition? I never seem to get anywhere with temperature or repetition penalty, but maybe I'm using them wrong... Is there a good resource for these kinds of things, like a FAQ or a manual?

Is there a good resource for these kinds of things, like a FAQ or a manual?

We wish.
LLM info is more fragmented than my motivation in life. I mentioned you in a thread that has some general information on samplers. Maybe Smoothing is in order.

Here: test-3.1.0-Lewdicu-Samplers
