Tags: Text Generation · Transformers · mistral · conversational · Inference Endpoints · text-generation-inference · 4-bit precision

Mika (named after what my Claude 3 Opus chat called itself) is a model trained in a similar manner to Fett-uccine, with synthetic RP data created by Claude also included.

Format

I've had the best results with the ChatML context template and the Mistral Instruct template; however, YMMV.
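Since the card recommends the ChatML context template, here is a minimal sketch of how a ChatML prompt is laid out. The helper name and the system-prompt wording are illustrative, not part of this model card; in practice, `tokenizer.apply_chat_template()` from `transformers` would typically build this string for you.

```python
def build_chatml_prompt(messages):
    """Render a list of {role, content} dicts in ChatML format."""
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>")
    # Leave the assistant turn open so the model continues from here.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

prompt = build_chatml_prompt([
    {"role": "system", "content": "You are Mika, a helpful roleplay partner."},
    {"role": "user", "content": "Introduce yourself."},
])
print(prompt)
```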

