Mika (named after what my Claude 3 Opus chat called itself) is a model trained in a similar manner to Fett-uccine, with synthetic RP data created by Claude also included.

Format

I've had the best results with the ChatML context template and the Mistral Instruct template; however, YMMV. A hedged example is sketched below.
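
If you want to try the model directly with transformers rather than through a frontend like SillyTavern, a minimal sketch along these lines should work. The prompt follows the standard Mistral Instruct convention (`[INST] ... [/INST]`); the system line and sampling settings are illustrative assumptions, not recommended values, and you may get better results with ChatML-style markers instead.

```python
# Minimal sketch: load Mika-7B with transformers and prompt it in the
# Mistral Instruct style. Exact template preferences may vary (YMMV).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Epiculous/Mika-7B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # assumes a GPU with enough VRAM for fp16
    device_map="auto",
)

# Mistral Instruct style prompt; swap in ChatML markers
# (<|im_start|> ... <|im_end|>) if those work better for your setup.
prompt = "[INST] You are Mika, a playful roleplay partner. Greet the user in character. [/INST]"

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=200,
    do_sample=True,
    temperature=0.8,  # illustrative sampling settings only
)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```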
