---
base_model: ehartford/dolphin-2.1-mistral-7b
datasets:
- ehartford/dolphin
- jondurbin/airoboros-2.2.1
language:
- en
license: apache-2.0
model-index:
- name: mlc-chat-dolphin-2.2.1-mistral-7b
results: []
model_creator: Eric Hartford
model_name: WASM Dolphin 2.2.1
model_type: mistral
prompt_template: '<|im_start|>system
You are Dolphin, a helpful AI assistant.<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant'
---
# Dolphin 2.2.1 (Finetune of Mistral 7B) compiled for WebGPU - q4f32_1
- Original model: [Dolphin 2.1 🐬](https://huggingface.co/ehartford/dolphin-2.1-mistral-7b)
- Model creator: Eric Hartford ([https://erichartford.com/dolphin](https://erichartford.com/dolphin))
- Compiled by: Hrishi Olickel ([say hi on Twitter!](https://twitter.com/hrishioa))
# Demo
[You can try this model in the browser here.](https://wasmai.vercel.app)
## Description
This is a quantized (q4f32_1) version of Dolphin 2.1 🐬, one of the best finetunes of [Mistral-7B](https://huggingface.co/mistralai/Mistral-7B-v0.1) available, ready for in-browser inference over WebGPU.
Compiled with [mlc-llm](https://llm.mlc.ai/).
Very helpful direction was provided by [felladrin](https://github.com/felladrin)!
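For reference, below is a minimal sketch of how a build like this one can be loaded in the browser with the [web-llm](https://github.com/mlc-ai/web-llm) runtime. The `model_id`, weights URL, and compiled WASM library URL are placeholders (assumptions, not values from this repository), and the app-config field names vary between web-llm versions, so check them against the version you install.
```typescript
// Sketch only: browser-side loading of a q4f32_1 MLC build over WebGPU.
// All URLs and ids below are hypothetical placeholders, not values from this repo.
import { CreateMLCEngine } from "@mlc-ai/web-llm";

const appConfig = {
  model_list: [
    {
      model: "https://huggingface.co/<weights-repo>",               // assumed weights URL
      model_id: "dolphin-2.2.1-mistral-7b-q4f32_1",                 // assumed local id
      model_lib: "https://example.com/dolphin-q4f32_1-webgpu.wasm", // assumed compiled model library
    },
  ],
};

async function main() {
  // First run downloads the weights and sets up the WebGPU pipelines.
  const engine = await CreateMLCEngine("dolphin-2.2.1-mistral-7b-q4f32_1", { appConfig });

  // OpenAI-style chat API; web-llm applies the model's chat template (ChatML here).
  const reply = await engine.chat.completions.create({
    messages: [
      { role: "system", content: "You are Dolphin, a helpful AI assistant." },
      { role: "user", content: "Explain WebGPU in one sentence." },
    ],
  });
  console.log(reply.choices[0].message.content);
}

main();
```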
## Prompt template: Dolphin
This model (and all my future releases) uses the [ChatML](https://github.com/openai/openai-python/blob/main/chatml.md) prompt format:
```
<|im_start|>system
You are Dolphin, a helpful AI assistant.<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant
```
Example:
```
<|im_start|>system
you are an expert dolphin trainer<|im_end|>
<|im_start|>user
What is the best way to train a dolphin to obey me? Please answer step by step.<|im_end|>
<|im_start|>assistant
```
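If you drive the compiled model through a raw text-completion interface rather than a chat API, the ChatML wrapper can be assembled with a small helper like the hypothetical `buildChatMLPrompt` sketch below (not part of this repository or mlc-llm):
```typescript
// Sketch: build a ChatML prompt string matching the template documented above.
// `buildChatMLPrompt` is a hypothetical helper, not something shipped with this model.
function buildChatMLPrompt(system: string, user: string): string {
  return (
    `<|im_start|>system\n${system}<|im_end|>\n` +
    `<|im_start|>user\n${user}<|im_end|>\n` +
    `<|im_start|>assistant\n`
  );
}

// Reproduces the example above:
console.log(
  buildChatMLPrompt(
    "you are an expert dolphin trainer",
    "What is the best way to train a dolphin to obey me? Please answer step by step."
  )
);
```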