---
license: apache-2.0
datasets:
  - teknium/OpenHermes-2.5
  - m-a-p/CodeFeedback-Filtered-Instruction
  - m-a-p/Code-Feedback
pipeline_tag: text-generation
---

# Wukong-0.1-Mistral-7B-v0.2

Join Our Discord! https://discord.gg/cognitivecomputations


Wukong-0.1-Mistral-7B-v0.2 is a dealigned chat finetune of the fantastic original Mistral-7B-v0.2 model by the Mistral team.

This model was trained on teknium's OpenHermes-2.5 dataset, code datasets from Multimodal Art Projection https://m-a-p.ai, and the Dolphin dataset from Cognitive Computations https://erichartford.com/dolphin 🐬

This model was trained for 3 epochs on 4x RTX 4090 GPUs.
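
## Usage

The model can be run with the standard Hugging Face transformers text-generation APIs. Below is a minimal sketch, not an official recipe: it assumes the RESMPDEV/Wukong-0.1-Mistral-7B-v0.2 repo id, a chat template in the tokenizer, and a GPU with enough memory for bfloat16 weights; adjust the repo id, dtype, and prompt formatting to your setup.

```python
# Minimal inference sketch (assumptions: repo id, chat template, bfloat16 GPU).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "RESMPDEV/Wukong-0.1-Mistral-7B-v0.2"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Build a chat-formatted prompt (assumes the tokenizer ships a chat template).
messages = [{"role": "user", "content": "Write a haiku about the Monkey King."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```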

## Example Outputs

TBD

Built with Axolotl