Model Details
Model Description
- Uses shenzhi-wang/Mistral-7B-v0.3-Chinese-Chat as the base model and fine-tunes it via unsloth on the dataset mentioned below, making the model uncensored.
Training Code
Training Procedure Raw Files
All training was performed on Vast.ai.
Hardware in Vast.ai:
GPU: 1x A100 SXM4 80GB
CPU: AMD EPYC 7513 32-Core Processor
RAM: 129 GB
Disk Space To Allocate: >150 GB
Docker Image: pytorch/pytorch:2.2.0-cuda12.1-cudnn8-devel
Download the ipynb file.
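The fine-tuning notebook itself is not reproduced here, but the core preprocessing step for chat-style SFT data can be sketched. This is a minimal example, assuming a ShareGPT-style dataset; the field names (`conversations`, `from`, `value`) and the `to_chat_messages` helper are illustrative assumptions, not code from the notebook.

```python
# Map ShareGPT-style speaker tags to the role names used by chat templates
# (assumption: the dataset follows the common ShareGPT layout).
ROLE_MAP = {"human": "user", "gpt": "assistant", "system": "system"}

def to_chat_messages(record):
    """Convert one dataset record into the role/content message list
    that tokenizer chat templates (and SFT utilities) expect."""
    return [
        {"role": ROLE_MAP[turn["from"]], "content": turn["value"]}
        for turn in record["conversations"]
    ]
```

Records in this shape can then be rendered with the tokenizer's chat template before training.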
Training Data
Base Model
Dataset
Usage
from transformers import pipeline
# This is a chat (causal LM) model, so use the text-generation pipeline
# rather than question-answering, which expects a question plus a context passage.
chat_model = pipeline("text-generation", model="stephenlzc/Mistral-7B-v0.3-Chinese-Chat-uncensored")
question = "How to make my girlfriend laugh? Please answer in Chinese."
messages = [{"role": "user", "content": question}]
print(chat_model(messages, max_new_tokens=256)[0]["generated_text"][-1]["content"])
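For scripting, the pipeline call can be wrapped in a small helper. This is a sketch, not part of the released model card code: the `ask` function and its `generator` parameter are assumptions added here so the model (or a stub) can be injected.

```python
from transformers import pipeline

MODEL_ID = "stephenlzc/Mistral-7B-v0.3-Chinese-Chat-uncensored"

def ask(question, generator=None, max_new_tokens=256):
    """Send a single-turn chat question and return the assistant's reply text."""
    if generator is None:
        # Loads the full 7B model; a GPU with roughly 16 GB of memory
        # is needed for fp16 weights.
        generator = pipeline("text-generation", model=MODEL_ID, device_map="auto")
    messages = [{"role": "user", "content": question}]
    out = generator(messages, max_new_tokens=max_new_tokens)
    # With chat-style input, generated_text is the message list
    # including the new assistant turn.
    return out[0]["generated_text"][-1]["content"]
```

Injecting the generator keeps the helper testable without downloading the model.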
Model tree for stephenlzc/Mistral-7B-v0.3-Chinese-Chat-uncensored
- Base model: mistralai/Mistral-7B-v0.3
- Finetuned from: mistralai/Mistral-7B-Instruct-v0.3