
Chizuru Ichinose as a DialoGPT model

This model is a fine-tuned version of DialoGPT-medium trained on the Chizuru Ichinose conversational dataset.

We recommend using the Transformers conversational pipeline to keep conversation context with minimal effort when running this or any DialoGPT model:

from transformers import AutoTokenizer, AutoModelForCausalLM, pipeline, Conversation

# load tokenizer and model
tokenizer = AutoTokenizer.from_pretrained("alexandreteles/GPTChizuru")
model = AutoModelForCausalLM.from_pretrained("alexandreteles/GPTChizuru")

# create pipeline
pipe = pipeline(task="conversational", model=model, tokenizer=tokenizer)

# generate response
print(pipe(Conversation("How are you?")))
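Note that the conversational pipeline and the Conversation class were deprecated and later removed from Transformers, so the snippet above requires an older release of the library. If the pipeline is unavailable in your version, the usual DialoGPT chat-history pattern with model.generate works as well. The sketch below is a minimal example of that pattern; the sampling parameters (max_length, top_p, top_k) are illustrative defaults, not values tuned for this model:

from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

tokenizer = AutoTokenizer.from_pretrained("alexandreteles/GPTChizuru")
model = AutoModelForCausalLM.from_pretrained("alexandreteles/GPTChizuru")

chat_history_ids = None
for step in range(3):
    user_input = input(">> You: ")
    # encode the user input and append the end-of-sequence token
    new_input_ids = tokenizer.encode(user_input + tokenizer.eos_token, return_tensors="pt")
    # append the new input to the running chat history to keep context
    bot_input_ids = (
        torch.cat([chat_history_ids, new_input_ids], dim=-1)
        if chat_history_ids is not None
        else new_input_ids
    )
    # generate a response; the returned tensor becomes the history for the next turn
    chat_history_ids = model.generate(
        bot_input_ids,
        max_length=1000,
        pad_token_id=tokenizer.eos_token_id,
        do_sample=True,
        top_p=0.95,
        top_k=50,
    )
    # decode only the newly generated tokens (everything after the input)
    print("Chizuru:", tokenizer.decode(chat_history_ids[:, bot_input_ids.shape[-1]:][0], skip_special_tokens=True))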
Model size: 380M parameters (stored as Safetensors with F32 and U8 tensors)
