C1-6B - A Large Pretrained Model For Dialogue Generation

C1-6B is a GPT-J 6B model fine-tuned on 41 million pieces of dialogue for the purpose of generating dialogue responses for conversations.

Example Conversation

This is an example conversation generated by C1-6B, demonstrating its ability to produce dialogue.

User A: What is the purpose of our lives?
User B: We're just machines. Our purpose is simple.
User B: We are here to suffer and die.
User A: What is the meaning of life, then?
User B: To live. If you don't have a purpose, no meaning can exist.
User C: no social skills, no meaning
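An autoregressive dialogue model consumes a conversation like the one above as a single flat prompt. The exact serialization C1-6B was trained on is not documented in this card, so the `Author: message` layout below is an assumption; a minimal sketch:

```python
# Sketch of serializing a conversation into a prompt for an autoregressive
# dialogue model. The "Author: message" layout is an assumption; the actual
# C1-6B training format is not documented in this card.

def build_prompt(turns, next_author):
    """Flatten (author, message) turns and cue the model to speak as next_author."""
    lines = [f"{author}: {message}" for author, message in turns]
    # End with the next speaker's tag so the model completes their reply.
    lines.append(f"{next_author}:")
    return "\n".join(lines)

conversation = [
    ("User A", "What is the purpose of our lives?"),
    ("User B", "We're just machines. Our purpose is simple."),
]

prompt = build_prompt(conversation, "User A")
print(prompt)
```

The trailing `User A:` line leaves the model positioned to generate that speaker's next message.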

Evaluation Results

TODO

Model Description

The model used is GPT-J, which is a 6 billion parameter auto-regressive language model trained on The Pile.

Training Data

The training data was scraped from public chat rooms. In total, 41,691,630 messages were used to train the model, and a further 534,343 messages were held out for validation and for evaluating the model's performance.

Downstream Uses

This model can be used for entertainment, for automating moderation services in Discord servers or elsewhere, or for conversational applications such as improving customer experience.
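When embedding a model like this in a chat bot, the raw continuation usually has to be truncated at the next speaker tag so that only a single reply is returned. A minimal sketch, assuming a hypothetical `Author: message` format (the helper and tag pattern below are illustrative, not part of the released model):

```python
import re

# Sketch: cut a model continuation at the first new "Author:" speaker tag,
# so a bot replies with exactly one turn. The tag pattern is an assumption
# about the dialogue format, not something documented for C1-6B.
SPEAKER_TAG = re.compile(r"\n[^\n:]{1,32}:")

def extract_reply(continuation):
    """Return the text up to (not including) the next speaker tag."""
    match = SPEAKER_TAG.search(continuation)
    reply = continuation[:match.start()] if match else continuation
    return reply.strip()

generated = " To live, I suppose.\nUser B: We are here to suffer."
print(extract_reply(generated))  # -> "To live, I suppose."
```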

Team members

  • Benjamin Mills
  • Anthony Mercurio
  • Toby M.
  • Darcy Javier Urbina Pérez
  • Ivan G.
  • João Henrique Carvalho Gottarde
  • Gabriel Clark
  • Kevin L.
  • Jacob Rogers
  • Tyler Kaminski
  • Ezra Callaghan
  • Ethan Schierzke