UltraLM-65b

This repository contains the delta weights of UltraLM-65b, a chat language model trained on UltraChat.

Model Details

Model Description

The model is fine-tuned from LLaMA-65b using the multi-turn chat template shown below:

User: instruction 1
Assistant: response 1<eos_token>
User: instruction 2
Assistant: response 2<eos_token>
...
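A minimal sketch of how a conversation can be assembled into this template; the helper name and the "</s>" default eos token are illustrative assumptions (the eos token must match the tokenizer actually used), not part of an official API.

# Illustrative prompt builder for the template above (not an official script).
def build_ultralm_prompt(turns, eos_token="</s>", system_prompt=None):
    """turns: list of (instruction, response) pairs; pass response=None for the
    final turn so the prompt ends with "Assistant: " and the model replies."""
    parts = []
    if system_prompt is not None:
        parts.append(f"User: {system_prompt}\n")
    for instruction, response in turns:
        parts.append(f"User: {instruction}\n")
        if response is None:
            parts.append("Assistant: ")
        else:
            parts.append(f"Assistant: {response}{eos_token}\n")
    return "".join(parts)

# Example: a single-turn prompt awaiting the model's response.
print(build_ultralm_prompt([("What is UltraChat?", None)]))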
  • License: UltraLM is based on LLaMA and should be used under LLaMA's model license.
  • Finetuned from model: LLaMA-65b
  • Finetuned on data: UltraChat

Model Sources

Uses

To use this model, first recover the full model weights by applying the delta weights on top of LLaMA-65b, then run inference using the template below:

[Optional]User: system prompt
User: user input
Assistant: 
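A minimal sketch of both steps, assuming the delta weights are simple additive parameter deltas over the original LLaMA-65b checkpoint and that both checkpoints load with Hugging Face transformers. The paths, generation settings, and the use of the delta repository's tokenizer are placeholder assumptions; if the project ships its own recovery script, prefer that.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

base_path = "path/to/llama-65b"       # original LLaMA-65b weights (placeholder path)
delta_path = "openbmb/UltraLM-65b"    # this repository's delta weights
output_path = "path/to/ultralm-65b"   # where the recovered model will be saved

# Step 1: recover the full model by adding each delta tensor to the matching
# base parameter (assumes identical parameter names and shapes).
base = AutoModelForCausalLM.from_pretrained(base_path, torch_dtype=torch.float16)
delta = AutoModelForCausalLM.from_pretrained(delta_path, torch_dtype=torch.float16)
delta_state = delta.state_dict()
for name, param in base.state_dict().items():
    param += delta_state[name]
base.save_pretrained(output_path)
# Assumes the delta repository also provides the tokenizer files.
AutoTokenizer.from_pretrained(delta_path).save_pretrained(output_path)

# Step 2: run inference with the chat template described above.
model = AutoModelForCausalLM.from_pretrained(
    output_path, torch_dtype=torch.float16, device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained(output_path)

prompt = "User: Write a one-sentence summary of UltraChat.\nAssistant: "
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
response = tokenizer.decode(
    output_ids[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
)
print(response)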