UltraLM-65b

These are the delta weights for UltraLM-65b, a chat language model trained on UltraChat.

Model Details

Model Description

The model is fine-tuned from LLaMA-65b with the multi-turn chat-format template shown below:

User: instruction 1
Assistant: response 1<eos_token>
User: instruction 2
Assistant: response 2<eos_token>
...
  • License: UltraLM is based on LLaMA and should be used under LLaMA's model license.
  • Finetuned from model: LLaMA-65b
  • Finetuned on data: UltraChat
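The template above can be rendered programmatically. The sketch below is a hypothetical helper (`build_prompt` is not part of the released code), and the `</s>` end-of-sequence token is an assumption; substitute your tokenizer's actual `eos_token`.

```python
# Hypothetical helper that renders UltraLM's multi-turn chat template.
# Assumption: "</s>" stands in for the tokenizer's real eos_token.
def build_prompt(turns, system_prompt=None, eos_token="</s>"):
    """turns: list of (instruction, response) pairs; a response of None
    marks the turn the model should complete."""
    parts = []
    if system_prompt:
        parts.append(f"User: {system_prompt}")
    for instruction, response in turns:
        parts.append(f"User: {instruction}")
        if response is None:
            parts.append("Assistant: ")  # generation continues from here
        else:
            parts.append(f"Assistant: {response}{eos_token}")
    return "\n".join(parts)

prompt = build_prompt([
    ("Hello!", "Hi, how can I help?"),
    ("Tell me a joke.", None),
])
print(prompt)
```

Each completed assistant turn is terminated with the eos token, while the final `Assistant: ` line is left open for the model to continue.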

Uses

To use this model, first recover the full model by applying the delta weights to the original LLaMA-65b weights, then run inference with the following template:

[Optional]User: system prompt
User: user input
Assistant: 
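Recovering the full model means adding each delta tensor to the corresponding base LLaMA-65b tensor. The real checkpoints are torch state dicts and the release may ship its own conversion script; the sketch below only illustrates the elementwise-sum idea, with plain Python lists standing in for tensors and a hypothetical `apply_delta` helper.

```python
# Minimal sketch of delta-weight recovery: full = base + delta,
# applied parameter by parameter. Lists of floats stand in for
# torch tensors; apply_delta is a hypothetical helper, not the
# official conversion script.
def apply_delta(base_state, delta_state):
    assert base_state.keys() == delta_state.keys(), "checkpoints must have matching keys"
    return {
        name: [b + d for b, d in zip(base_state[name], delta_state[name])]
        for name in base_state
    }

base  = {"layer0.weight": [0.1, -0.2], "layer0.bias": [0.0, 0.5]}
delta = {"layer0.weight": [0.05, 0.1], "layer0.bias": [0.2, -0.5]}
full = apply_delta(base, delta)
print(full["layer0.weight"])  # elementwise sum of base and delta
```

With real checkpoints the same loop would run over `state_dict()` tensors (`full[name] = base[name] + delta[name]`) before saving the merged model for inference.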