
dolly-v2-7b: sharded checkpoint


This is a sharded checkpoint (with ~4GB shards) of the databricks/dolly-v2-7b model. Refer to the original model card for all details.

  • this enables loading in low-RAM environments, e.g. a free Google Colab runtime :)

Basic Usage

Install transformers, accelerate, and bitsandbytes:

pip install -U -q transformers bitsandbytes accelerate

Load the model in 8-bit precision, then run inference:

from transformers import AutoTokenizer, AutoModelForCausalLM

model_name = "ethzanalytics/dolly-v2-7b-sharded"
tokenizer = AutoTokenizer.from_pretrained(model_name)

# load_in_8bit requires bitsandbytes; device_map="auto" requires accelerate
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    load_in_8bit=True,
    device_map="auto",
)
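dolly-v2 is an instruction-tuned model and expects a specific prompt template. A minimal sketch of building that prompt and generating a response; the template wording is taken from the databricks/dolly-v2-7b card, so verify it against the original before relying on it:

```python
# Prompt template used by dolly-v2 (per the databricks/dolly-v2-7b card;
# treat the exact wording as an assumption and verify upstream).
INTRO = ("Below is an instruction that describes a task. "
         "Write a response that appropriately completes the request.")

def build_prompt(instruction: str) -> str:
    return f"{INTRO}\n\n### Instruction:\n{instruction}\n\n### Response:\n"

prompt = build_prompt("Explain what a sharded checkpoint is.")

# With `tokenizer` and `model` loaded as in the snippet above:
# inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
# output = model.generate(**inputs, max_new_tokens=128)
# print(tokenizer.decode(output[0], skip_special_tokens=True))
```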
