---
license: mit
datasets:
  - databricks/databricks-dolly-15k
language:
  - en
pipeline_tag: text-generation
tags:
  - dolly
  - dolly-v2
  - instruct
  - sharded
---

# dolly-v2-7b: sharded checkpoint

This is a sharded checkpoint (with ~4GB shards) of the [databricks/dolly-v2-7b](https://huggingface.co/databricks/dolly-v2-7b) model. Refer to the original model card for all details.

- This enables low-RAM loading, e.g., on Colab :)

## Basic Usage

Install `transformers`, `accelerate`, and `bitsandbytes`:

```bash
pip install -U -q transformers bitsandbytes accelerate
```

Load the model in 8-bit:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

model_name = "ethzanalytics/dolly-v2-7b-sharded"
tokenizer = AutoTokenizer.from_pretrained(model_name)

# load_in_8bit requires bitsandbytes; device_map="auto" lets accelerate
# place the quantized weights on the available device(s)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    load_in_8bit=True,
    device_map="auto",
)
```
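
Then run inference with `generate` as usual. A minimal sketch; the prompt text and sampling parameters below are illustrative, not from the original card:

```python
# Illustrative prompt; see the original dolly-v2-7b card for its
# recommended instruction format.
prompt = "Explain the difference between nuclear fission and fusion."

# Move the tokenized inputs to the same device as the model
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=128,
    do_sample=True,
    temperature=0.7,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```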