
databricks/dolly-v2-3b

This is the databricks/dolly-v2-3b model converted to the OpenVINO format for accelerated inference.
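For reference, a conversion like this can be reproduced with the `optimum-cli` export tool from `optimum-intel`; this is a sketch assuming `optimum-intel` is installed with its OpenVINO extras, and the output directory name is arbitrary:

```shell
# Export the original PyTorch model to OpenVINO IR format
# (requires: pip install optimum[openvino])
optimum-cli export openvino --model databricks/dolly-v2-3b dolly-v2-3b-ov
```

The resulting directory can then be loaded locally with `OVModelForCausalLM.from_pretrained`.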

An example of how to run inference with this model:

```python
from optimum.intel.openvino import OVModelForCausalLM
from transformers import AutoTokenizer, pipeline

# model_id should be set to either a local directory or a model available on the HuggingFace hub.
model_id = "katuni4ka/dolly-v2-3b-ov"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = OVModelForCausalLM.from_pretrained(model_id)
pipe = pipeline("text-generation", model=model, tokenizer=tokenizer)
result = pipe("hello world")
print(result)
```

A more detailed example of using the model in an instruction-following scenario can be found in this notebook.

