---
language:
  - en
tags:
  - openvino
---

# Salesforce/codegen2-1B

This is the Salesforce/codegen2-1B model converted to OpenVINO for accelerated inference.
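
For reference, a similar conversion can be done with Optimum Intel's export support. The sketch below is only an illustration of how such an export might look, not necessarily how this repository was produced; the output directory name is illustrative, and the source model may require `trust_remote_code=True`:

```python
from transformers import AutoTokenizer
from optimum.intel.openvino import OVModelForCausalLM

# Load the original PyTorch checkpoint and export it to OpenVINO IR on the fly
model = OVModelForCausalLM.from_pretrained("Salesforce/codegen2-1B", export=True)
tokenizer = AutoTokenizer.from_pretrained("Salesforce/codegen2-1B")

# Save the converted model and tokenizer locally (they can then be pushed to the Hub)
model.save_pretrained("Salesforce-codegen2-1B-ov")
tokenizer.save_pretrained("Salesforce-codegen2-1B-ov")
```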

An example of how to run inference with this model:

```python
from transformers import AutoTokenizer
from optimum.intel.openvino import OVModelForCausalLM

# Load the tokenizer and the OpenVINO model from this repository
tokenizer = AutoTokenizer.from_pretrained("helenai/Salesforce-codegen2-1B-ov")
model = OVModelForCausalLM.from_pretrained("helenai/Salesforce-codegen2-1B-ov")

# Generate a completion for a code prompt
text = "def hello_world():"
input_ids = tokenizer(text, return_tensors="pt").input_ids
generated_ids = model.generate(input_ids, max_length=128)
print(tokenizer.decode(generated_ids[0], skip_special_tokens=True))
```