DISCLAIMER: I do not own the weights of this model; they are the property of Microsoft and were taken from their official repository: microsoft/phi-2. The sole purpose of this repository is to make the model loadable and usable through the HuggingFace transformers library.
Usage
First, make sure you have the latest version of the transformers library installed:
pip uninstall -y transformers && pip install git+https://github.com/huggingface/transformers
Then load the model and tokenizer with the transformers library:
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the model and tokenizer from this repository
model = AutoModelForCausalLM.from_pretrained("susnato/phi-2")
tokenizer = AutoTokenizer.from_pretrained("susnato/phi-2")

# Tokenize a code-completion prompt
inputs = tokenizer('''def print_prime(n):
   """
   Print all primes between 1 and n
   """''', return_tensors="pt", return_attention_mask=False)

# Generate up to 200 tokens and decode the result
outputs = model.generate(**inputs, max_length=200)
text = tokenizer.batch_decode(outputs)[0]
print(text)
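Phi-2 is a base model, so it will often keep generating past the point where the prompted function is complete. A minimal post-processing sketch (the helper name and stop strings below are my own choices, not part of the model card) that trims the decoded text at the first stop string:

```python
def truncate_at_stop(text: str, stop_strings=("\ndef ", "\nclass ")) -> str:
    """Return `text` cut at the earliest occurring stop string, if any."""
    cut = len(text)
    for stop in stop_strings:
        idx = text.find(stop)
        if idx != -1:
            cut = min(cut, idx)
    return text[:cut]

# Example: output that runs into a second definition gets trimmed.
generated = "def print_prime(n):\n    ...\ndef another():\n    pass"
print(truncate_at_stop(generated))  # keeps only the first definition
```

Passing the trimmed text back through `print` gives just the completed function, which is usually what you want when using the model for code completion.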