---
license: mit
---

## Quick start

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the model and tokenizer; trust_remote_code=True is required because the
# repository ships custom modeling code.
model = AutoModelForCausalLM.from_pretrained("npvinHnivqn/phi-1_5-CRL-v0.2", trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained("npvinHnivqn/phi-1_5-CRL-v0.2", trust_remote_code=True)

# Build the prompt in the <|USER|> format and generate up to 200 tokens.
inputs = tokenizer("<|USER|> Write a paragraph about animals", return_tensors="pt", return_attention_mask=False)
outputs = model.generate(**inputs, max_length=200)
text = tokenizer.batch_decode(outputs)[0]
print(text)
```
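
As an alternative to calling `generate` directly, the model can likely also be loaded through the `transformers` text-generation pipeline. The sketch below is a minimal example under the assumption that the pipeline handles this repository's custom code path in the same way; the `<|USER|>` prompt format is reused from the example above.

```python
from transformers import pipeline

# Assumption: the standard text-generation pipeline works with this model's
# custom code when trust_remote_code=True is set.
generator = pipeline(
    "text-generation",
    model="npvinHnivqn/phi-1_5-CRL-v0.2",
    trust_remote_code=True,
)

# Reuse the same <|USER|> prompt format as in the quick start example.
result = generator("<|USER|> Write a paragraph about animals", max_length=200)
print(result[0]["generated_text"])
```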