---
license: bigcode-openrail-m
---

## LoRDCoder v0 13.2B

Usage:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

device = "cuda"

model = AutoModelForCausalLM.from_pretrained("nolanoAI/lordcoder-v0-13-2B", trust_remote_code=True).to(device)
tokenizer = AutoTokenizer.from_pretrained("nolanoAI/lordcoder-v0-13-2B", trust_remote_code=True)

# Tokenize the prompt and move the input tensors to the same device as the model
inputs = {k: v.to(device) for k, v in tokenizer("# PyTorch CNN on MNIST\nimport torch\n", return_tensors="pt").items()}

generated_ids = model.generate(
    **inputs,
    use_cache=True,
    max_new_tokens=500,
    temperature=0.1,
    top_p=0.95,
    do_sample=True,
    eos_token_id=tokenizer.eos_token_id,
    pad_token_id=tokenizer.eos_token_id,
)

# Decode the generated token ids back to text
print(tokenizer.batch_decode(generated_ids, skip_special_tokens=True)[0])
```