---
datasets:
- SirNeural/flan_v2
metrics:
- perplexity
tags:
- flan
- opt
- peft
---

## FLAN-OPT-6.7b-LoRA

OPT was first introduced in [Open Pre-trained Transformer Language Models](https://arxiv.org/abs/2205.01068) and first released in [metaseq's repository](https://github.com/facebookresearch/metaseq) on May 3rd, 2022 by Meta AI.

This model is [facebook/opt-6.7b](https://hf.co/facebook/opt-6.7b) finetuned with [low-rank adapters](https://arxiv.org/abs/2106.09685) on the [FLAN](https://arxiv.org/pdf/2210.11416.pdf) datasets.

The low-rank adapters (r=16) were finetuned over 1.6M new tokens of a FLAN task mixture, with the start of each example cut off if it was too long to fit within a 256-token context. The model reaches a training perplexity of 4.36 and an evaluation perplexity of 4.32.

### Inference Example (Chain-of-Thought prompt)

```python
# %pip install -qq transformers git+https://github.com/huggingface/peft accelerate bitsandbytes
import torch
from peft import PeftModel, PeftConfig
from transformers import AutoModelForCausalLM, AutoTokenizer

peft_model_id = "crumb/FLAN-OPT-6.7b-LoRA"

# Load the 8-bit base model and attach the trained LoRA adapters
config = PeftConfig.from_pretrained(peft_model_id)
model = AutoModelForCausalLM.from_pretrained(
    config.base_model_name_or_path,
    load_in_8bit=True,
    low_cpu_mem_usage=True,
    device_map="auto",
)
model = PeftModel.from_pretrained(model, peft_model_id)
tokenizer = AutoTokenizer.from_pretrained(config.base_model_name_or_path)

# Few-shot chain-of-thought prompt
prompt = """
Q: Answer the following yes/no question by reasoning step-by-step. Could a dandelion suffer from hepatitis?
A: Hepatitis only affects organisms with livers. Dandelions don't have a liver. The answer is no.
Q: Answer the following yes/no question by reasoning step-by-step. Can you write a whole Haiku in a single tweet?
A: A haiku is a japanese three-line poem. That is short enough to fit in 280 characters. The answer is yes.
Q: Answer the following yes/no question by reasoning step-by-step. Can you reach space with a Cessna?
A:
""".strip()

inputs = tokenizer([prompt], return_tensors="pt")
with torch.autocast("cuda", dtype=torch.float16):
    outputs = model.generate(
        input_ids=inputs.input_ids.cuda(),
        attention_mask=inputs.attention_mask.cuda(),
        max_new_tokens=32,
        top_p=0.95,
        temperature=0.5,
        do_sample=True,
    )

# Print the prompt plus only the first newly generated answer line
print("\n".join(tokenizer.decode(outputs[0]).split("\n")[:prompt.count("\n") + 1]))
# A Cessna is a small plane. A small plane can't get into space. The answer is no.
```
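### LoRA Finetuning Sketch (illustrative)

The card only specifies the adapter rank (r=16) and that over-long examples were left-truncated to a 256-token context; the remaining hyperparameters below (`lora_alpha`, `lora_dropout`, `target_modules`, the base-model loading options) are assumptions for illustration, not the exact training configuration. A minimal sketch of attaching such adapters with `peft`:

```python
# Illustrative sketch only: r=16 and the 256-token left-truncation come from the card above;
# every other hyperparameter here is an assumption.
from peft import LoraConfig, TaskType, get_peft_model
from transformers import AutoModelForCausalLM, AutoTokenizer

base_model = AutoModelForCausalLM.from_pretrained("facebook/opt-6.7b")
tokenizer = AutoTokenizer.from_pretrained("facebook/opt-6.7b")

lora_config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=16,                                 # stated in the card
    lora_alpha=32,                        # assumption
    lora_dropout=0.05,                    # assumption
    target_modules=["q_proj", "v_proj"],  # assumption (typical choice for OPT attention)
)
model = get_peft_model(base_model, lora_config)
model.print_trainable_parameters()  # only the adapter weights are trainable

def encode(example_text: str) -> list[int]:
    # Keep only the last 256 tokens, cutting off the start of oversized examples
    ids = tokenizer(example_text).input_ids
    return ids[-256:]
```

Training would then proceed with a standard causal-LM loss over the FLAN task mixture, updating only the adapter weights while the 6.7B base model stays frozen.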