---
library_name: peft
license: apache-2.0
datasets:
- smangrul/chat-instruct-mixer
---
## Training procedure

Fine-tuned version of Falcon-180B using PEFT LoRA + DeepSpeed ZeRO3 + Flash Attention + Activation Checkpointing. Read the blog [Falcon 180B Finetuning using 🤗 PEFT and DeepSpeed]() for more information.

### Framework versions

- PEFT 0.6.0.dev0
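
### Usage

Below is a minimal sketch of how a PEFT LoRA adapter like this one could be loaded on top of the base model for inference. The adapter repo id is a hypothetical placeholder (substitute this model's id), and the base model is assumed to be `tiiuae/falcon-180B`; dtype and device placement are illustrative choices, not settings confirmed by this card.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_model_id = "tiiuae/falcon-180B"           # assumed base model
adapter_id = "your-username/falcon-180B-lora"  # hypothetical adapter repo id

tokenizer = AutoTokenizer.from_pretrained(base_model_id)

# Load the frozen base model, then attach the LoRA adapter weights with PEFT.
base_model = AutoModelForCausalLM.from_pretrained(
    base_model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)
model = PeftModel.from_pretrained(base_model, adapter_id)
model.eval()

prompt = "What is parameter-efficient fine-tuning?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```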