---
license: other
language:
- en
tags:
- art
---
# llama2-piratelora-13b
This repo contains a Low-Rank Adapter (LoRA) for Llama 2 13B (float16), fit on a simple dataset composed of thousands of pirate phrases, conversation pieces, and obscure terms. The purpose behind generating this LoRA was to determine whether enforcement of dialect and diction is possible through LoRA fine-tuning. Results were less than perfect, but the LoRA does seem to push the model to stick to maritime and nautical topics when spontaneously prompted to generate.
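
Below is a minimal sketch of how an adapter like this could be loaded on top of a float16 base model using the 🤗 Transformers and PEFT libraries. The base model path and adapter repo ID are placeholders, not confirmed identifiers from this repo; substitute the actual locations before running.

```python
# Sketch: attaching this LoRA adapter to a Llama 2 13B base model.
# Both model IDs below are assumptions -- replace them with the real paths.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_model_id = "meta-llama/Llama-2-13b-hf"          # assumed base model
adapter_id = "your-username/llama2-piratelora-13b"   # assumed adapter repo

tokenizer = AutoTokenizer.from_pretrained(base_model_id)
base_model = AutoModelForCausalLM.from_pretrained(
    base_model_id,
    torch_dtype=torch.float16,  # the adapter was fit against a float16 base
    device_map="auto",
)

# Load the LoRA weights and wrap the base model with them.
model = PeftModel.from_pretrained(base_model, adapter_id)

prompt = "Tell me about the weather today."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

With the adapter attached, even a neutral prompt like the one above should tend to drift toward maritime and nautical phrasing, which is the behavior described above.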