---
title: Augmented Poetry
emoji: ⚡
colorFrom: red
colorTo: gray
sdk: gradio
sdk_version: 3.4
app_file: app.py
pinned: false
---

Ideas:

- fine-tune a large language model (LLM) on the text corpus of a specific poet (see the fine-tuning sketch below)
- select a certain rhyme from the Gutenberg poetry corpus and fine-tune on that subset
- try fine-tuning on a few lines of a poem that Eva has started
- run in a Docker container and transfer to another machine

Open question: would a sequence-to-sequence transformer trained on successive lines of the poetry corpus work better? (A sketch of that variant also appears below.)

Language-modeling fine-tuning examples in transformers:
https://github.com/huggingface/transformers/tree/main/examples/pytorch/language-modeling

Note that merve/poetry only has 573 rows.

TODO:

- upload the Gutenberg poetry corpus to the Hugging Face Hub
- ask the person who made it

## Research

- Implement language generation with a basic transformer.
- Gutenberg Poetry Autocomplete: a search-engine-like interface for writing poems mined from Project Gutenberg. (A poem written using this interface was recently published in the Indianapolis Review!)
- https://ymeadows.com/en-articles/fine-tuning-transformer-based-language-models
- https://thegradient.pub/prompting/
- https://towardsdatascience.com/fine-tuning-for-domain-adaptation-in-nlp-c47def356fd6
- https://ruder.io/recent-advances-lm-fine-tuning/
- https://streamlit.io/
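
## Sketches

A minimal causal-LM fine-tuning sketch, in the spirit of the transformers language-modeling examples linked above. It assumes the poem text in merve/poetry lives in a `content` column, uses `gpt2` as a stand-in base checkpoint, and picks placeholder hyperparameters; all of those are assumptions to revisit, not settled choices.

```python
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

# Stand-in base model; a larger checkpoint could be swapped in later.
model_name = "gpt2"
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 ships without a pad token
model = AutoModelForCausalLM.from_pretrained(model_name)

# Assumption: the poem text is in the "content" column (only 573 rows).
poems = load_dataset("merve/poetry", split="train")

def tokenize(batch):
    return tokenizer(batch["content"], truncation=True, max_length=256)

tokenized = poems.map(tokenize, batched=True, remove_columns=poems.column_names)

# mlm=False gives the plain next-token (causal LM) objective.
collator = DataCollatorForLanguageModeling(tokenizer, mlm=False)

args = TrainingArguments(
    output_dir="poetry-gpt2",           # placeholder output path
    num_train_epochs=3,                 # placeholder hyperparameters
    per_device_train_batch_size=4,
    learning_rate=5e-5,
    save_total_limit=1,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=collator,
)
trainer.train()
trainer.save_model("poetry-gpt2")
```

With only 573 poems, a few epochs on a small checkpoint is probably the right scale to start with before moving to the full Gutenberg poetry corpus.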
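
The sequence-to-sequence variant from the open question above would train on (line, next line) pairs instead of raw poem text. A rough sketch, again assuming the `content` column and using `t5-small` as a placeholder encoder-decoder checkpoint:

```python
from datasets import Dataset, load_dataset
from transformers import (
    AutoModelForSeq2SeqLM,
    AutoTokenizer,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

model_name = "t5-small"  # placeholder encoder-decoder checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

poems = load_dataset("merve/poetry", split="train")

# Build (line, following line) pairs within each poem.
pairs = {"source": [], "target": []}
for poem in poems["content"]:
    lines = [line.strip() for line in poem.splitlines() if line.strip()]
    for current, following in zip(lines, lines[1:]):
        pairs["source"].append(current)
        pairs["target"].append(following)

pair_dataset = Dataset.from_dict(pairs)

def preprocess(batch):
    model_inputs = tokenizer(batch["source"], truncation=True, max_length=64)
    labels = tokenizer(text_target=batch["target"], truncation=True, max_length=64)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized = pair_dataset.map(
    preprocess, batched=True, remove_columns=["source", "target"]
)

collator = DataCollatorForSeq2Seq(tokenizer, model=model)

args = Seq2SeqTrainingArguments(
    output_dir="poetry-next-line-t5",   # placeholder output path
    num_train_epochs=3,                 # placeholder hyperparameters
    per_device_train_batch_size=8,
    learning_rate=3e-4,
    save_total_limit=1,
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=collator,
)
trainer.train()
```

Whether this beats the causal-LM approach is exactly the open question; the pairing step is mainly here to show what the training data would look like.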
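
The front matter declares a Gradio Space with `app_file: app.py`. A minimal `app.py` that would match it could look like the following; the `gpt2` checkpoint is a placeholder until one of the fine-tuned models above is pushed to the Hub.

```python
import gradio as gr
from transformers import pipeline

# Placeholder checkpoint; point this at the fine-tuned poetry model once it exists.
generator = pipeline("text-generation", model="gpt2")

def continue_poem(prompt):
    # Sample a continuation and return the prompt plus generated lines.
    result = generator(prompt, max_length=120, do_sample=True, top_p=0.9)
    return result[0]["generated_text"]

demo = gr.Interface(
    fn=continue_poem,
    inputs=gr.Textbox(lines=4, label="Lines to start from"),
    outputs=gr.Textbox(lines=8, label="Augmented poem"),
    title="Augmented Poetry",
)

if __name__ == "__main__":
    demo.launch()
```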