GPT-2 pre-trained on ABC music

The gpt2-abc model is built on the GPT-2 architecture and trained on the Irishman dataset [1] for zero-shot music generation. It is tailored to generation tasks over single-part musical scores written in ABC notation, a simple yet powerful text representation commonly used to compose and share tunes. Leveraging GPT-2's natural-language processing and generation capabilities, the model can understand and produce music that follows the conventions of ABC notation, making it a practical tool for composing diverse musical pieces and an accessible starting point for music creators.
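
For instance, a complete single-part tune in ABC notation consists of a few header fields (index, title, meter, unit note length, key) followed by the melody line; the tune below is purely illustrative:

X:1
T:Example
M:4/4
L:1/8
K:C
C D E F G A B c | c B A G F E D C |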

Maintenance

# Clone the repository without downloading the LFS-tracked weight files
GIT_LFS_SKIP_SMUDGE=1 git clone git@hf.co:MuGeminorum/gpt2-abc
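
When the large weight files are actually needed, they can be fetched afterwards with git lfs pull (assuming git-lfs is installed):

cd gpt2-abc
git lfs pull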

Usage

With the adapter-transformers library, the adapter can be loaded onto a GPT-2 base model. The base checkpoint "gpt2" below is an assumption, as the card does not name it:

from transformers import AutoAdapterModel  # provided by the adapter-transformers package

# Assumption: the adapter was trained on the standard "gpt2" checkpoint
model = AutoAdapterModel.from_pretrained("gpt2")
adapter_name = model.load_adapter("MuGeminorum/gpt2-abc", source="hf")
model.set_active_adapters(adapter_name)
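
Once loaded, the model can continue an ABC prompt. The following is a minimal sketch: the header fields, the sampling settings, and the assumption that the adapter repository bundles a causal language-modeling head are illustrative rather than taken from the model card.

from transformers import AutoAdapterModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = AutoAdapterModel.from_pretrained("gpt2")
adapter_name = model.load_adapter("MuGeminorum/gpt2-abc", source="hf")
model.set_active_adapters(adapter_name)

# Seed generation with ABC header fields; the model continues the tune body
prompt = "X:1\nL:1/8\nM:6/8\nK:D\n"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=256,
    do_sample=True,
    top_p=0.95,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))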

Reference

[1] https://huggingface.co/datasets/sander-wood/irishman
