---
title: "VampNet: Music Generation with Masked Transformers"
emoji: 🤖
colorFrom: gray
colorTo: gray
sdk: gradio
sdk_version: 3.36.1
app_file: app.py
pinned: false
---

# VampNet

This repository contains recipes for training generative music models on top of the Lyrebird Audio Codec.

# Setting up

Requires Python 3.9 or later. 
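
You may want to work inside a virtual environment (optional; a standard venv sketch, not something this repo requires):

```bash
# optional: create and activate an isolated Python 3.9+ environment
python3 -m venv venv
source venv/bin/activate
```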


Install VampNet:

```bash
git clone https://github.com/hugofloresgarcia/vampnet.git
pip install -e ./vampnet
```

## A note on argbind
This repository relies on [argbind](https://github.com/pseeth/argbind) to manage CLIs and config files. 
Config files are stored in the `conf/` folder. 
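
In practice, the commands below point argbind at one of those files with `--args.load`, and any bound argument can also be overridden directly on the command line. A small sketch, using only the flags that appear later in this README:

```bash
# load a config file, then override one bound argument from the CLI
python scripts/exp/train.py --args.load conf/vampnet.yml --save_path runs/my_run
```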

## Getting the Pretrained Models

### Licensing for Pretrained Models: 
The weights for the models are licensed [`CC BY-NC-SA 4.0`](https://creativecommons.org/licenses/by-nc-sa/4.0/deed.ml). Likewise, any VampNet models fine-tuned on the pretrained models are also licensed [`CC BY-NC-SA 4.0`](https://creativecommons.org/licenses/by-nc-sa/4.0/deed.ml).

Download the pretrained models from [this link](https://zenodo.org/record/8136545). Then, extract the models to the `models/` folder. 
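
For example (the archive name below is a placeholder for whatever you download from the Zenodo record; adjust the extraction path if the archive already contains a top-level folder):

```bash
mkdir -p models
# replace <downloaded_archive>.zip with the actual file from Zenodo
unzip <downloaded_archive>.zip -d models/
```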


# Usage

## Launching the Gradio Interface
You can launch a Gradio UI to play with VampNet:

```bash
python app.py --args.load conf/interface.yml --Interface.device cuda
```
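
If you don't have a GPU available, the interface will likely also run (more slowly) on CPU, assuming the device string is handed straight to torch:

```bash
python app.py --args.load conf/interface.yml --Interface.device cpu
```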

# Training / Fine-tuning 

## Training a model

To train a model, run the following script: 

```bash
python scripts/exp/train.py --args.load conf/vampnet.yml --save_path /path/to/checkpoints
```

You can edit `conf/vampnet.yml` to change the dataset paths or any training hyperparameters. 
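
For example, you might keep the stock config intact and train from an edited copy (a sketch; the available keys are whatever `conf/vampnet.yml` defines):

```bash
cp conf/vampnet.yml conf/my_run.yml
# edit conf/my_run.yml (dataset paths, hyperparameters), then train from it
python scripts/exp/train.py --args.load conf/my_run.yml --save_path runs/my_run
```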

For coarse2fine models, you can use `conf/c2f.yml` as a starting configuration. 

See `python scripts/exp/train.py -h` for a list of options.

## Fine-tuning
To fine-tune a model, use the script in `scripts/exp/fine_tune.py` to generate 3 configuration files: `c2f.yml`, `coarse.yml`, and `interface.yml`. 
`coarse.yml` and `c2f.yml` are used to fine-tune the coarse and coarse-to-fine models, respectively; `interface.yml` is used to launch the Gradio interface.

```bash
python scripts/exp/fine_tune.py "/path/to/audio1.mp3 /path/to/audio2/ /path/to/audio3.wav" <fine_tune_name>
```

This will create a folder under `conf/<fine_tune_name>/` with the 3 configuration files.

The `save_path` for each job will be set to `runs/<fine_tune_name>/coarse` and `runs/<fine_tune_name>/c2f`, respectively. 

Launch the coarse job: 
```bash
python scripts/exp/train.py --args.load conf/<fine_tune_name>/coarse.yml 
```

This will save the coarse model to `runs/<fine_tune_name>/coarse/ckpt/best/`.

Launch the c2f job: 
```bash
python scripts/exp/train.py --args.load conf/<fine_tune_name>/c2f.yml 
```

Launch the interface: 
```bash
python demo.py --args.load conf/generated/<fine_tune_name>/interface.yml 
```