Commit f459e40 (1 parent: a4cb838)

Update README.md

README.md CHANGED
```diff
@@ -12,11 +12,11 @@ Traditional deep learning often overlooks bytes, the basic units of the digital
 
 We provide five weights of bGPT on [Hugging Face](https://huggingface.co/sander-wood/bgpt/tree/main) corresponding to each dataset used for pre-training:
 
-1. **_weights-conversion.pth_**: bGPT pre-trained on IrishMAN for data conversion.
-2. **_weights-cpu.pth_**: bGPT pre-trained on CPU states for CPU state modelling.
-3. **_weights-text.pth_**: bGPT pre-trained on Wikipedia for text generation/classification.
-4. **_weights-image.pth_**: bGPT pre-trained on ImageNet for image generation/classification.
-5. **_weights-audio.pth_**: bGPT pre-trained on Librispeech for audio generation/classification.
+1. **_weights-conversion.pth_**: bGPT pre-trained on IrishMAN for data conversion (between `.abc` and `.mid`).
+2. **_weights-cpu.pth_**: bGPT pre-trained on CPU states for CPU state modelling (`.bin`).
+3. **_weights-text.pth_**: bGPT pre-trained on Wikipedia for text generation/classification (`.txt`).
+4. **_weights-image.pth_**: bGPT pre-trained on ImageNet for image generation/classification (`.bmp`).
+5. **_weights-audio.pth_**: bGPT pre-trained on Librispeech for audio generation/classification (`.wav`).
 
 The code for bGPT is available on [GitHub ](https://github.com/sanderwood/bgpt).
 
```
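The added annotations make each checkpoint's byte-level file format explicit. As a minimal sketch (not part of the bGPT repository; the helper name `checkpoint_for` is hypothetical), the checkpoint-to-format mapping from the updated list could be expressed as a lookup that picks the right weights file for a given input:

```python
# Map each bGPT checkpoint (from the README list) to its pre-training
# dataset and the byte-level file extensions it covers.
WEIGHTS = {
    "weights-conversion.pth": ("IrishMAN", [".abc", ".mid"]),
    "weights-cpu.pth": ("CPU states", [".bin"]),
    "weights-text.pth": ("Wikipedia", [".txt"]),
    "weights-image.pth": ("ImageNet", [".bmp"]),
    "weights-audio.pth": ("Librispeech", [".wav"]),
}

def checkpoint_for(path: str) -> str:
    """Return the checkpoint file whose supported extensions match `path`."""
    ext = "." + path.rsplit(".", 1)[-1].lower()
    for name, (_dataset, exts) in WEIGHTS.items():
        if ext in exts:
            return name
    raise ValueError(f"no bGPT checkpoint covers {ext!r} files")
```

For example, `checkpoint_for("sample.bmp")` returns `"weights-image.pth"`; the selected file would then be downloaded from the Hugging Face repo linked above and loaded as an ordinary PyTorch state dict.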