musicaudiopretrain committed 05809ac (1 parent: 9c38a0b)

Update README.md

Files changed (1): README.md +9 -2
README.md CHANGED

@@ -1,9 +1,17 @@
 ---
 license: mit
 inference: false
+tags:
+- music
 ---
 
-# Introduction
+# Introduction to our series work
+
+Our MAP family of pre-trained music models:
+- [music2vec](https://huggingface.co/m-a-p/music2vec-v1), an earlier pre-trained MIR model that shares a similar architecture but has weaker performance
+- [MERT-v0-public](https://huggingface.co/m-a-p/MERT-v0-public), a model trained on an open-source music dataset
+
+# Introduction to this model
 
 **MERT-v0** is a completely unsupervised model trained on 1,000 hours of music audio.
 Its architecture is similar to the [HuBERT model](https://huggingface.co/docs/transformers/model_doc/hubert), but it has been specifically designed for music through the use of specialized pre-training strategies.
@@ -11,7 +19,6 @@ It is SOTA-comparable on multiple MIR tasks even under probing settings, while k
 It outperforms the Jukebox representation on the GTZAN (genre classification) and GiantSteps (key classification) datasets.
 Larger models trained with more data are on the way.
 
-Note: we previously released a pre-trained MIR model, [music2vec](https://huggingface.co/m-a-p/music2vec-v1/blob/main/README.md), which shares a similar architecture but has weaker performance.
 
 ![Performance Comparison](mert.png)