---
license: other
license_name: sample-code-license
license_link: LICENSE
library_name: ml-4m
---

# 4M: Massively Multimodal Masked Modeling

*David Mizrahi\*, Roman Bachmann\*, Oğuzhan Fatih Kar, Teresa Yeo, Mingfei Gao, Afshin Dehghan, Amir Zamir*

Official implementation and pre-trained models for "4M: Massively Multimodal Masked Modeling" (NeurIPS 2023).

[`Website`](https://4m.epfl.ch) | [`Paper`](https://arxiv.org/abs/2312.06647) | [`GitHub`](https://github.com/apple/ml-4m)

4M is a framework for training "any-to-any" foundation models, using tokenization and masking to scale to many diverse modalities.
Models trained with 4M can perform a wide range of vision tasks, transfer well to unseen tasks and modalities, and function as flexible, steerable multimodal generative models.


## Installation
For installation instructions, please see https://github.com/apple/ml-4m.


## Usage

The depth tokenizer can be loaded from Hugging Face Hub as follows:
```python
from fourm.vq.vqvae import DiVAE
tok_depth = DiVAE.from_pretrained('EPFL-VILAB/4M_tokenizers_depth_8k_224-448')
```
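Once loaded, the tokenizer maps depth maps to discrete tokens and can reconstruct them with its decoder. The snippet below is a minimal usage sketch, not the official API reference: the `tokenize` / `decode_tokens` method names, input shape, and preprocessing are assumptions, so please consult README_TOKENIZATION.md in the ml-4m repository for the exact usage.

```python
import torch

# Minimal usage sketch. Assumptions: a single-channel depth map at 224x224,
# normalized as in the ml-4m data pipeline; the `tokenize` / `decode_tokens`
# method names may differ from the actual API.
depth = torch.randn(1, 1, 224, 224)  # placeholder for a real, normalized depth map

with torch.no_grad():
    tokens = tok_depth.tokenize(depth)           # encode to discrete token indices
    depth_rec = tok_depth.decode_tokens(tokens)  # reconstruct a depth map from tokens
```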

Please see https://github.com/apple/ml-4m/blob/main/README_TOKENIZATION.md for more detailed instructions, and https://github.com/apple/ml-4m for other tokenizer and 4M model checkpoints.


## Citation

If you find this repository helpful, please consider citing our work:
```bibtex
@inproceedings{mizrahi20234m,
    title={{4M}: Massively Multimodal Masked Modeling},
    author={David Mizrahi and Roman Bachmann and O{\u{g}}uzhan Fatih Kar and Teresa Yeo and Mingfei Gao and Afshin Dehghan and Amir Zamir},
    booktitle={Thirty-seventh Conference on Neural Information Processing Systems},
    year={2023},
}
```

## License

The model weights in this repository are released under the Sample Code license as found in the [LICENSE](LICENSE) file.