Commit 652042a by longlian (parent: d22dbba): Update README.md
datasets:
- imagenet-1k
---

## CrossMAE: Rethinking Patch Dependence for Masked Autoencoders
by <a href="https://max-fu.github.io">Letian Fu*</a>, <a href="https://tonylian.com">Long Lian*</a>, <a href="https://renwang435.github.io">Renhao Wang</a>, <a href="https://bfshi.github.io">Baifeng Shi</a>, <a href="https://people.eecs.berkeley.edu/~xdwang">Xudong Wang</a>, <a href="https://www.adamyala.org">Adam Yala†</a>, <a href="https://people.eecs.berkeley.edu/~trevor">Trevor Darrell†</a>, <a href="https://people.eecs.berkeley.edu/~efros">Alexei A. Efros†</a>, <a href="https://goldberg.berkeley.edu">Ken Goldberg†</a> at UC Berkeley and UCSF

[[Paper](https://arxiv.org/abs/2401.14391)] | [[Project Page](https://crossmae.github.io/)] | [[Citation](#citation)]

<p align="center">
<img src="https://crossmae.github.io/crossmae2.jpg" width="800">
</p>

This repo hosts the models for [CrossMAE: Rethinking Patch Dependence for Masked Autoencoders](https://arxiv.org/abs/2401.14391).

Please see the [GitHub repo](https://github.com/TonyLianLong/CrossMAE) for instructions on pretraining, fine-tuning, and evaluation with these models.
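
If you only need the checkpoint files, one way to fetch them programmatically is with the `huggingface_hub` client. This is a minimal sketch: the `repo_id` and `filename` defaults below are placeholder assumptions, not confirmed names from this repo, so check the "Files and versions" tab for the actual checkpoint filenames.

```python
def load_crossmae_checkpoint(repo_id="longlian/CrossMAE",
                             filename="crossmae-vitb.pth"):
    """Download a checkpoint from the Hugging Face Hub and load it on CPU.

    NOTE: the default repo_id and filename are illustrative placeholders;
    replace them with the real repo id and checkpoint name from the
    "Files and versions" tab.
    """
    # Imports kept local so the module can be defined without these
    # packages installed (pip install huggingface_hub torch).
    from huggingface_hub import hf_hub_download
    import torch

    # hf_hub_download caches the file locally and returns its path.
    path = hf_hub_download(repo_id=repo_id, filename=filename)
    # Load the serialized checkpoint (state dict) onto the CPU.
    return torch.load(path, map_location="cpu")
```

The downloaded state dict can then be passed to the model classes from the GitHub repo (e.g. via `model.load_state_dict(...)`), following the fine-tuning and evaluation instructions there.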