---
license: cc-by-4.0
language:
- en
tags:
- matting
- segmentation
- segment anything
- zero-shot matting
---

# Zero-Shot Image Matting for Anything

## Introduction

🚀 Introducing ZIM: Zero-Shot Image Matting – A Step Beyond SAM! 🚀

While SAM (Segment Anything Model) has redefined zero-shot segmentation with broad applications across multiple fields, it often falls short in delivering high-precision, fine-grained masks. That's where ZIM comes in.

🌟 What is ZIM? 🌟

ZIM (Zero-Shot Image Matting) is a model developed to set a new standard in precision matting while maintaining strong zero-shot capabilities. Like SAM, ZIM generalizes across diverse datasets and objects in a zero-shot paradigm, but it goes further, delivering highly accurate, fine-grained masks that capture intricate details.

🔍 Get Started with ZIM 🔍

Ready to elevate your AI projects with unmatched matting quality? Access ZIM on our [project page](https://naver-ai.github.io/ZIM/), [arXiv](https://huggingface.co/papers/2411.00626), and [GitHub](https://github.com/naver-ai/ZIM).

## Installation

```bash
pip install zim_anything
```

or

```bash
git clone https://github.com/naver-ai/ZIM.git
cd ZIM; pip install -e .
```
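
Either route installs the `zim_anything` package. As a quick sanity check (a minimal sketch that only uses the names imported in the Usage example below), confirm the package is importable:

```python
# Minimal sanity check that the install succeeded: these are the same names
# imported in the Usage example below.
from zim_anything import zim_model_registry, ZimPredictor

print("zim_anything is importable")
```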

## Usage

1. Make the directory `zim_vit_l_2092`.
2. Download the [encoder](https://huggingface.co/naver-iv/zim-anything-vitl/resolve/main/zim_vit_l_2092/encoder.onnx?download=true) weight and [decoder](https://huggingface.co/naver-iv/zim-anything-vitl/resolve/main/zim_vit_l_2092/decoder.onnx?download=true) weight (or fetch them programmatically, as sketched below).
3. Put them under the `zim_vit_l_2092` directory.

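As an alternative to clicking the links above, the two ONNX files can be fetched programmatically. The sketch below uses `huggingface_hub.hf_hub_download` and assumes the files keep the repository layout shown in the URLs (`zim_vit_l_2092/encoder.onnx` and `zim_vit_l_2092/decoder.onnx` in `naver-iv/zim-anything-vitl`); with `local_dir="."` they land under `./zim_vit_l_2092/`.

```python
# Sketch: download the ZIM ONNX weights from the Hugging Face Hub
# (requires `pip install huggingface_hub`).
from huggingface_hub import hf_hub_download

for filename in ["zim_vit_l_2092/encoder.onnx", "zim_vit_l_2092/decoder.onnx"]:
    # local_dir="." preserves the repository folder structure, so the files
    # end up in ./zim_vit_l_2092/ as required by the steps above.
    hf_hub_download(
        repo_id="naver-iv/zim-anything-vitl",
        filename=filename,
        local_dir=".",
    )
```
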
```python
import torch

from zim_anything import zim_model_registry, ZimPredictor

backbone = "vit_l"
ckpt_p = "zim_vit_l_2092"  # directory containing encoder.onnx and decoder.onnx

# Build the model from the registry and move it to the GPU if one is available.
model = zim_model_registry[backbone](checkpoint=ckpt_p)
if torch.cuda.is_available():
    model.cuda()

# Set the image once, then query masks with prompts.
predictor = ZimPredictor(model)
predictor.set_image(<image>)
masks, _, _ = predictor.predict(<input_prompts>)
```

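For reference, the sketch below fills in the `<image>` and `<input_prompts>` placeholders with a hypothetical single-point prompt. It assumes `ZimPredictor` follows the SAM-style predictor interface (an RGB `numpy` array for `set_image`, `point_coords`/`point_labels` keyword arguments for `predict`); see the GitHub repository for the exact signature.

```python
import numpy as np
from PIL import Image

# Hypothetical example: one foreground click at pixel (x=500, y=375).
image = np.array(Image.open("example.jpg").convert("RGB"))  # example.jpg is a placeholder path
predictor.set_image(image)

masks, _, _ = predictor.predict(
    point_coords=np.array([[500, 375]]),  # (x, y) pixel coordinates
    point_labels=np.array([1]),           # 1 = foreground, 0 = background
)
```
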
## Citation

If you find this project useful, please consider citing:

```bibtex
@article{kim2024zim,
  title={ZIM: Zero-Shot Image Matting for Anything},
  author={Kim, Beomyoung and Shin, Chanyong and Jeong, Joonhyun and Jung, Hyungsik and Lee, Se-Yun and Chun, Sewhan and Hwang, Dong-Hyun and Yu, Joonsang},
  journal={arXiv preprint arXiv:2411.00626},
  year={2024}
}
```