Files changed (9)
  1. .gitignore +10 -0
  2. LICENSE +201 -0
  3. README.md +84 -187
  4. build_docker.sh +31 -0
  5. main.py +4 -0
  6. publish.sh +10 -0
  7. requirements-dev.txt +2 -0
  8. requirements.txt +20 -0
  9. setup.py +46 -0
.gitignore ADDED
@@ -0,0 +1,10 @@
+ .DS_Store
+ **/__pycache__
+ examples/
+ .idea/
+ .vscode/
+ build
+ !lama_cleaner/app/build
+ dist/
+ lama_cleaner.egg-info/
+ venv/
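The `build` / `!lama_cleaner/app/build` pair relies on gitignore negation: the later `!` pattern re-includes the frontend build output that the broad `build` pattern would otherwise exclude. A quick way to check this behavior is `git check-ignore` in a throwaway repo (the temp-repo setup below is ours, purely for illustration, and assumes `git` is installed):

```shell
# Reproduce the two patterns in a scratch repo and query git's decision.
tmp=$(mktemp -d)
cd "$tmp"
git init -q .
printf 'build\n!lama_cleaner/app/build\n' > .gitignore
mkdir -p build lama_cleaner/app/build
touch build/bundle.js lama_cleaner/app/build/index.html
git check-ignore build/bundle.js            # ignored: prints the path
git check-ignore lama_cleaner/app/build/index.html || echo "not ignored"
```

Note the ordering matters: a `!` pattern only re-includes paths excluded by an *earlier* pattern, and it targets the directory itself so git still descends into it.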
LICENSE ADDED
@@ -0,0 +1,201 @@
+ Apache License
+ Version 2.0, January 2004
+ http://www.apache.org/licenses/
+
+ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
+
+ 1. Definitions.
+
+ "License" shall mean the terms and conditions for use, reproduction,
+ and distribution as defined by Sections 1 through 9 of this document.
+
+ "Licensor" shall mean the copyright owner or entity authorized by
+ the copyright owner that is granting the License.
+
+ "Legal Entity" shall mean the union of the acting entity and all
+ other entities that control, are controlled by, or are under common
+ control with that entity. For the purposes of this definition,
+ "control" means (i) the power, direct or indirect, to cause the
+ direction or management of such entity, whether by contract or
+ otherwise, or (ii) ownership of fifty percent (50%) or more of the
+ outstanding shares, or (iii) beneficial ownership of such entity.
+
+ "You" (or "Your") shall mean an individual or Legal Entity
+ exercising permissions granted by this License.
+
+ "Source" form shall mean the preferred form for making modifications,
+ including but not limited to software source code, documentation
+ source, and configuration files.
+
+ "Object" form shall mean any form resulting from mechanical
+ transformation or translation of a Source form, including but
+ not limited to compiled object code, generated documentation,
+ and conversions to other media types.
+
+ "Work" shall mean the work of authorship, whether in Source or
+ Object form, made available under the License, as indicated by a
+ copyright notice that is included in or attached to the work
+ (an example is provided in the Appendix below).
+
+ "Derivative Works" shall mean any work, whether in Source or Object
+ form, that is based on (or derived from) the Work and for which the
+ editorial revisions, annotations, elaborations, or other modifications
+ represent, as a whole, an original work of authorship. For the purposes
+ of this License, Derivative Works shall not include works that remain
+ separable from, or merely link (or bind by name) to the interfaces of,
+ the Work and Derivative Works thereof.
+
+ "Contribution" shall mean any work of authorship, including
+ the original version of the Work and any modifications or additions
+ to that Work or Derivative Works thereof, that is intentionally
+ submitted to Licensor for inclusion in the Work by the copyright owner
+ or by an individual or Legal Entity authorized to submit on behalf of
+ the copyright owner. For the purposes of this definition, "submitted"
+ means any form of electronic, verbal, or written communication sent
+ to the Licensor or its representatives, including but not limited to
+ communication on electronic mailing lists, source code control systems,
+ and issue tracking systems that are managed by, or on behalf of, the
+ Licensor for the purpose of discussing and improving the Work, but
+ excluding communication that is conspicuously marked or otherwise
+ designated in writing by the copyright owner as "Not a Contribution."
+
+ "Contributor" shall mean Licensor and any individual or Legal Entity
+ on behalf of whom a Contribution has been received by Licensor and
+ subsequently incorporated within the Work.
+
+ 2. Grant of Copyright License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ copyright license to reproduce, prepare Derivative Works of,
+ publicly display, publicly perform, sublicense, and distribute the
+ Work and such Derivative Works in Source or Object form.
+
+ 3. Grant of Patent License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ (except as stated in this section) patent license to make, have made,
+ use, offer to sell, sell, import, and otherwise transfer the Work,
+ where such license applies only to those patent claims licensable
+ by such Contributor that are necessarily infringed by their
+ Contribution(s) alone or by combination of their Contribution(s)
+ with the Work to which such Contribution(s) was submitted. If You
+ institute patent litigation against any entity (including a
+ cross-claim or counterclaim in a lawsuit) alleging that the Work
+ or a Contribution incorporated within the Work constitutes direct
+ or contributory patent infringement, then any patent licenses
+ granted to You under this License for that Work shall terminate
+ as of the date such litigation is filed.
+
+ 4. Redistribution. You may reproduce and distribute copies of the
+ Work or Derivative Works thereof in any medium, with or without
+ modifications, and in Source or Object form, provided that You
+ meet the following conditions:
+
+ (a) You must give any other recipients of the Work or
+ Derivative Works a copy of this License; and
+
+ (b) You must cause any modified files to carry prominent notices
+ stating that You changed the files; and
+
+ (c) You must retain, in the Source form of any Derivative Works
+ that You distribute, all copyright, patent, trademark, and
+ attribution notices from the Source form of the Work,
+ excluding those notices that do not pertain to any part of
+ the Derivative Works; and
+
+ (d) If the Work includes a "NOTICE" text file as part of its
+ distribution, then any Derivative Works that You distribute must
+ include a readable copy of the attribution notices contained
+ within such NOTICE file, excluding those notices that do not
+ pertain to any part of the Derivative Works, in at least one
+ of the following places: within a NOTICE text file distributed
+ as part of the Derivative Works; within the Source form or
+ documentation, if provided along with the Derivative Works; or,
+ within a display generated by the Derivative Works, if and
+ wherever such third-party notices normally appear. The contents
+ of the NOTICE file are for informational purposes only and
+ do not modify the License. You may add Your own attribution
+ notices within Derivative Works that You distribute, alongside
+ or as an addendum to the NOTICE text from the Work, provided
+ that such additional attribution notices cannot be construed
+ as modifying the License.
+
+ You may add Your own copyright statement to Your modifications and
+ may provide additional or different license terms and conditions
+ for use, reproduction, or distribution of Your modifications, or
+ for any such Derivative Works as a whole, provided Your use,
+ reproduction, and distribution of the Work otherwise complies with
+ the conditions stated in this License.
+
+ 5. Submission of Contributions. Unless You explicitly state otherwise,
+ any Contribution intentionally submitted for inclusion in the Work
+ by You to the Licensor shall be under the terms and conditions of
+ this License, without any additional terms or conditions.
+ Notwithstanding the above, nothing herein shall supersede or modify
+ the terms of any separate license agreement you may have executed
+ with Licensor regarding such Contributions.
+
+ 6. Trademarks. This License does not grant permission to use the trade
+ names, trademarks, service marks, or product names of the Licensor,
+ except as required for reasonable and customary use in describing the
+ origin of the Work and reproducing the content of the NOTICE file.
+
+ 7. Disclaimer of Warranty. Unless required by applicable law or
+ agreed to in writing, Licensor provides the Work (and each
+ Contributor provides its Contributions) on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
+ implied, including, without limitation, any warranties or conditions
+ of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
+ PARTICULAR PURPOSE. You are solely responsible for determining the
+ appropriateness of using or redistributing the Work and assume any
+ risks associated with Your exercise of permissions under this License.
+
+ 8. Limitation of Liability. In no event and under no legal theory,
+ whether in tort (including negligence), contract, or otherwise,
+ unless required by applicable law (such as deliberate and grossly
+ negligent acts) or agreed to in writing, shall any Contributor be
+ liable to You for damages, including any direct, indirect, special,
+ incidental, or consequential damages of any character arising as a
+ result of this License or out of the use or inability to use the
+ Work (including but not limited to damages for loss of goodwill,
+ work stoppage, computer failure or malfunction, or any and all
+ other commercial damages or losses), even if such Contributor
+ has been advised of the possibility of such damages.
+
+ 9. Accepting Warranty or Additional Liability. While redistributing
+ the Work or Derivative Works thereof, You may choose to offer,
+ and charge a fee for, acceptance of support, warranty, indemnity,
+ or other liability obligations and/or rights consistent with this
+ License. However, in accepting such obligations, You may act only
+ on Your own behalf and on Your sole responsibility, not on behalf
+ of any other Contributor, and only if You agree to indemnify,
+ defend, and hold each Contributor harmless for any liability
+ incurred by, or claims asserted against, such Contributor by reason
+ of your accepting any such warranty or additional liability.
+
+ END OF TERMS AND CONDITIONS
+
+ APPENDIX: How to apply the Apache License to your work.
+
+ To apply the Apache License to your work, attach the following
+ boilerplate notice, with the fields enclosed by brackets "[]"
+ replaced with your own identifying information. (Don't include
+ the brackets!) The text should be enclosed in the appropriate
+ comment syntax for the file format. We also recommend that a
+ file or class name and description of purpose be included on the
+ same "printed page" as the copyright notice for easier
+ identification within third-party archives.
+
+ Copyright [yyyy] [name of copyright owner]
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
README.md CHANGED
@@ -1,187 +1,84 @@
- ---
- license: other
- ---
-
- # OpenAssistant LLaMa 30B SFT 6
-
- Due to the license attached to LLaMa models by Meta AI it is not possible to directly distribute LLaMa-based models. Instead we provide XOR weights for the OA models.
-
- Thanks to Mick for writing the `xor_codec.py` script which enables this process.
-
- ## The Process
-
- Note: This process applies to the `oasst-sft-6-llama-30b` model. The same process can be applied to other models in the future, but the checksums will be different.
-
- **This process is tested only on Linux (specifically Ubuntu). Some users have reported that the process does not work on Windows. We recommend using WSL if you only have a Windows machine.**
-
- To use OpenAssistant LLaMa-Based Models, you need to have a copy of the original LLaMa model weights and add them to a `llama` subdirectory here.
-
- Ensure your LLaMa 30B checkpoint matches the correct md5sums:
-
- ```
- f856e9d99c30855d6ead4d00cc3a5573 consolidated.00.pth
- d9dbfbea61309dc1e087f5081e98331a consolidated.01.pth
- 2b2bed47912ceb828c0a37aac4b99073 consolidated.02.pth
- ea0405cdb5bc638fee12de614f729ebc consolidated.03.pth
- 4babdbd05b8923226a9e9622492054b6 params.json
- ```
-
- **Important: Follow these exact steps to convert your original LLaMa checkpoint to a HuggingFace Transformers-compatible format. If you use the wrong versions of any dependency, you risk ending up with weights which are not compatible with the XOR files.**
-
- 1. Create a clean Python **3.10** virtual environment & activate it:
-
- ```
- python3.10 -m venv xor_venv
- source xor_venv/bin/activate
- ```
-
- 2. Clone the transformers repo and switch to the tested version:
-
- ```
- git clone https://github.com/huggingface/transformers.git
- cd transformers
- git checkout d04ec99bec8a0b432fc03ed60cea9a1a20ebaf3c
- pip install .
- ```
-
- 3. Install **exactly** these dependency versions:
-
- ```
- pip install torch==1.13.1 accelerate==0.18.0 sentencepiece==0.1.98 protobuf==3.20.1
- ```
-
- 4. Check the `pip freeze` output:
-
- ```
- accelerate==0.18.0
- certifi==2022.12.7
- charset-normalizer==3.1.0
- filelock==3.12.0
- huggingface-hub==0.13.4
- idna==3.4
- numpy==1.24.2
- nvidia-cublas-cu11==11.10.3.66
- nvidia-cuda-nvrtc-cu11==11.7.99
- nvidia-cuda-runtime-cu11==11.7.99
- nvidia-cudnn-cu11==8.5.0.96
- packaging==23.1
- protobuf==3.20.1
- psutil==5.9.5
- PyYAML==6.0
- regex==2023.3.23
- requests==2.28.2
- sentencepiece==0.1.98
- tokenizers==0.13.3
- torch==1.13.1
- tqdm==4.65.0
- transformers @ file:///mnt/data/koepf/transformers
- typing_extensions==4.5.0
- urllib3==1.26.15
- ```
-
- 5. While in the `transformers` repo root, run the HF LLaMA conversion script:
-
- ```
- python src/transformers/models/llama/convert_llama_weights_to_hf.py --input_dir <input_path_llama_base> --output_dir <output_path_llama30b_hf> --model_size 30B
- ```
-
- 6. Run `find . -type f -exec md5sum "{}" +` in the conversion target directory (`output_dir`). This should produce exactly the following checksums if your files are correct:
-
- ```
- 462a2d07f65776f27c0facfa2affb9f9 ./pytorch_model-00007-of-00007.bin
- e1dc8c48a65279fb1fbccff14562e6a3 ./pytorch_model-00003-of-00007.bin
- 9cffb1aeba11b16da84b56abb773d099 ./pytorch_model-00001-of-00007.bin
- aee09e21813368c49baaece120125ae3 ./generation_config.json
- 92754d6c6f291819ffc3dfcaf470f541 ./pytorch_model-00005-of-00007.bin
- 3eddc6fc02c0172d38727e5826181adb ./pytorch_model-00004-of-00007.bin
- eeec4125e9c7560836b4873b6f8e3025 ./tokenizer.model
- 99762d59efa6b96599e863893cf2da02 ./pytorch_model-00006-of-00007.bin
- 598538f18fed1877b41f77de034c0c8a ./config.json
- fdb311c39b8659a5d5c1991339bafc09 ./tokenizer.json
- fecfda4fba7bfd911e187a85db5fa2ef ./pytorch_model.bin.index.json
- edd1a5897748864768b1fab645b31491 ./tokenizer_config.json
- 6b2e0a735969660e720c27061ef3f3d3 ./special_tokens_map.json
- 5cfcb78b908ffa02e681cce69dbe4303 ./pytorch_model-00002-of-00007.bin
- ```
-
- **Important: You should now have the correct LLaMa weights and be ready to apply the XORs. If the checksums above do not match yours, there is a problem.**
-
- 7. Once you have the LLaMa weights in the correct format, you can apply the XOR decoding:
-
- ```
- python xor_codec.py oasst-sft-6-llama-30b/ oasst-sft-6-llama-30b-xor/oasst-sft-6-llama-30b-xor/ llama30b_hf/
- ```
-
- You should **expect to see one warning message** during execution:
-
- `Exception when processing 'added_tokens.json'`
-
- This is normal. **If similar messages appear for other files, something has gone wrong**.
-
- 8. Now run `find . -type f -exec md5sum "{}" +` in the output directory (here `oasst-sft-6-llama-30b`). You should see exactly these checksums:
-
- ```
- 970e99665d66ba3fad6fdf9b4910acc5 ./pytorch_model-00007-of-00007.bin
- 659fcb7598dcd22e7d008189ecb2bb42 ./pytorch_model-00003-of-00007.bin
- ff6e4cf43ddf02fb5d3960f850af1220 ./pytorch_model-00001-of-00007.bin
- 27b0dc092f99aa2efaf467b2d8026c3f ./added_tokens.json
- 2917a1cafb895cf57e746cfd7696bfe5 ./generation_config.json
- 740c324ae65b1ec25976643cda79e479 ./pytorch_model-00005-of-00007.bin
- f7aefb4c63be2ac512fd905b45295235 ./pytorch_model-00004-of-00007.bin
- eeec4125e9c7560836b4873b6f8e3025 ./tokenizer.model
- 369df2f0e38bda0d9629a12a77c10dfc ./pytorch_model-00006-of-00007.bin
- cc9dbf56b68b68a585cc7367696e06a7 ./config.json
- 76d47e4f51a8df1d703c6f594981fcab ./pytorch_model.bin.index.json
- fd9452959d711be29ccf04a97598e8d1 ./tokenizer_config.json
- 785905630a0fe583122a8446a5abe287 ./special_tokens_map.json
- ae48c4c68e4e171d502dd0896aa19a84 ./pytorch_model-00002-of-00007.bin
- ```
-
- If so, you have successfully decoded the weights and should be able to use the model with HuggingFace Transformers. **If your checksums do not match those above, there is a problem.**
-
- ### Configuration
-
- ```
- llama-30b-sft-6:
-   dtype: fp16
-   log_dir: "llama_log_30b"
-   learning_rate: 1e-5
-   model_name: /home/ubuntu/Open-Assistant/model/model_training/.saved/llama-30b-super-pretrain/checkpoint-3500
-   output_dir: llama_model_30b
-   deepspeed_config: configs/zero3_config_sft.json
-   weight_decay: 0.0
-   residual_dropout: 0.0
-   max_length: 2048
-   use_flash_attention: true
-   warmup_steps: 20
-   gradient_checkpointing: true
-   gradient_accumulation_steps: 16
-   per_device_train_batch_size: 2
-   per_device_eval_batch_size: 3
-   eval_steps: 101
-   save_steps: 292
-   num_train_epochs: 8
-   save_total_limit: 3
-   use_custom_sampler: true
-   sort_by_length: false
-   save_strategy: steps
-   datasets:
-     - oasst_export:
-         lang: "bg,ca,cs,da,de,en,es,fr,hr,hu,it,nl,pl,pt,ro,ru,sl,sr,sv,uk"
-         input_file_path: 2023-04-12_oasst_release_ready_synth.jsonl.gz
-         val_split: 0.05
-     - vicuna:
-         val_split: 0.05
-         max_val_set: 800
-         fraction: 0.8
-     - dolly15k:
-         val_split: 0.05
-         max_val_set: 300
-     - grade_school_math_instructions:
-         val_split: 0.05
-     - code_alpaca:
-         val_split: 0.05
-         max_val_set: 250
- ```
-
- **OASST dataset paper:** https://arxiv.org/abs/2304.07327
+ <h1 align="center">Lama Cleaner</h1>
+ <p align="center">A free and open-source inpainting tool powered by SOTA AI models.</p>
+
+ <p align="center">
+ <a href="https://github.com/Sanster/lama-cleaner">
+ <img alt="total download" src="https://pepy.tech/badge/lama-cleaner" />
+ </a>
+ <a href="https://pypi.org/project/lama-cleaner/">
+ <img alt="version" src="https://img.shields.io/pypi/v/lama-cleaner" />
+ </a>
+ <a href="https://colab.research.google.com/drive/1e3ZkAJxvkK3uzaTGu91N9TvI_Mahs0Wb?usp=sharing">
+ <img alt="Open in Colab" src="https://colab.research.google.com/assets/colab-badge.svg" />
+ </a>
+
+ <a href="https://huggingface.co/spaces/Sanster/Lama-Cleaner-lama">
+ <img alt="Hugging Face Spaces" src="https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Spaces-blue" />
+ </a>
+
+ <a href="">
+ <img alt="python version" src="https://img.shields.io/pypi/pyversions/lama-cleaner" />
+ </a>
+ <a href="https://hub.docker.com/r/cwq1913/lama-cleaner">
+ <img alt="version" src="https://img.shields.io/docker/pulls/cwq1913/lama-cleaner" />
+ </a>
+ </p>
+
+ https://user-images.githubusercontent.com/3998421/196976498-ba1ad3ab-fa18-4c55-965f-5c6683141375.mp4
+
+ ## Sponsor
+
+ <table>
+ <tr>
+ <td>
+ <img src="./assets/GitHub_Copilot_logo.svg" style="background: white;padding: 8px;"/>
+ </td>
+ <td>
+ <a href="https://ko-fi.com/Z8Z1CZJGY/tiers" target="_blank">
+ ❤️ Your logo
+ </a>
+ </td>
+ </tr>
+ </table>
+
+ ## Features
+
+ - Completely free and open-source, fully self-hosted; supports CPU, GPU, and Apple M1/M2
+ - [Windows 1-Click Installer](https://lama-cleaner-docs.vercel.app/install/windows_1click_installer)
+ - Multiple SOTA AI [models](https://lama-cleaner-docs.vercel.app/models)
+   - Erase models: LaMa/LDM/ZITS/MAT/FcF/Manga
+   - Erase-and-replace models: Stable Diffusion/Paint by Example
+ - [Plugins](https://lama-cleaner-docs.vercel.app/plugins) for post-processing:
+   - [RemoveBG](https://github.com/danielgatis/rembg): remove image backgrounds
+   - [RealESRGAN](https://github.com/xinntao/Real-ESRGAN): super resolution
+   - [GFPGAN](https://github.com/TencentARC/GFPGAN): face restoration
+   - [RestoreFormer](https://github.com/wzhouxiff/RestoreFormer): face restoration
+ - More features at [lama-cleaner-docs](https://lama-cleaner-docs.vercel.app/)
+
+ ## Quick Start
+
+ Lama Cleaner makes it easy to use SOTA AI models with just two commands:
+
+ ```bash
+ # To use the GPU, install the CUDA build of PyTorch first, e.g.:
+ # pip install torch==1.13.1+cu117 --extra-index-url https://download.pytorch.org/whl/cu117
+ pip install lama-cleaner
+ lama-cleaner --model=lama --device=cpu --port=8080
+ ```
+
+ That's it: Lama Cleaner is now running at http://localhost:8080.
+
+ See all command-line arguments at [lama-cleaner-docs](https://lama-cleaner-docs.vercel.app/install/pip).
+
+ ## Development
+
+ This section is only needed if you plan to modify the frontend and recompile it yourself.
+
+ ### Frontend
+
+ The frontend code is modified from [cleanup.pictures](https://github.com/initml/cleanup.pictures); you can try their
+ great online service [here](https://cleanup.pictures/).
+
+ - Install dependencies: `cd lama_cleaner/app/ && pnpm install`
+ - Start development server: `pnpm start`
+ - Build: `pnpm build`
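Since the server binds a fixed port (8080 in the Quick Start above), a pre-flight check that the port is free can save a confusing startup failure. A minimal stdlib sketch; the helper name is ours and not part of lama-cleaner:

```python
import socket

def port_available(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if nothing is currently listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(0.5)
        # connect_ex returns 0 when a server is already listening there.
        return s.connect_ex((host, port)) != 0
```

If the check returns False, pass a different value to `--port` when launching `lama-cleaner`.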
build_docker.sh ADDED
@@ -0,0 +1,31 @@
+ #!/usr/bin/env bash
+ set -e
+
+ GIT_TAG=$1
+ IMAGE_DESC="Image inpainting tool powered by SOTA AI Model"
+ GIT_REPO="https://github.com/Sanster/lama-cleaner"
+
+ echo "Building cpu docker image..."
+
+ docker buildx build \
+   --file ./docker/CPUDockerfile \
+   --label org.opencontainers.image.title=lama-cleaner \
+   --label org.opencontainers.image.description="$IMAGE_DESC" \
+   --label org.opencontainers.image.url=$GIT_REPO \
+   --label org.opencontainers.image.source=$GIT_REPO \
+   --label org.opencontainers.image.version=$GIT_TAG \
+   --build-arg version=$GIT_TAG \
+   --tag cwq1913/lama-cleaner:cpu-$GIT_TAG .
+
+
+ echo "Building NVIDIA GPU docker image..."
+
+ docker buildx build \
+   --file ./docker/GPUDockerfile \
+   --label org.opencontainers.image.title=lama-cleaner \
+   --label org.opencontainers.image.description="$IMAGE_DESC" \
+   --label org.opencontainers.image.url=$GIT_REPO \
+   --label org.opencontainers.image.source=$GIT_REPO \
+   --label org.opencontainers.image.version=$GIT_TAG \
+   --build-arg version=$GIT_TAG \
+   --tag cwq1913/lama-cleaner:gpu-$GIT_TAG .
main.py ADDED
@@ -0,0 +1,4 @@
+ from lama_cleaner import entry_point
+
+ if __name__ == "__main__":
+     entry_point()
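`main.py` simply forwards to the package's console entry point, the same function the `lama-cleaner` script installed by `setup.py` calls. As a rough illustration of what such an entry point does, here is a hypothetical stand-in (the real `lama_cleaner.entry_point` parses many more flags; see the docs):

```python
import argparse

def fake_entry_point(argv=None):
    # Hypothetical sketch of a console entry point: parse the CLI flags,
    # then (in the real tool) load the chosen model and start the web server.
    parser = argparse.ArgumentParser(prog="lama-cleaner")
    parser.add_argument("--model", default="lama")
    parser.add_argument("--device", default="cpu", choices=["cpu", "cuda", "mps"])
    parser.add_argument("--port", type=int, default=8080)
    return parser.parse_args(argv)
```

Exposing the entry point as a plain function like this is what lets both `python main.py` and the installed `lama-cleaner` script share one code path.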
publish.sh ADDED
@@ -0,0 +1,10 @@
+ #!/usr/bin/env bash
+ set -e
+
+ pushd ./lama_cleaner/app
+ yarn run build
+ popd
+
+ rm -r -f dist
+ python3 setup.py sdist bdist_wheel
+ twine upload dist/*
requirements-dev.txt ADDED
@@ -0,0 +1,2 @@
+ wheel
+ twine
requirements.txt ADDED
@@ -0,0 +1,20 @@
+ torch>=1.9.0
+ opencv-python
+ flask_cors
+ Jinja2==2.11.3
+ flask==1.1.4
+ flaskwebgui==0.3.5
+ tqdm
+ pydantic
+ rich
+ loguru
+ pytest
+ yacs
+ markupsafe==2.0.1
+ scikit-image==0.19.3
+ diffusers[torch]==0.14.0
+ transformers==4.27.4
+ gradio
+ piexif==1.1.3
+ safetensors
+ omegaconf
setup.py ADDED
@@ -0,0 +1,46 @@
+ import setuptools
+ from pathlib import Path
+
+ web_files = Path("lama_cleaner/app/build/").glob("**/*")
+ web_files = [str(it).replace("lama_cleaner/", "") for it in web_files]
+
+ with open("README.md", "r", encoding="utf-8") as fh:
+     long_description = fh.read()
+
+
+ def load_requirements():
+     requirements_file_name = "requirements.txt"
+     requires = []
+     with open(requirements_file_name) as f:
+         for line in f:
+             # Skip blank lines so empty strings never reach install_requires.
+             if line.strip():
+                 requires.append(line.strip())
+     return requires
+
+
+ # https://setuptools.readthedocs.io/en/latest/setuptools.html#including-data-files
+ setuptools.setup(
+     name="lama-cleaner",
+     version="1.1.1",
+     author="PanicByte",
+     author_email="cwq1913@gmail.com",
+     description="Image inpainting tool powered by SOTA AI Model",
+     long_description=long_description,
+     long_description_content_type="text/markdown",
+     url="https://github.com/Sanster/lama-cleaner",
+     packages=setuptools.find_packages("./"),
+     package_data={"lama_cleaner": web_files},
+     install_requires=load_requirements(),
+     python_requires=">=3.7",
+     entry_points={"console_scripts": ["lama-cleaner=lama_cleaner:entry_point"]},
+     classifiers=[
+         "License :: OSI Approved :: Apache Software License",
+         "Operating System :: OS Independent",
+         "Programming Language :: Python :: 3",
+         "Programming Language :: Python :: 3.7",
+         "Programming Language :: Python :: 3.8",
+         "Programming Language :: Python :: 3.9",
+         "Programming Language :: Python :: 3.10",
+         "Topic :: Scientific/Engineering :: Artificial Intelligence",
+     ],
+ )
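The `web_files` rewrite at the top of `setup.py` matters because `package_data` entries must be relative to the package directory, while the glob yields paths rooted at the repo; stripping the `lama_cleaner/` prefix bridges the two. A small illustration of the same rewrite (the file names here are made up):

```python
from pathlib import Path

paths = [
    Path("lama_cleaner/app/build/index.html"),
    Path("lama_cleaner/app/build/static/js/main.js"),
]
# Same rewrite as setup.py: drop the package-directory prefix so the
# entries become package-relative, e.g. "app/build/index.html".
web_files = [str(p).replace("lama_cleaner/", "") for p in paths]
```

A more defensive alternative would be `str(p.relative_to("lama_cleaner"))`, since `str.replace` would rewrite the prefix anywhere it happened to occur in a path, not only at the front.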