h1t
Tags: Text-to-Image, Diffusers, lora

Commit ea69e29 (1 parent: 76af2eb): Update README.md, committed by h1t

Files changed (1): README.md (+24 -7)
README.md CHANGED
@@ -28,20 +28,29 @@ We shall proceed to elucidate the situation here.
 
 <blockquote class="twitter-tweet"><p lang="en" dir="ltr">We regret to hear about the serious accusations from the CTM team <a href="https://twitter.com/gimdong58085414?ref_src=twsrc%5Etfw">@gimdong58085414</a>. I shall proceed to elucidate the situation and make an archive here. We already have several rounds of communication with CTM&#39;s authors. <a href="https://t.co/BKn3w1jXuh">https://t.co/BKn3w1jXuh</a></p>&mdash; Michael (@Merci0318) <a href="https://twitter.com/Merci0318/status/1772502247563559014?ref_src=twsrc%5Etfw">March 26, 2024</a></blockquote>
 
-1. In our first arXiv pre-print, we have indicated "mainly borrows the proof from CTM" and have never intended to claim credits. As we have mentioned in our email, we would like to extend a formal apology to the CTM authors for the clearly inadequate level of referencing in our paper.
+1. In the [first arXiv version](https://arxiv.org/abs/2402.19159v1), we provided citations and discussion in A. Related Works:
+> Kim et al. (2023) proposes a universal framework for CMs and DMs. The core design is similar to ours, with the main differences being that we focus on reducing error in CMs, subtly leverage the semi-linear structure of the PF ODE for parameterization, and avoid the need for adversarial training.
 
-2. Our entire sampling algorithm and the whole proof of Theorem 4 are predicated upon DPMSolver and DEIS and we also provide the proof in the email.
-
-3. CTM and TCD are different from motivation, method to experiments. The experimental results also cannot be obtained from any type of CTM algorithm.
+2. In the [first arXiv version](https://arxiv.org/abs/2402.19159v1), we indicated in D.3 Proof of Theorem 4.2:
+> In this section, our derivation mainly borrows the proof from (Kim et al., 2023; Chen et al., 2022).
+
+and we have never intended to claim credits.
+
+As we mentioned in our email, we would like to extend a formal apology to the CTM authors for the clearly inadequate level of referencing in our paper. We will provide more credits in the revised manuscript.
+
+3. In the updated [second arXiv version](https://arxiv.org/abs/2402.19159v2), we have expanded our discussion to elucidate the relationship with the CTM framework. Additionally, we have removed some proofs that were previously included for completeness.
+
+4. CTM and TCD differ in motivation, method, and experiments. TCD is founded on the principles of the Latent Consistency Model (LCM) and aims to design an effective consistency function by utilizing **exponential integrators**.
+
+5. The experimental results also cannot be obtained from any variant of the CTM algorithm.
 
-3.1 Here we provide a simple method to check: use our sampler here to sample the checkpoint [CTM released](https://github.com/sony/ctm), or vice versa.
+5.1 Here we provide a simple method to check: use our sampler here to sample the checkpoint [CTM released](https://github.com/sony/ctm), or vice versa.
 
-3.2 [CTM](https://github.com/sony/ctm) also provided training script. We welcome anyone to reproduce the experiments on SDXL based on CTM algorithm.
+5.2 [CTM](https://github.com/sony/ctm) also provides a training script. We welcome anyone to reproduce the experiments on SDXL or LDM based on the CTM algorithm.
 
 We believe the assertion of plagiarism is not only severe but also detrimental to the academic integrity of the involved parties.
 We earnestly hope that everyone involved gains a more comprehensive understanding of this matter.
 
-All related docs can be found [here](https://drive.google.com/file/d/19c1QMfOMgp3McR4FCBk4pjdf22avyp8X/view).
 
 ## Introduction
 
@@ -388,6 +397,14 @@ grid_image = make_image_grid([ref_image, image], rows=1, cols=2)
 ![](./assets/ip_adapter.png)
 
 
+## Related and Concurrent Works
+- Luo S, Tan Y, Huang L, et al. Latent Consistency Models: Synthesizing high-resolution images with few-step inference. arXiv preprint arXiv:2310.04378, 2023.
+- Luo S, Tan Y, Patil S, et al. LCM-LoRA: A universal Stable-Diffusion acceleration module. arXiv preprint arXiv:2311.05556, 2023.
+- Lu C, Zhou Y, Bao F, et al. DPM-Solver: A fast ODE solver for diffusion probabilistic model sampling in around 10 steps. Advances in Neural Information Processing Systems, 2022, 35: 5775-5787.
+- Lu C, Zhou Y, Bao F, et al. DPM-Solver++: Fast solver for guided sampling of diffusion probabilistic models. arXiv preprint arXiv:2211.01095, 2022.
+- Zhang Q, Chen Y. Fast sampling of diffusion models with exponential integrator. ICLR 2023, Kigali, Rwanda, May 1-5, 2023.
+- Kim D, Lai C H, Liao W H, et al. Consistency Trajectory Models: Learning probability flow ODE trajectory of diffusion. ICLR 2024.
+
 ## Citation
 ```bibtex
 @misc{zheng2024trajectory,
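
Point 4 of the statement above appeals to the semi-linear structure of the probability-flow (PF) ODE and to exponential integrators. As background, the standard identity these solvers build on (reproduced here from DPM-Solver, Lu et al., 2022, in that paper's notation; this is background for the term, not a claim about TCD's exact parameterization) is:

```latex
% PF ODE in semi-linear form (noise-prediction parameterization):
%   \frac{\mathrm{d}x_t}{\mathrm{d}t} = f(t)\, x_t + \frac{g^2(t)}{2\sigma_t}\, \epsilon_\theta(x_t, t)
% With the log-SNR \lambda_t := \log(\alpha_t / \sigma_t), the linear term
% integrates exactly (variation of constants), so only the learned noise
% prediction remains inside the integral:
x_t \;=\; \frac{\alpha_t}{\alpha_s}\, x_s
\;-\; \alpha_t \int_{\lambda_s}^{\lambda_t} e^{-\lambda}\,
\hat{\epsilon}_\theta\!\left(\hat{x}_\lambda, \lambda\right)\, \mathrm{d}\lambda
```

Different few-step samplers then differ mainly in how they approximate the remaining integral over the noise prediction.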
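
The practical benefit of an exponential integrator is easiest to see on a toy stiff semi-linear ODE. The sketch below is purely illustrative (the function names and test problem are invented, not taken from TCD or any cited solver): it integrates the linear part of dx/dt = λx + g(t) exactly and approximates only the forcing term, the same trick the PF-ODE solvers above exploit.

```python
import math

def exponential_euler_step(x, lam, g_val, h):
    # Integrate the stiff linear part exactly over the step (exp(lam*h));
    # approximate only the forcing g as constant on [t, t+h].
    return math.exp(lam * h) * x + (math.exp(lam * h) - 1.0) / lam * g_val

def explicit_euler_step(x, lam, g_val, h):
    # Plain Euler treats the stiff linear term with the same crude
    # first-order approximation as the forcing term.
    return x + h * (lam * x + g_val)

def integrate(step, x0, lam, g, t0, t1, n):
    x, h = x0, (t1 - t0) / n
    for k in range(n):
        x = step(x, lam, g(t0 + k * h), h)
    return x

# Stiff test problem: dx/dt = -50*x + 3, x(0) = 1, solved on [0, 1]
# with only 10 steps. Closed-form solution for constant forcing:
lam, c, x0 = -50.0, 3.0, 1.0
exact = math.exp(lam) * (x0 + c / lam) - c / lam

x_exp = integrate(exponential_euler_step, x0, lam, lambda t: c, 0.0, 1.0, 10)
x_eul = integrate(explicit_euler_step, x0, lam, lambda t: c, 0.0, 1.0, 10)

print(abs(x_exp - exact))  # tiny: exact for constant forcing, up to rounding
print(abs(x_eul - exact))  # large: h*lam = -5 makes plain Euler unstable
```

With so few steps, the exponential step reproduces the solution (it is exact whenever the forcing is constant per step), while explicit Euler diverges because h·λ = -5 violates its stability bound.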