Combining LCM-LoRA with other LoRAs seems to diminish their effectiveness
#3 opened by zz-via
I fused LCM-LoRA into a model from Civitai, then fused my own trained LoRA on top. There is a noticeable difference between fusing my LoRA directly into the Civitai model and fusing it after LCM-LoRA: the direct fusion preserves a more distinct style, while after adding LCM-LoRA the stylistic differences between different LoRAs seem to shrink. Is this caused by combining multiple LoRAs, or is it a problem with LCM itself?
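One way to isolate the cause is to reproduce the combination in diffusers, keeping both LoRAs as named adapters so their relative weights can be varied instead of being fused blindly. The sketch below assumes an SD 1.5 base; the base model ID, the style LoRA path, the prompt, and the adapter weights are all placeholders to substitute with your own.

```python
# Minimal sketch: combine LCM-LoRA with a style LoRA as separate adapters
# so their weights can be tuned independently. Paths, prompt, and weights
# are assumptions -- replace with your own checkpoint and trained LoRA.
import torch
from diffusers import StableDiffusionPipeline, LCMScheduler

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # assumed base model
    torch_dtype=torch.float16,
).to("cuda")

# LCM-LoRA needs the LCM scheduler for few-step sampling.
pipe.scheduler = LCMScheduler.from_config(pipe.scheduler.config)

# Load both LoRAs as named adapters rather than fusing immediately.
pipe.load_lora_weights("latent-consistency/lcm-lora-sdv1-5", adapter_name="lcm")
pipe.load_lora_weights("path/to/my_style_lora", adapter_name="style")  # assumed path

# Varying these weights (e.g. lowering "lcm" toward 0.8 or raising "style")
# shows whether the style dilution tracks the LCM-LoRA strength.
pipe.set_adapters(["lcm", "style"], adapter_weights=[1.0, 1.0])

image = pipe(
    "a portrait in my trained style",  # assumed prompt
    num_inference_steps=4,             # LCM typically uses 4-8 steps
    guidance_scale=1.0,                # LCM expects low or zero CFG
).images[0]
image.save("lcm_plus_style.png")
```

If lowering the LCM adapter weight restores the style, the dilution comes from the LCM-LoRA weights themselves overlapping with the style LoRA, not from the act of fusing multiple LoRAs.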
zz-via changed discussion title from "It seems that combining with other LoRAs reduces the LoRA's effectiveness" to "Combining LCM-LoRA with other LoRAs seems to diminish their effectiveness"