Commit graph

67 commits

Author SHA1 Message Date
Pierre Chapuis b69dbc4e5c improve CrossAttentionAdapter test 2023-09-12 11:58:24 +02:00
Pierre Chapuis dc2c3e0163 implement CrossAttentionAdapter using chain operations 2023-09-12 11:58:24 +02:00
Pierre Chapuis 3c056e2231 expose lookup_top_adapter 2023-09-12 11:58:24 +02:00
Cédric Deltheil 32cba1afd8 test_sdxl_double_encoder: use proper weights 2023-09-11 21:49:24 +02:00
Pierre Chapuis 78e69c7da0 fix typo + skip test if weights are not available 2023-09-07 17:31:20 +02:00
Pierre Chapuis d9a461e9b5 stop relying on SDXL 0.9 weights in test 2023-09-07 12:18:38 +02:00
limiteinductive 88efa117bf fix model comparison with custom layers 2023-09-05 12:34:38 +02:00
Cédric Deltheil b933fabf31 unet: get rid of clip_embedding attribute for SD1 2023-09-01 19:23:33 +02:00
    It is implicitly defined by the underlying cross-attention layer. This
    also makes it consistent with SDXL.
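The commit above removes a stored attribute in favor of deriving it from the layer that actually uses it. A minimal sketch of that idea, with hypothetical names (CrossAttention, UNet, context_embedding_dim are illustrative, not the library's actual API):

```python
# Illustrative sketch only -- class and attribute names are hypothetical.
class CrossAttention:
    def __init__(self, context_embedding_dim: int) -> None:
        self.context_embedding_dim = context_embedding_dim


class UNet:
    def __init__(self, cross_attention: CrossAttention) -> None:
        self.cross_attention = cross_attention

    @property
    def clip_embedding_dim(self) -> int:
        # Implicitly defined by the underlying cross-attention layer,
        # so there is no separate attribute to keep in sync.
        return self.cross_attention.context_embedding_dim


unet = UNet(CrossAttention(context_embedding_dim=768))
assert unet.clip_embedding_dim == 768
```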
Pierre Chapuis e91e31ebd2 check no two controlnets have the same name 2023-09-01 17:47:29 +02:00
Pierre Chapuis 73813310d0 rename SelfAttentionInjection to ReferenceOnlyControl and vice-versa 2023-09-01 17:29:48 +02:00
Pierre Chapuis 0f476ea18b make high-level adapters Adapters 2023-08-31 10:57:18 +02:00
    This generalizes the Adapter abstraction to higher-level constructs
    such as high-level LoRA (targeting e.g. the SD UNet), ControlNet and
    Reference-Only Control.

    Some adapters now work by adapting child models with "sub-adapters"
    that they inject / eject when needed.
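A minimal sketch of the sub-adapter pattern the commit above describes, assuming a simplified Adapter protocol; the names (inject, eject, sub_adapters) follow the commit wording, not a verified library API:

```python
# Hypothetical sketch -- a high-level adapter owns sub-adapters for child
# modules and injects/ejects them together with itself.
from __future__ import annotations


class Adapter:
    """Wraps a target and can splice itself in and out of a model.

    A high-level adapter (e.g. LoRA over a whole UNet, ControlNet,
    Reference-Only Control) manages the adapters of its children.
    """

    def __init__(self, target: object) -> None:
        self.target = target
        self.sub_adapters: list[Adapter] = []
        self.injected = False

    def inject(self) -> Adapter:
        # Patch this adapter's own target, then propagate to children.
        self.injected = True
        for sub in self.sub_adapters:
            sub.inject()
        return self

    def eject(self) -> None:
        # Undo in reverse order so the model returns to its original state.
        for sub in reversed(self.sub_adapters):
            sub.eject()
        self.injected = False


# Usage: a high-level adapter that manages one sub-adapter.
unet, attention = object(), object()
high_level = Adapter(unet)
high_level.sub_adapters.append(Adapter(attention))

high_level.inject()
assert high_level.injected and high_level.sub_adapters[0].injected
high_level.eject()
assert not high_level.injected and not high_level.sub_adapters[0].injected
```

Grouping injection and ejection this way keeps the adapted model restorable: ejecting the top-level adapter is enough to return the whole hierarchy to its original state.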
Pierre Chapuis 18c84c7b72 shorter import paths 2023-08-29 16:57:40 +02:00
limiteinductive 7ca6bd0ccd implement the ConvertModule class and refactor conversion scripts 2023-08-28 14:39:14 +02:00
limiteinductive 92a21bc21e refactor latent_diffusion module 2023-08-25 12:30:20 +02:00
Pierre Chapuis 2ad26a06b0 fix LoRAs on Self target 2023-08-23 12:13:01 +02:00
limiteinductive 3565a4127f implement DoubleTextEncoder for SDXL 2023-08-23 11:05:38 +02:00
Cédric Deltheil 48f674c433 initial commit 2023-08-04 15:28:41 +02:00