Commit graph

59 commits

Author SHA1 Message Date
Cédric Deltheil b933fabf31 unet: get rid of clip_embedding attribute for SD1 2023-09-01 19:23:33 +02:00
    It is implicitly defined by the underlying cross-attention layer. This
    also makes it consistent with SDXL.
Pierre Chapuis 73813310d0 rename SelfAttentionInjection to ReferenceOnlyControl and vice-versa 2023-09-01 17:29:48 +02:00
Pierre Chapuis 0f476ea18b make high-level adapters Adapters 2023-08-31 10:57:18 +02:00
    This generalizes the Adapter abstraction to higher-level
    constructs such as high-level LoRA (targeting e.g. the
    SD UNet), ControlNet and Reference-Only Control.

    Some adapters now work by adapting child models with
    "sub-adapters" that they inject / eject when needed.
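The "sub-adapter" mechanism mentioned in the commit above can be sketched generically in Python. This is only an illustration of the inject/eject pattern under assumed names, not the actual refiners API: Adapter, CompositeAdapter, inject and eject are hypothetical here.

```python
# Minimal sketch of the inject/eject pattern described in the commit above.
# All names are hypothetical and do not reflect the refiners API.
from __future__ import annotations


class Adapter:
    """Wraps a target object and can patch (inject) or restore (eject) it."""

    def __init__(self, target: object) -> None:
        self.target = target
        self.injected = False

    def inject(self) -> None:
        # Patch the target in place (e.g. swap a layer, register a hook).
        self.injected = True

    def eject(self) -> None:
        # Undo the patch and restore the original target.
        self.injected = False


class CompositeAdapter(Adapter):
    """A high-level adapter that manages sub-adapters on child models."""

    def __init__(self, target: object, sub_adapters: list[Adapter]) -> None:
        super().__init__(target)
        self.sub_adapters = sub_adapters

    def inject(self) -> None:
        # Injecting the high-level adapter also injects its sub-adapters.
        super().inject()
        for sub in self.sub_adapters:
            sub.inject()

    def eject(self) -> None:
        # Ejecting restores the children first, then the parent.
        for sub in self.sub_adapters:
            sub.eject()
        super().eject()
```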
Pierre Chapuis 18c84c7b72 shorter import paths 2023-08-29 16:57:40 +02:00
Doryan Kaced 3680f9d196 Add support for learned concepts e.g. via textual inversion 2023-08-28 10:37:39 +02:00
limiteinductive 92a21bc21e refactor latent_diffusion module 2023-08-25 12:30:20 +02:00
Pierre Chapuis 1065dfe10b add empty __init__.py files to make pytest happy 2023-08-23 17:49:59 +02:00
    (otherwise it wants unique file basenames)
Pierre Chapuis 97b162d9a0 add InformativeDrawings 2023-08-16 12:29:09 +02:00
    https://github.com/carolineec/informative-drawings

    This is the preprocessor for the Lineart ControlNet.
Cédric Deltheil 48f674c433 initial commit 2023-08-04 15:28:41 +02:00