Commit graph

630 commits

Author SHA1 Message Date
Pierre Chapuis d54a38ae07 do not hardcode a CUDA device in tests 2023-09-06 19:33:48 +02:00
Cédric Deltheil c55917e293 add IP-Adapter support for SD 1.5
Official repo: https://github.com/tencent-ailab/IP-Adapter
2023-09-06 15:12:48 +02:00
Cédric Deltheil d4dd45fd4d use Module's load_from_safetensors
Instead of manual calls to load_state_dict
2023-09-06 15:06:51 +02:00
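The change above can be illustrated with a minimal sketch; the checkpoint path is a placeholder and the helper names/locations are assumptions about the library, not code taken from this revision.

    # hedged sketch: `model` stands in for any fluxion Module,
    # "model.safetensors" is a placeholder checkpoint path
    import refiners.fluxion.layers as fl
    from refiners.fluxion.utils import load_from_safetensors  # assumed helper location

    model = fl.Chain(fl.Linear(8, 8))

    # before: build the state dict by hand, then load it
    model.load_state_dict(load_from_safetensors("model.safetensors"))

    # after: the Module-level helper reads the .safetensors file in one call
    model.load_from_safetensors("model.safetensors")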
Pierre Chapuis 4388968ad3 Update tests/e2e/test_diffusion.py
Co-authored-by: Cédric Deltheil <355031+deltheil@users.noreply.github.com>
2023-09-06 11:49:55 +02:00
Pierre Chapuis 547a73e67a clarify the "adapting when a LoRA is injected" issue in tests 2023-09-06 11:49:55 +02:00
Pierre Chapuis 864937a776 support injecting several LoRAs simultaneously 2023-09-06 11:49:55 +02:00
limiteinductive 88efa117bf fix model comparison with custom layers 2023-09-05 12:34:38 +02:00
Pierre Chapuis 7651daa01f update checkout action 2023-09-04 15:54:10 +02:00
Pierre Chapuis 566656a539 fix text encoder LoRAs 2023-09-04 15:51:39 +02:00
limiteinductive ebfa51f662 Make breakpoint a ContextModule 2023-09-04 12:22:10 +02:00
limiteinductive 5327d894d1 bump pyright version to 1.1.325 in poetry.lock 2023-09-04 10:41:06 +02:00
limiteinductive 9d2fbf6dbd Fix tuple annotation for pyright 1.1.325 2023-09-04 10:41:06 +02:00
Doryan Kaced 44e184d4d5 Init dtype and device correctly for OutputBlock 2023-09-01 19:44:06 +02:00
Cédric Deltheil 3a10baa9f8 cross-attn 2d: record use_bias attribute 2023-09-01 19:23:33 +02:00
Cédric Deltheil b933fabf31 unet: get rid of clip_embedding attribute for SD1
It is implicitly defined by the underlying cross-attention layer. This
also makes it consistent with SDXL.
2023-09-01 19:23:33 +02:00
Cédric Deltheil 134ee7b754 sdxl: remove wrong structural_attrs in cross-attn 2023-09-01 19:23:33 +02:00
Pierre Chapuis e91e31ebd2 check no two controlnets have the same name 2023-09-01 17:47:29 +02:00
Pierre Chapuis bd59790e08 always respect _can_refresh_parent 2023-09-01 17:44:16 +02:00
Pierre Chapuis d389d11a06 make basic adapters a part of Fluxion 2023-09-01 17:29:48 +02:00
Pierre Chapuis 31785f2059 scope range adapter in latent diffusion 2023-09-01 17:29:48 +02:00
Pierre Chapuis 73813310d0 rename SelfAttentionInjection to ReferenceOnlyControl and vice-versa 2023-09-01 17:29:48 +02:00
Pierre Chapuis eba0c33001 allow lora_targets to take a list of targets as input 2023-09-01 11:52:39 +02:00
Cédric Deltheil 92cdf19eae add Distribute to fluxion layers' __init__.py 2023-09-01 11:20:48 +02:00
Pierre Chapuis 9cf622a6e2 fix LoRA training script 2023-09-01 10:26:27 +02:00
Doryan Kaced 9f6733de8e Add concepts learning via textual inversion 2023-08-31 16:07:53 +02:00
Pierre Chapuis 0f476ea18b make high-level adapters Adapters
This generalizes the Adapter abstraction to higher-level
constructs such as high-level LoRA (targeting e.g. the
SD UNet), ControlNet and Reference-Only Control.

Some adapters now work by adapting child models with
"sub-adapters" that they inject / eject when needed.
2023-08-31 10:57:18 +02:00
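The Adapter pattern this commit generalizes can be sketched roughly as follows; the import path and the toy Doubler example are illustrative assumptions, not code from this repository.

    import refiners.fluxion.layers as fl
    from refiners.fluxion.adapters.adapter import Adapter  # assumed import path

    class Doubler(fl.Chain, Adapter[fl.Linear]):
        """Toy adapter (hypothetical): wraps a Linear target and doubles its output."""

        def __init__(self, target: fl.Linear) -> None:
            with self.setup_adapter(target):
                super().__init__(target, fl.Lambda(lambda x: x * 2))

    linear = fl.Linear(4, 4)
    parent = fl.Chain(linear)

    adapter = Doubler(linear)
    adapter.inject(parent)  # swap `linear` for the adapter inside `parent`
    adapter.eject()         # put the original module back

Higher-level adapters such as the ControlNet or Reference-Only Control ones mentioned above would then hold sub-adapters like this and drive their inject / eject calls when needed.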
Cédric Deltheil 7dc2e93cff tests: add test for clip image encoder
This covers a CLIPImageEncoderH model (Stable Diffusion v2-1-unclip)
specifically
2023-08-30 21:50:01 +02:00
Cédric Deltheil 3746d7f622 scripts: add converter for clip image encoder
Tested with:

    python scripts/conversion/convert_transformers_clip_image_model.py \
      --from /path/to/stabilityai/stable-diffusion-2-1-unclip
2023-08-30 21:50:01 +02:00
Cédric Deltheil d8004718c8 foundationals: add clip image encoder 2023-08-30 21:50:01 +02:00
Pierre Chapuis 32c1cfdbb1 add black to CI 2023-08-30 14:50:03 +02:00
Doryan Kaced 08a5341452 Make image resize configurable in training scripts 2023-08-30 14:05:29 +02:00
Doryan Kaced 437fa24368 Make horizontal flipping parameterizable in training scripts 2023-08-30 12:41:03 +02:00
Pierre Chapuis 18c84c7b72 shorter import paths 2023-08-29 16:57:40 +02:00
limiteinductive 8615dbdbde Add inner_dim Parameter to Attention Layer in Fluxion 2023-08-28 16:34:25 +02:00
limiteinductive 7ca6bd0ccd implement the ConvertModule class and refactor conversion scripts 2023-08-28 14:39:14 +02:00
Doryan Kaced 3680f9d196 Add support for learned concepts e.g. via textual inversion 2023-08-28 10:37:39 +02:00
Benjamin Trom 8b1719b1f9 remove unused TextEncoder and UNet protocols 2023-08-25 17:34:26 +02:00
limiteinductive a5f70b6d22 add .env to .gitignore 2023-08-25 16:37:50 +02:00
limiteinductive 92a21bc21e refactor latent_diffusion module 2023-08-25 12:30:20 +02:00
Pierre Chapuis 3ee0ccccdc update poetry 2023-08-24 19:08:48 +02:00
Pierre Chapuis d311f779c0 test all chain manipulation methods 2023-08-23 17:49:59 +02:00
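For context, the chain manipulation methods exercised by these tests work roughly along these lines (a hedged sketch; the list-like method semantics are assumed):

    import refiners.fluxion.layers as fl

    chain = fl.Chain(fl.Linear(8, 8), fl.ReLU())

    extra = fl.Linear(8, 8)
    chain.append(extra)         # add a module at the end
    chain.insert(1, fl.GeLU())  # insert at a given index
    chain.remove(extra)         # remove a module by reference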
Pierre Chapuis 802970e79a simplify Chain#append 2023-08-23 17:49:59 +02:00
Pierre Chapuis beacfe816b reordering (match chain.py order) 2023-08-23 17:49:59 +02:00
Pierre Chapuis e05c410a86 split test in two 2023-08-23 17:49:59 +02:00
Pierre Chapuis 337d2aea58 cosmetics 2023-08-23 17:49:59 +02:00
Pierre Chapuis 16618d73de remove useless uses of type: ignore 2023-08-23 17:49:59 +02:00
Pierre Chapuis 1065dfe10b add empty __init__.py files to make pytest happy
(otherwise it wants unique file basenames)
2023-08-23 17:49:59 +02:00
Pierre Chapuis a0c70ba7aa add a test for StopIteration in walk 2023-08-23 12:15:56 +02:00
Pierre Chapuis dec0d64432 make walk and layers not recurse by default
There is now a parameter to get the old (recursive) behavior.
2023-08-23 12:15:56 +02:00
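A hedged sketch of the new default described above, assuming `layers` and `walk` take a `recurse` flag as stated in the commit body:

    import refiners.fluxion.layers as fl

    inner = fl.Chain(fl.Linear(8, 8))
    outer = fl.Chain(inner)
    root = fl.Chain(outer)

    shallow = list(root.layers(fl.Chain))             # new default: no recursion
    deep = list(root.layers(fl.Chain, recurse=True))  # recurse=True restores the old behavior
    # walk takes the same flag and is likewise non-recursive by default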
Pierre Chapuis 2ad26a06b0 fix LoRAs on Self target 2023-08-23 12:13:01 +02:00