Commit graph

604 commits

Author SHA1 Message Date
Cédric Deltheil 12e37f5d85 controlnet: replace Lambda w/ Slicing basic layer 2023-09-12 15:37:33 +02:00
Pierre Chapuis 7a32699cc6 add ensure_find and ensure_find_parent helpers 2023-09-12 14:19:10 +02:00
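
The helpers added in 7a32699cc6 are not shown in this log; the sketch below only illustrates the usual ensure_find pattern (find, but fail loudly instead of returning None), assuming a `find` method that returns the first match or None. Names and signatures are illustrative, not the actual refiners API.

    from typing import Any, TypeVar

    T = TypeVar("T")

    def ensure_find(chain: Any, layer_type: type[T]) -> T:
        # like find(), but raise instead of silently returning None
        layer = chain.find(layer_type)
        if layer is None:
            raise ValueError(f"no {layer_type.__name__} found in {chain!r}")
        return layer
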
Pierre Chapuis b69dbc4e5c improve CrossAttentionAdapter test 2023-09-12 11:58:24 +02:00
Pierre Chapuis dc2c3e0163 implement CrossAttentionAdapter using chain operations 2023-09-12 11:58:24 +02:00
Pierre Chapuis 43075f60b0 do not use get_parameter_name in conversion script 2023-09-12 11:58:24 +02:00
Pierre Chapuis 3c056e2231 expose lookup_top_adapter 2023-09-12 11:58:24 +02:00
Cédric Deltheil f4e9707297 sdxl test: refreshed reference image
The former one was generated with SDXL 0.9 instead of 1.0. The new one has
been generated with diffusers:

    import torch
    from diffusers import StableDiffusionXLPipeline, DDIMScheduler

    noise_scheduler = DDIMScheduler(
        num_train_timesteps=1000,
        beta_start=0.00085,
        beta_end=0.012,
        beta_schedule="scaled_linear",
        clip_sample=False,
        set_alpha_to_one=False,
        steps_offset=1,
    )

    base_model_path = "/path/to/stabilityai/stable-diffusion-xl-base-1.0"

    device = "cuda"
    prompt = "a cute cat, detailed high-quality professional image"
    negative_prompt = "lowres, bad anatomy, bad hands, cropped, worst quality"
    seed = 2

    pipe = StableDiffusionXLPipeline.from_pretrained(
        base_model_path,
        scheduler=noise_scheduler,
        torch_dtype=torch.float16,
        add_watermarker=False,
    )
    pipe = pipe.to(device)
    generator = torch.Generator(device).manual_seed(seed)
    images = pipe(
        prompt=prompt,
        negative_prompt=negative_prompt,
        num_inference_steps=30,
        generator=generator,
    ).images
2023-09-12 10:59:26 +02:00
Cédric Deltheil 0e38928c8d sdxl test: add missing torch no_grad 2023-09-12 10:59:26 +02:00
Benjamin Trom b515c02867 add new basic layers and Matmul chain 2023-09-12 10:55:34 +02:00
Doryan Kaced 2f2510a9b1 Use bias correction on Prodigy 2023-09-12 10:44:05 +02:00
Cédric Deltheil 9364c0ea1c converters: get rid of default=True for --half 2023-09-11 21:49:24 +02:00
Cédric Deltheil 32cba1afd8 test_sdxl_double_encoder: use proper weights 2023-09-11 21:49:24 +02:00
Cédric Deltheil cc3b20320d make clip text converter support SDXL
i.e. convert the 2nd text encoder and save the final double text encoder
2023-09-11 21:49:24 +02:00
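
For context on cc3b20320d: SDXL ships two CLIP text encoders, so a converter supporting it has to export both and save a single combined checkpoint. A rough sketch of that idea using transformers and safetensors follows; the repository path and key prefixes are illustrative, not the converter's actual arguments or output layout.

    from safetensors.torch import save_file
    from transformers import CLIPTextModel, CLIPTextModelWithProjection

    repo = "stabilityai/stable-diffusion-xl-base-1.0"  # illustrative source checkpoint
    te1 = CLIPTextModel.from_pretrained(repo, subfolder="text_encoder")
    te2 = CLIPTextModelWithProjection.from_pretrained(repo, subfolder="text_encoder_2")

    # merge both state dicts under distinct prefixes and save a single file
    # (this key layout is hypothetical, not the converter's actual output format)
    state_dict = {f"text_encoder_1.{k}": v for k, v in te1.state_dict().items()}
    state_dict |= {f"text_encoder_2.{k}": v for k, v in te2.state_dict().items()}
    save_file(state_dict, "double_text_encoder.safetensors")
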
Pierre Chapuis be54cfc016 fix weight loading for float16 LoRAs 2023-09-11 16:14:19 +02:00
Pierre Chapuis dd0cca5855 use float32 reference for textual inversion (fixes tests on CPU) 2023-09-11 16:11:53 +02:00
Cédric Deltheil e5425e2968 make IP-Adapter generic for SD1 and SDXL 2023-09-08 16:38:01 +02:00
Cédric Deltheil 61858d9371 add CLIPImageEncoderG 2023-09-08 12:00:21 +02:00
Cédric Deltheil 946e7c2974 add threshold for clip image encoder conversion 2023-09-08 12:00:21 +02:00
Cédric Deltheil c6fadd1c81 deprecate bidirectional_mapping util 2023-09-07 18:43:20 +02:00
limiteinductive 2786117469 implement SDXL + e2e test on random init 2023-09-07 18:34:42 +02:00
limiteinductive 02af8e9f0b improve typing of ldm and sd1, introducing SD1Autoencoder class 2023-09-07 18:34:42 +02:00
Pierre Chapuis 78e69c7da0 fix typo + skip test if weights are not available 2023-09-07 17:31:20 +02:00
Benjamin Trom cf43cb191f Add better tree representation for fluxion Module 2023-09-07 16:33:24 +02:00
Pierre Chapuis d9a461e9b5 stop relying on SDXL 0.9 weights in test 2023-09-07 12:18:38 +02:00
Pierre Chapuis d54a38ae07 do not hardcode a CUDA device in tests 2023-09-06 19:33:48 +02:00
Cédric Deltheil c55917e293 add IP-Adapter support for SD 1.5
Official repo: https://github.com/tencent-ailab/IP-Adapter
2023-09-06 15:12:48 +02:00
Cédric Deltheil d4dd45fd4d use Module's load_from_safetensors
Instead of manual calls to load_state_dict
2023-09-06 15:06:51 +02:00
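
A rough before/after of what d4dd45fd4d describes; that `load_from_safetensors` takes a file path is assumed from the commit message, and the weight path is hypothetical.

    from safetensors.torch import load_file

    def load_weights_before(model, path: str) -> None:
        # manual approach: read the tensors, then push them into the module
        state_dict = load_file(path)
        model.load_state_dict(state_dict)

    def load_weights_after(model, path: str) -> None:
        # with a fluxion Module, a single call covers both steps (signature assumed)
        model.load_from_safetensors(path)
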
Pierre Chapuis 4388968ad3 Update tests/e2e/test_diffusion.py
Co-authored-by: Cédric Deltheil <355031+deltheil@users.noreply.github.com>
2023-09-06 11:49:55 +02:00
Pierre Chapuis 547a73e67a clarify the "adapting when a LoRA is injected" issue in tests 2023-09-06 11:49:55 +02:00
Pierre Chapuis 864937a776 support injecting several LoRAs simultaneously 2023-09-06 11:49:55 +02:00
limiteinductive 88efa117bf fix model comparison with custom layers 2023-09-05 12:34:38 +02:00
Pierre Chapuis 7651daa01f update checkout action 2023-09-04 15:54:10 +02:00
Pierre Chapuis 566656a539 fix text encoder LoRAs 2023-09-04 15:51:39 +02:00
limiteinductive ebfa51f662 Make breakpoint a ContextModule 2023-09-04 12:22:10 +02:00
limiteinductive 5327d894d1 bump pyright version to 1.1.325 in poetry.lock 2023-09-04 10:41:06 +02:00
limiteinductive 9d2fbf6dbd Fix tuple annotation for pyright 1.1.325 2023-09-04 10:41:06 +02:00
Doryan Kaced 44e184d4d5 Init dtype and device correctly for OutputBlock 2023-09-01 19:44:06 +02:00
Cédric Deltheil 3a10baa9f8 cross-attn 2d: record use_bias attribute 2023-09-01 19:23:33 +02:00
Cédric Deltheil b933fabf31 unet: get rid of clip_embedding attribute for SD1
It is implicitly defined by the underlying cross-attention layer. This
also makes it consistent with SDXL.
2023-09-01 19:23:33 +02:00
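
A toy illustration of the reasoning in b933fabf31: the CLIP embedding dimension can be read off the cross-attention layers that consume it instead of being stored as a separate UNet attribute. The classes below are made up for illustration and are not the refiners implementation.

    import torch.nn as nn

    class ToyCrossAttention(nn.Module):
        def __init__(self, query_dim: int, context_dim: int) -> None:
            super().__init__()
            self.to_k = nn.Linear(context_dim, query_dim, bias=False)

    class ToyUNet(nn.Module):
        def __init__(self) -> None:
            super().__init__()
            self.attn = ToyCrossAttention(query_dim=320, context_dim=768)

        @property
        def clip_embedding_dim(self) -> int:
            # derived from the cross-attention layer instead of stored separately
            return self.attn.to_k.in_features

    assert ToyUNet().clip_embedding_dim == 768
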
Cédric Deltheil 134ee7b754 sdxl: remove wrong structural_attrs in cross-attn 2023-09-01 19:23:33 +02:00
Pierre Chapuis e91e31ebd2 check no two controlnets have the same name 2023-09-01 17:47:29 +02:00
Pierre Chapuis bd59790e08 always respect _can_refresh_parent 2023-09-01 17:44:16 +02:00
Pierre Chapuis d389d11a06 make basic adapters a part of Fluxion 2023-09-01 17:29:48 +02:00
Pierre Chapuis 31785f2059 scope range adapter in latent diffusion 2023-09-01 17:29:48 +02:00
Pierre Chapuis 73813310d0 rename SelfAttentionInjection to ReferenceOnlyControl and vice versa 2023-09-01 17:29:48 +02:00
Pierre Chapuis eba0c33001 allow lora_targets to take a list of targets as input 2023-09-01 11:52:39 +02:00
Cédric Deltheil 92cdf19eae add Distribute to fluxion layers' __init__.py 2023-09-01 11:20:48 +02:00
Pierre Chapuis 9cf622a6e2 fix LoRA training script 2023-09-01 10:26:27 +02:00
Doryan Kaced 9f6733de8e Add concepts learning via textual inversion 2023-08-31 16:07:53 +02:00
Pierre Chapuis 0f476ea18b make high-level adapters Adapters
This generalizes the Adapter abstraction to higher-level
constructs such as high-level LoRA (targeting e.g. the
SD UNet), ControlNet and Reference-Only Control.

Some adapters now work by adapting child models with
"sub-adapters" that they inject / eject when needed.
2023-08-31 10:57:18 +02:00
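
A minimal, self-contained sketch of the sub-adapter pattern described in 0f476ea18b: a high-level adapter (e.g. a whole-UNet LoRA or a ControlNet) drives per-layer sub-adapters and injects or ejects them as a group. Class names and structure are illustrative only, not the refiners Adapter API.

    class SubAdapter:
        def __init__(self, name: str) -> None:
            self.name = name
            self.injected = False

        def inject(self) -> None:
            self.injected = True  # stands in for patching a child model in place

        def eject(self) -> None:
            self.injected = False  # stands in for restoring the child model

    class HighLevelAdapter:
        """E.g. a whole-UNet LoRA or a ControlNet built from per-layer sub-adapters."""

        def __init__(self, sub_adapters: list[SubAdapter]) -> None:
            self.sub_adapters = sub_adapters

        def inject(self) -> None:
            for sub in self.sub_adapters:
                sub.inject()

        def eject(self) -> None:
            for sub in self.sub_adapters:
                sub.eject()

    adapter = HighLevelAdapter([SubAdapter("down_blocks"), SubAdapter("up_blocks")])
    adapter.inject()
    assert all(sub.injected for sub in adapter.sub_adapters)
    adapter.eject()
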