Commit graph

172 commits

Author SHA1 Message Date
Pierre Chapuis ce0339b4cc add a get_path helper to modules 2024-01-26 19:31:13 +01:00
limiteinductive 0ee2d5e075 Fix warmup steps calculation when gradient_accumulation is used 2024-01-25 12:20:36 +01:00
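For context on the warmup fix above: the LR scheduler advances once per optimizer update, not once per training iteration, so a warmup length expressed in iterations has to be rescaled by the accumulation factor. A minimal sketch of that kind of adjustment (the names below are illustrative, not the actual refiners config fields):

    import math

    def warmup_optimizer_steps(warmup_iterations: int, gradient_accumulation: int) -> int:
        # With gradient accumulation, one optimizer update happens every
        # `gradient_accumulation` iterations, so the warmup length counted
        # in scheduler steps shrinks accordingly.
        return math.ceil(warmup_iterations / gradient_accumulation)

    assert warmup_optimizer_steps(200, 4) == 50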
Bryce 12a5439fc4 refactor: rename noise => predicted_noise
and in euler, `alt_noise` can now be simply `noise`
2024-01-24 18:15:10 +01:00
limiteinductive 3b458f0d8d fix test_names LoraManager test 2024-01-23 14:12:03 +01:00
limiteinductive 421da6a3b6 Load Multiple LoRAs with SDLoraManager 2024-01-23 14:12:03 +01:00
Cédric Deltheil 40c33b9595 rollback to 50 inference steps in IP-Adapter tests
Follow-up of 8a36c8c

This is what is used in the official notebooks (ip_adapter_demo.ipynb and
ip_adapter-plus_demo.ipynb)
2024-01-22 09:35:31 +01:00
limiteinductive ed3621362f Add load_tensors utils in fluxion 2024-01-21 12:34:33 +01:00
Pierre Colle 91aea9b7ff fix: summarize_tensor(tensor) when tensor.numel() == 0 2024-01-20 14:32:35 +01:00
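For context on this fix: reductions like min/max raise on empty tensors, so a tensor summary helper needs an early exit when numel() == 0. A minimal sketch of such a guard (hypothetical helper, not the actual refiners code):

    import torch

    def summarize(tensor: torch.Tensor) -> str:
        # Bail out early for empty tensors: torch.min / torch.max raise a
        # RuntimeError when tensor.numel() == 0.
        info = f"shape={tuple(tensor.shape)}, dtype={tensor.dtype}"
        if tensor.numel() == 0:
            return f"Tensor({info}, empty)"
        return f"Tensor({info}, min={tensor.min():.2f}, max={tensor.max():.2f})"

    print(summarize(torch.empty(0)))    # Tensor(shape=(0,), dtype=torch.float32, empty)
    print(summarize(torch.ones(2, 3)))  # Tensor(shape=(2, 3), dtype=torch.float32, min=1.00, max=1.00)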
Pierre Chapuis 8a36c8c279 make the first diffusion step a first class property of LDM & Schedulers 2024-01-19 18:52:45 +01:00
hugojarkoff 42b7749630 Fix references for e2e tests 2024-01-19 15:00:03 +01:00
Pierre Chapuis ce3035923b improve DPM solver test 2024-01-18 19:23:11 +01:00
hugojarkoff 17d9701dde Remove additional noise in final sample of DDIM inference process 2024-01-18 18:43:13 +01:00
limiteinductive a1f50f3f9d refactor Lora LoraAdapter and the latent_diffusion/lora file 2024-01-18 16:27:38 +01:00
limiteinductive 2b977bc69e fix broken self-attention guidance with ip-adapter
The #168 and #177 refactorings caused this regression. A new end-to-end
test has been added for proper coverage.

(This fix will be revisited at some point)
2024-01-16 17:21:24 +01:00
limiteinductive d9ae7ca6a5 cast to float32 before converting to image in tensor_to_image to fix bfloat16 conversion 2024-01-16 11:50:58 +01:00
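Background for this fix: NumPy has no bfloat16 dtype, so calling .numpy() on a bfloat16 tensor fails; casting to float32 first avoids the error. A minimal sketch (not the actual tensor_to_image implementation):

    import torch
    from PIL import Image

    def to_image(tensor: torch.Tensor) -> Image.Image:
        # NumPy has no bfloat16 dtype, so cast to float32 before .numpy();
        # half precision is upcast as well for simplicity.
        if tensor.dtype in (torch.bfloat16, torch.float16):
            tensor = tensor.to(torch.float32)
        array = (tensor.clamp(0, 1) * 255).byte().cpu().numpy()
        return Image.fromarray(array.squeeze(0).transpose(1, 2, 0))  # assumes a 1xCxHxW tensor

    image = to_image(torch.rand(1, 3, 64, 64, dtype=torch.bfloat16))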
limiteinductive 7f722029be add basic unit test for training_utils 2024-01-14 22:08:20 +01:00
Colle dba9065229 fix test_debug_print
Follow-up of #173
2024-01-12 18:32:22 +01:00
Colle c141091afc Make summarize_tensor robust to non-float dtypes (#171) 2024-01-11 09:57:58 +01:00
Cédric Deltheil ce0f9887a3 test_schedulers: fix pyright error
Due to changes in diffusers 0.25.0
2024-01-10 16:53:06 +01:00
Cédric Deltheil 6dbaec3e56 add end-to-end test for euler scheduler
Reference image generated with diffusers [1]

[1]: tests/e2e/test_diffusion_ref/README.md#expected-outputs
2024-01-10 16:53:06 +01:00
Cédric Deltheil 2b2b6740b7 fix or silence pyright issues 2024-01-10 16:53:06 +01:00
Cédric Deltheil 65f19d192f ruff fix 2024-01-10 16:53:06 +01:00
Cédric Deltheil ad143b0867 ruff format 2024-01-10 16:53:06 +01:00
Israfel Salazar 8423c5efa7 feature: Euler scheduler (#138) 2024-01-10 11:32:40 +01:00
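For reference, the Euler method integrates the diffusion ODE one noise level at a time: the denoised estimate gives a derivative, and the sample is moved along it to the next sigma. A hedged sketch of the generic update step (k-diffusion style; not necessarily the exact refiners implementation):

    import torch

    def euler_step(x: torch.Tensor, denoised: torch.Tensor, sigma: float, sigma_next: float) -> torch.Tensor:
        # Derivative of the probability-flow ODE at the current noise level.
        d = (x - denoised) / sigma
        # Plain Euler step from sigma to sigma_next (sigma_next is 0 at the last step).
        return x + d * (sigma_next - sigma)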
limiteinductive c9e973ba41 refactor CrossAttentionAdapter to work with context. 2024-01-08 15:20:23 +01:00
hugojarkoff 00f494efe2 SegmentAnything: add dense mask prompt support 2024-01-05 18:53:25 +01:00
limiteinductive 20c229903f upgrade pyright to 1.1.342 ; improve no_grad typing 2023-12-29 15:09:02 +01:00
Cédric Deltheil 22ce3fd033 sam: wrap high-level methods with no_grad 2023-12-19 21:45:23 +01:00
Cédric Deltheil e7892254eb dinov2: add some coverage for registers
Registers are not supported in HF yet, so the output is just compared
against a precomputed norm. Note: in the initial PR [1] the Refiners
implementation was tested against the official code using Torch Hub.

[1]:
https://github.com/finegrain-ai/refiners/pull/132#issuecomment-1852021656
2023-12-18 10:29:28 +01:00
Cédric Deltheil 68cc346905 add minimal unit tests for DINOv2
To be completed with tests using image preprocessing, e.g. test cosine
similarity on a relevant pair of images
2023-12-18 10:29:28 +01:00
Benjamin Trom e2f2e33add Update tests/fluxion/layers/test_basics.py
Co-authored-by: Cédric Deltheil <355031+deltheil@users.noreply.github.com>
2023-12-13 17:03:28 +01:00
limiteinductive 7d9ceae274 change default behavior of end to None 2023-12-13 17:03:28 +01:00
Cédric Deltheil 82a2aa1ec4 deprecate DDPM step which is unused for now 2023-12-13 15:51:42 +01:00
limiteinductive a7551e0392 Change fl.Slicing API 2023-12-13 09:38:13 +01:00
Cédric Deltheil 315b4ed2e4 test_schedulers: enforce manual seed 2023-12-12 17:26:14 +01:00
Cédric Deltheil 792a0fc3d9 run lint rules using latest isort settings 2023-12-11 11:58:43 +01:00
limiteinductive 86c54977b9 replace poetry with rye for python dependency management
Co-authored-by: Cédric Deltheil <cedric@deltheil.me>
Co-authored-by: Pierre Chapuis <git@catwell.info>
2023-12-08 17:40:10 +01:00
limiteinductive 0dc3a17fbf remove unnecessary test 2023-12-04 15:27:06 +01:00
limiteinductive 8bbcf2048d add README bullet point 2023-12-04 15:27:06 +01:00
limiteinductive 37a74bd549 format test_scheduler file 2023-12-04 15:27:06 +01:00
limiteinductive 90db6ef59d add e2e test for sd15 with karras noise schedule 2023-12-04 15:27:06 +01:00
limiteinductive 6f110ee2b2 fix test_scheduler_utils 2023-12-04 15:27:06 +01:00
Pierre Chapuis f22f969d65 remove Black preview mode
also fix multiline logs in training
2023-12-04 14:15:56 +01:00
Cédric Deltheil b306c7db1b freeu: add one more test for identity scales
It should act as a NOP when [1.0, 1.0] is used for backbone and skip
scales.
2023-12-01 12:48:19 +01:00
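The identity case is straightforward to test: with backbone and skip scales of 1.0, the adapted UNet should produce the same output as the plain one. A rough pytest-style sketch under assumed names (the adapter class, its constructor arguments and the fixtures are assumptions, not the verified refiners API):

    import torch

    def test_freeu_identity(unet, freeu_adapter_cls) -> None:
        # `unet` and `freeu_adapter_cls` stand in for the real fixtures; the
        # adapter is assumed to take per-level backbone and skip scales.
        x = torch.randn(1, 4, 64, 64)
        expected = unet(x)
        adapter = freeu_adapter_cls(unet, backbone_scales=[1.0, 1.0], skip_scales=[1.0, 1.0])
        adapter.inject()
        assert torch.allclose(unet(x), expected, atol=1e-5)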
Cédric Deltheil ab0915d052 add tests for FreeU 2023-11-18 16:15:44 +01:00
Pierre Chapuis 02f3c46e2e update pyright 2023-10-25 14:56:07 +02:00
Benjamin Trom ea44262a39 unnest Residual subchain by modifying its forward
Also replaced the remaining Sum-Identity layers with Residual.

The tolerance used to compare SAM's ViT models has been tweaked: for
some reason there is a small difference (in float32) in the neck layer
(first conv2D)

Co-authored-by: Cédric Deltheil <cedric@deltheil.me>
2023-10-19 10:34:51 +02:00
Cédric Deltheil 46dd710076 test_converter: use proper exception type
Follow-up of #102
2023-10-18 14:39:24 +02:00
Benjamin Trom 6ddd901767 improve image_to_tensor and tensor_to_image utils 2023-10-17 18:08:58 +02:00
limiteinductive 7a62049d54 implement Restart method for latent diffusion 2023-10-12 15:48:43 +02:00
Benjamin Trom 0024191c58 improve debug print for chains 2023-10-10 15:25:09 +02:00
Benjamin Trom a663375dc7 prevent setattr of a PyTorch module from registering it on the Chain class 2023-10-10 14:46:15 +02:00
Cédric Deltheil d02be0d10e tests: update ref image for SDXL IP-Adapter plus
Note: https://pytorch.org/docs/stable/notes/randomness.html

> Completely reproducible results are not guaranteed across PyTorch
> releases [...]
2023-10-10 14:19:47 +02:00
Cédric Deltheil b80769939d add support for self-attention guidance
See https://arxiv.org/abs/2210.00939
2023-10-09 17:33:15 +02:00
Cédric Deltheil 05126c8f4d make gaussian_blur work with float16 2023-10-07 21:48:38 +02:00
Cédric Deltheil 7d2abf6fbc scheduler: add remove noise
aka original sample prediction (or predict x0)

E.g. useful for methods like self-attention guidance (see equation (2)
in https://arxiv.org/pdf/2210.00939.pdf)
2023-10-05 17:05:15 +02:00
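The formula behind "remove noise" is the standard x0 prediction: invert the forward diffusion process at the current timestep using the predicted noise. A short sketch (variable names illustrative):

    import torch

    def predict_original_sample(noisy: torch.Tensor, predicted_noise: torch.Tensor, alpha_cumprod_t: float) -> torch.Tensor:
        # x_t = sqrt(alpha_bar_t) * x_0 + sqrt(1 - alpha_bar_t) * eps
        # => x_0 = (x_t - sqrt(1 - alpha_bar_t) * eps) / sqrt(alpha_bar_t)
        return (noisy - (1 - alpha_cumprod_t) ** 0.5 * predicted_noise) / alpha_cumprod_t ** 0.5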
Cédric Deltheil 665bcdc95c add unit tests covering fluxion's gaussian_blur 2023-10-05 16:30:27 +02:00
Cédric Deltheil 338042f332 test_diffusion: remove debug leftovers 2023-09-29 18:54:24 +02:00
Cédric Deltheil 5fc6767a4a add IP-Adapter plus (aka fine-grained features) 2023-09-29 15:23:43 +02:00
Cédric Deltheil 63f5723449 test_concepts: silence static type checker error 2023-09-25 13:54:26 +02:00
Cédric Deltheil f37f25a2e4 add e2e test for T2I-Adapter XL canny 2023-09-25 13:54:26 +02:00
Cédric Deltheil 4301e81eb3 add e2e test for T2I-Adapter depth
Expected output generated with diffusers' StableDiffusionAdapterPipeline
2023-09-25 13:54:26 +02:00
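For reference, the diffusers pipeline mentioned above is typically driven along these lines (a sketch only; the model ids and the depth conditioning image are placeholders, and exact arguments may differ across diffusers versions):

    import torch
    from PIL import Image
    from diffusers import StableDiffusionAdapterPipeline, T2IAdapter

    adapter = T2IAdapter.from_pretrained("TencentARC/t2iadapter_depth_sd15v2", torch_dtype=torch.float16)
    pipe = StableDiffusionAdapterPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5", adapter=adapter, torch_dtype=torch.float16
    ).to("cuda")

    depth = Image.open("depth_map.png")  # placeholder conditioning image
    generator = torch.Generator("cuda").manual_seed(2)
    image = pipe("a photo of a room", image=depth, num_inference_steps=30, generator=generator).images[0]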
Cédric Deltheil d72e1d3478 chain: add insert_before_type 2023-09-25 13:54:26 +02:00
Doryan Kaced 251277a0a8 Fix module registration in IP-Adapter 2023-09-22 17:34:55 +02:00
Pierre Chapuis cd1fdb5585 fix scheduler device choice 2023-09-21 12:00:19 +02:00
Benjamin Trom 282578ddc0 add Segment Anything (SAM) to foundational models
Note: support for dense prompts (i.e. masks) is still partial (see MaskEncoder)

Co-authored-by: Cédric Deltheil <cedric@deltheil.me>
2023-09-21 11:44:30 +02:00
Cédric Deltheil 2faff9f57a ldm: properly resize non-square init image 2023-09-20 10:27:22 +02:00
Benjamin Trom 01aeaf3e36 add unit test for multi_diffusion 2023-09-19 15:30:50 +02:00
Pierre Chapuis fc2390ad1c fix legacy wording for refonly control 2023-09-14 11:21:11 +02:00
Pierre Chapuis 0e0c39b4b5 black 2023-09-13 17:02:47 +02:00
Pierre Chapuis c421cfd56c add a test for IP-Adapter + ControlNet 2023-09-13 14:24:53 +02:00
Pierre Chapuis cf9efb57c8 remove useless torch.no_grad() contexts 2023-09-13 11:14:09 +02:00
Cédric Deltheil eea340c6c4 add support for SDXL IP-Adapter
This only supports the latest SDXL IP-Adapter release (2023.9.8) which
builds upon the ViT-H/14 CLIP image encoder.
2023-09-12 18:00:39 +02:00
Cédric Deltheil 1b4dcebe06 make scheduler an actual abstract base class 2023-09-12 16:47:47 +02:00
Pierre Chapuis 7a32699cc6 add ensure_find and ensure_find_parent helpers 2023-09-12 14:19:10 +02:00
Pierre Chapuis b69dbc4e5c improve CrossAttentionAdapter test 2023-09-12 11:58:24 +02:00
Pierre Chapuis dc2c3e0163 implement CrossAttentionAdapter using chain operations 2023-09-12 11:58:24 +02:00
Pierre Chapuis 3c056e2231 expose lookup_top_adapter 2023-09-12 11:58:24 +02:00
Cédric Deltheil f4e9707297 sdxl test: refreshed reference image
The former was generated with SDXL 0.9 instead of 1.0. The new one has been
generated with diffusers:

    import torch
    from diffusers import StableDiffusionXLPipeline, DDIMScheduler

    noise_scheduler = DDIMScheduler(
        num_train_timesteps=1000,
        beta_start=0.00085,
        beta_end=0.012,
        beta_schedule="scaled_linear",
        clip_sample=False,
        set_alpha_to_one=False,
        steps_offset=1,
    )

    base_model_path = "/path/to/stabilityai/stable-diffusion-xl-base-1.0"

    device = "cuda"
    prompt = "a cute cat, detailed high-quality professional image"
    negative_prompt = "lowres, bad anatomy, bad hands, cropped, worst quality"
    seed = 2

    pipe = StableDiffusionXLPipeline.from_pretrained(base_model_path, scheduler=noise_scheduler, torch_dtype=torch.float16, add_watermarker=False)
    pipe = pipe.to(device)
    generator = torch.Generator(device).manual_seed(seed)
    images = pipe(prompt=prompt, negative_prompt=negative_prompt, num_inference_steps=30, generator=generator).images
2023-09-12 10:59:26 +02:00
Cédric Deltheil 0e38928c8d sdxl test: add missing torch no_grad 2023-09-12 10:59:26 +02:00
Cédric Deltheil 32cba1afd8 test_sdxl_double_encoder: use proper weights 2023-09-11 21:49:24 +02:00
Pierre Chapuis be54cfc016 fix weight loading for float16 LoRAs 2023-09-11 16:14:19 +02:00
Pierre Chapuis dd0cca5855 use float32 reference for textual inversion (fixes tests on CPU) 2023-09-11 16:11:53 +02:00
Cédric Deltheil e5425e2968 make IP-Adapter generic for SD1 and SDXL 2023-09-08 16:38:01 +02:00
limiteinductive 2786117469 implement SDXL + e2e test on random init 2023-09-07 18:34:42 +02:00
Pierre Chapuis 78e69c7da0 fix typo + skip test if weights are not available 2023-09-07 17:31:20 +02:00
Pierre Chapuis d9a461e9b5 stop relying on SDXL 0.9 weights in test 2023-09-07 12:18:38 +02:00
Pierre Chapuis d54a38ae07 do not hardcode a CUDA device in tests 2023-09-06 19:33:48 +02:00
Cédric Deltheil c55917e293 add IP-Adapter support for SD 1.5
Official repo: https://github.com/tencent-ailab/IP-Adapter
2023-09-06 15:12:48 +02:00
Cédric Deltheil d4dd45fd4d use Module's load_from_safetensors
Instead of manual calls to load_state_dict
2023-09-06 15:06:51 +02:00
Pierre Chapuis 4388968ad3 Update tests/e2e/test_diffusion.py
Co-authored-by: Cédric Deltheil <355031+deltheil@users.noreply.github.com>
2023-09-06 11:49:55 +02:00
Pierre Chapuis 547a73e67a clarify the "adapting when a LoRA is injected" issue in tests 2023-09-06 11:49:55 +02:00
Pierre Chapuis 864937a776 support injecting several LoRAs simultaneously 2023-09-06 11:49:55 +02:00
limiteinductive 88efa117bf fix model comparison with custom layers 2023-09-05 12:34:38 +02:00
Cédric Deltheil b933fabf31 unet: get rid of clip_embedding attribute for SD1
It is implicitly defined by the underlying cross-attention layer. This
also makes it consistent with SDXL.
2023-09-01 19:23:33 +02:00
Pierre Chapuis e91e31ebd2 check no two controlnets have the same name 2023-09-01 17:47:29 +02:00
Pierre Chapuis d389d11a06 make basic adapters a part of Fluxion 2023-09-01 17:29:48 +02:00
Pierre Chapuis 31785f2059 scope range adapter in latent diffusion 2023-09-01 17:29:48 +02:00
Pierre Chapuis 73813310d0 rename SelfAttentionInjection to ReferenceOnlyControl and vice-versa 2023-09-01 17:29:48 +02:00
Doryan Kaced 9f6733de8e Add concepts learning via textual inversion 2023-08-31 16:07:53 +02:00