Commit graph

319 commits

Author SHA1 Message Date
Pierre Chapuis 03b79d6d34 rename ResidualBlock to ConditionScaleBlock in LCM 2024-02-21 16:37:27 +01:00
Pierre Chapuis 684e2b9a47 add docstrings for LCM / LCM-LoRA 2024-02-21 16:37:27 +01:00
Pierre Chapuis 12b6829a26 add support for LCM LoRA weights loading 2024-02-21 16:37:27 +01:00
Pierre Chapuis fafe5f8f5a Improve filtering when auto-attaching LoRAs.
Also support debug output to help diagnose bad mappings.
2024-02-21 16:37:27 +01:00
Pierre Chapuis f8d55ccb20 add LcmAdapter
This adds support for the condition scale embedding.
Also updates the UNet converter to support LCM.
2024-02-21 16:37:27 +01:00
Pierre Chapuis c8c6294550 add LCMSolver (Latent Consistency Models) 2024-02-21 16:37:27 +01:00
Pierre Chapuis 4a619e84f0 support disabling CFG in LatentDiffusionModel 2024-02-21 16:37:27 +01:00
Pierre Colle d199cd4f24 batch sdxl + sd1 + compute_clip_text_embedding
Co-authored-by: Cédric Deltheil <355031+deltheil@users.noreply.github.com>
2024-02-21 15:17:11 +01:00
Cédric Deltheil 5ab5d7fd1c import ControlLoraAdapter part of latent_diffusion 2024-02-19 14:11:32 +01:00
Laurent 2a3e353f04 enable StyleAligned related docstrings in mkdocstrings 2024-02-15 15:22:47 +01:00
Laurent efa3988638 implement StyleAlignedAdapter 2024-02-15 15:22:47 +01:00
limiteinductive 432e32f94f rename Scheduler -> LRScheduler 2024-02-15 11:48:36 +01:00
Laurent 684303230d export ControlLora and ControlLoraAdapter in refiners.foundationals.latent_diffusion.stable_diffusion_xl 2024-02-15 11:32:49 +01:00
Laurent 41a5ce2052 implement ControlLora and ControlLoraAdapter 2024-02-14 18:20:46 +01:00
Laurent a54808e757 add context_key getter and setter to RangeAdapter2d 2024-02-14 18:20:46 +01:00
Laurent 35b6e2f7c5 add context_key getter and setter to TimestepEncoder 2024-02-14 18:20:46 +01:00
Laurent 0230971543 simplify is_compatible in lora.py 2024-02-14 18:20:46 +01:00
Pierre Chapuis 35868ba34b Move helper to attach several LoRAs from SD to Fluxion 2024-02-14 13:35:46 +01:00
limiteinductive bec845553f update deprecated validator for field_validator 2024-02-13 18:35:51 +01:00
limiteinductive ab506b4db2 fix bug that was causing double registration 2024-02-13 11:12:13 +01:00
limiteinductive 3488273f50 Enforce correct subtype for the config param in both decorators
Also add a custom ModelConfig for the MockTrainer test

Update src/refiners/training_utils/config.py

Co-authored-by: Cédric Deltheil <355031+deltheil@users.noreply.github.com>
2024-02-12 16:21:04 +01:00
limiteinductive 0caa72a082 remove deprecated on_checkpoint_save 2024-02-12 16:21:04 +01:00
limiteinductive cef8a9936c refactor register_model decorator 2024-02-12 16:21:04 +01:00
limiteinductive d6546c9026 add @register_model and @register_callback decorators
Refactor ClockTrainer to include Callback
2024-02-12 10:24:19 +01:00
limiteinductive f541badcb3 Allow optional train ModelConfig + forbid extra input for configs 2024-02-10 16:13:10 +01:00
Pierre Chapuis 402d3105b4 support multiple IP adapter inputs as tensor 2024-02-09 17:16:17 +01:00
Cédric Deltheil 5a7085bb3a training_utils/config.py: inline type alias
Follow up of #227
2024-02-09 14:36:22 +01:00
Pierre Colle 25bfa78907 lr, betas, eps, weight_decay at model level
Co-authored-by: Cédric Deltheil <355031+deltheil@users.noreply.github.com>
2024-02-09 12:05:13 +01:00
Colle f4aa0271b8 less than 1 epoch training duration 2024-02-08 19:20:31 +01:00
limiteinductive 41508e0865 change param name of abstract get_item method 2024-02-08 18:52:52 +01:00
Cédric Deltheil e36dda63fd fix miscellaneous typos 2024-02-07 17:51:25 +01:00
Pierre Chapuis 396d166564 make pad method private 2024-02-07 17:47:14 +01:00
Pierre Chapuis 4d85918336 Update src/refiners/foundationals/latent_diffusion/lora.py
Co-authored-by: Laureηt <laurent@lagon.tech>
2024-02-07 17:47:14 +01:00
Pierre Chapuis b1c200c63a Update src/refiners/foundationals/latent_diffusion/lora.py
Co-authored-by: Laureηt <laurent@lagon.tech>
2024-02-07 17:47:14 +01:00
Pierre Chapuis eb9abefe07 add a few comments in SDLoraManager 2024-02-07 17:47:14 +01:00
Benjamin Trom bbe0759151 fix docstring 2024-02-07 16:13:01 +01:00
limiteinductive 2e526d35d1 Make Dataset part of the trainer 2024-02-07 16:13:01 +01:00
Laurent 9883f24f9a (fluxion/layers) remove View layer
+ replace existing `View` layers by `Reshape`
2024-02-07 12:06:07 +01:00
limiteinductive 2ef4982e04 remove wandb from base config 2024-02-07 11:06:59 +01:00
Pierre Chapuis 11da76f7df fix sdxl structural copy 2024-02-07 10:51:26 +01:00
Pierre Chapuis ca9e89b22a cosmetics 2024-02-07 10:51:26 +01:00
limiteinductive ea05f3d327 make device and dtype work in Trainer class 2024-02-06 23:10:10 +01:00
Pierre Chapuis 98fce82853 fix 37425fb609
Things to understand:

- subscripted generic basic types (e.g. `list[int]`) are types.GenericAlias;
- subscripted generic classes are `typing._GenericAlias`;
- neither can be used with `isinstance()`;
- get_origin is the cleanest way to check for this.
2024-02-06 13:49:37 +01:00
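A minimal sketch of the distinction described in the commit body above (the `Foo` class below is a hypothetical example, not code from the repository):

```python
import types
from typing import Generic, TypeVar, get_origin

T = TypeVar("T")

class Foo(Generic[T]):  # hypothetical user-defined generic class
    ...

print(type(list[int]) is types.GenericAlias)  # True: subscripted builtin generic
print(type(Foo[int]))                         # <class 'typing._GenericAlias'>

try:
    isinstance([], list[int])  # neither form can be used with isinstance()
except TypeError as err:
    print(err)

print(get_origin(list[int]))  # <class 'list'>  -> subscripted generic
print(get_origin(Foo[int]))   # <class '__main__.Foo'>  -> subscripted generic
print(get_origin(int))        # None  -> plain type
```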
Pierre Chapuis 37425fb609 make LoRA generic 2024-02-06 11:32:18 +01:00
Pierre Chapuis 471ef91d1c make __getattr__ on Module return object, not Any
PyTorch chose to make it Any because it expects its users' code
to be "highly dynamic": https://github.com/pytorch/pytorch/pull/104321

That is not the case for us: in Refiners, untyped code
runs contrary to one of our core principles.

Note that there is currently an open PR in PyTorch to
return `Module | Tensor`, but in practice this is not always
correct either: https://github.com/pytorch/pytorch/pull/115074

I also moved Residuals-related code from SD1 to latent_diffusion
because SDXL should not depend on SD1.
2024-02-06 11:32:18 +01:00
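For illustration only (this is not the actual Refiners implementation), a typed override along those lines can look like:

```python
import torch

class Module(torch.nn.Module):
    # Typing __getattr__ as object (instead of Any) forces call sites to
    # narrow the type (cast or isinstance check) before using the attribute,
    # so untyped values do not propagate silently through the codebase.
    def __getattr__(self, name: str) -> object:
        return super().__getattr__(name)
```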
Laurent 8d190e4256 (fluxion/layers/activations) replace ApproximateGeLU by GeLUApproximation 2024-02-02 19:41:18 +01:00
Pierre Chapuis fbb1fcb8ff Chain#pop does not return tuples 2024-02-02 18:11:51 +01:00
Laurent 1dcb36e1e0 (doc/foundationals) add IPAdapter, related docstrings 2024-02-02 17:35:03 +01:00
Laurent 6b35f1cc84 (doc/foundationals) add SDLoraManager, related docstrings 2024-02-02 17:35:03 +01:00
Laurent 7406d8e01f (mkdocs) fix cross-reference typo 2024-02-02 17:35:03 +01:00
Laurent 093527a7de apply @deltheil suggestions 2024-02-02 17:35:03 +01:00
Laurent f62e71da1c (doc/foundationals) add SegmentAnything, related docstrings 2024-02-02 17:35:03 +01:00
Laurent a926696141 (doc/foundationals) add CLIP, related docstrings 2024-02-02 17:35:03 +01:00
Laurent 3910845e29 (doc/foundationals) add DINOv2, related docstrings 2024-02-02 17:35:03 +01:00
Laurent fc7b4dd62d (doc/fluxion/ld) add DDPM, DDIM, DPM++ and Euler docstrings 2024-02-02 17:35:03 +01:00
Laurent a1a00998ea (doc/fluxion/ld) add StableDiffusion_1 docstrings 2024-02-02 17:35:03 +01:00
Laurent f2bcb7f45e (mkdocstrings) export SDXLAutoencoder in src/refiners/foundationals/latent_diffusion/stable_diffusion_xl/__init__.py 2024-02-02 17:35:03 +01:00
Laurent 2a7b86ac02 (doc/fluxion/ld) add LatentDiffusionAutoencoder docstrings 2024-02-02 17:35:03 +01:00
Laurent effd95a1bd (doc/fluxion/ld) add SDXLAutoencoder docstrings 2024-02-02 17:35:03 +01:00
Laurent 0c5a7a8269 (doc/fluxion/ld) add Solver docstrings 2024-02-02 17:35:03 +01:00
Laurent 289261f2fb (doc/fluxion/ld) add SD1UNet docstrings 2024-02-02 17:35:03 +01:00
Laurent fae08c058e (doc/fluxion/ld) add SDXLUNet docstrings 2024-02-02 17:35:03 +01:00
Laurent 7307a3686e (docstrings) apply @deltheil suggestions 2024-02-02 11:08:21 +01:00
Laurent fe53cda5e2 (doc/fluxion/adapter) add/convert docstrings to mkdocstrings format 2024-02-02 11:08:21 +01:00
Laurent 4054421854 (doc/fluxion/lora) add/convert docstrings to mkdocstrings format 2024-02-02 11:08:21 +01:00
Laurent 9fb9df5f91 (doc/fluxion) export Activation and ScaledDotProductAttention 2024-02-02 11:08:21 +01:00
Laurent c7fd1496b5 (doc/fluxion/chain) add/convert docstrings to mkdocstrings format 2024-02-02 11:08:21 +01:00
Laurent 77b97b3c8e (doc/fluxion/basic) add/convert docstrings to mkdocstrings format 2024-02-02 11:08:21 +01:00
Laurent a7c048f5fb (doc/fluxion/activations) add/convert docstrings to mkdocstrings format 2024-02-02 11:08:21 +01:00
Laurent 0fc3264fae (doc/fluxion/attention) add/convert docstrings to mkdocstrings format 2024-02-02 11:08:21 +01:00
Laurent e3238a6af5 (doc/fluxion/module) add/convert docstrings to mkdocstrings format 2024-02-02 11:08:21 +01:00
Laurent c31da03bad (doc/fluxion/model_converter) add/convert docstrings to mkdocstrings format 2024-02-02 11:08:21 +01:00
Laurent 9590703f99 (doc/fluxion/context) add/convert docstrings to mkdocstrings format 2024-02-02 11:08:21 +01:00
Laurent e79c2bdde5 (doc/fluxion/utils) add/convert docstrings to mkdocstrings format 2024-02-02 11:08:21 +01:00
Laurent 12a8dd6c85 (doc/fluxion/linear) add/convert docstrings to mkdocstrings format 2024-02-02 11:08:21 +01:00
Laurent cf20621894 (doc/fluxion/conv) add/convert docstrings to mkdocstrings format 2024-02-02 11:08:21 +01:00
Laurent beb6dfb1c4 (doc/fluxion/embedding) add/convert docstrings to mkdocstrings format 2024-02-02 11:08:21 +01:00
Laurent fc824bd53d (doc/fluxion/converter) add/convert docstrings to mkdocstrings format 2024-02-02 11:08:21 +01:00
Laurent 6d09974b8d (doc/fluxion/padding) add/convert docstrings to mkdocstrings format 2024-02-02 11:08:21 +01:00
Laurent 08349d97d7 (doc/fluxion/sampling) add/convert docstrings to mkdocstrings format 2024-02-02 11:08:21 +01:00
Laurent be75f68893 (doc/fluxion/pixelshuffle) add/convert docstrings to mkdocstrings format 2024-02-02 11:08:21 +01:00
Laurent 18682f8155 (doc/fluxion/maxpool) add/convert docstrings to mkdocstrings format 2024-02-02 11:08:21 +01:00
Laurent 49847658e9 (doc/fluxion/norm) add/convert docstrings to mkdocstrings format 2024-02-02 11:08:21 +01:00
Colle 4a6146bb6c clip text, lda encode batch inputs
* text_encoder([str1, str2])
* lda decode_latents/encode_image image_to_latent/latent_to_image
* images_to_tensor, tensor_to_images
---------
Co-authored-by: Cédric Deltheil <355031+deltheil@users.noreply.github.com>
2024-02-01 17:05:28 +01:00
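The batch helpers named in this commit message can be exercised roughly as below; the import path is an assumption based on the message, not a verified API reference:

```python
# Illustrative sketch: the import path is assumed, adjust to the actual package layout.
from PIL import Image
from refiners.fluxion.utils import images_to_tensor, tensor_to_images

images = [Image.new("RGB", (64, 64)), Image.new("RGB", (64, 64))]
batch = images_to_tensor(images)  # list of PIL images -> one (N, C, H, W) tensor
back = tensor_to_images(batch)    # one (N, C, H, W) tensor -> list of PIL images
```

Per the commit message, the text encoder and the LDA expose similarly batched entry points (`text_encoder([str1, str2])`, `image_to_latent` / `latent_to_image`).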
Pierre Chapuis 12aa0b23f6 remove stochastic Euler
It was untested and likely doesn't work.
We will re-introduce it later if needed.
2024-02-01 16:17:07 +01:00
Pierre Chapuis 5ac5373310 add a test for SDXL with sliced attention 2024-02-01 16:17:07 +01:00
Pierre Chapuis df843f5226 test SAG setter 2024-02-01 16:17:07 +01:00
Pierre Chapuis 0e77ef1720 add inject / eject test for concept extender (+ better errors) 2024-02-01 16:17:07 +01:00
Pierre Chapuis bca50b71f2 test (and fix) basic_attributes 2024-02-01 16:17:07 +01:00
Pierre Chapuis be961af4d9 remove Chain.__add__ 2024-02-01 16:17:07 +01:00
Pierre Chapuis 07954a55ab remove unused Conv1d 2024-02-01 16:17:07 +01:00
Pierre Chapuis 86867e9318 remove unused class CrossAttention in SAM 2024-02-01 16:17:07 +01:00
Pierre Chapuis a1ad317b00 remove Buffer 2024-02-01 16:17:07 +01:00
Pierre Chapuis e6be1394ff remove unused Chunk and Unbind layers 2024-02-01 16:17:07 +01:00
Pierre Chapuis c57f2228f8 remove unused helper (since LoRA refactoring) 2024-02-01 16:17:07 +01:00
Pierre Chapuis ae19892d1d remove unused ViT variations 2024-02-01 16:17:07 +01:00
Pierre Chapuis 849c0058df remove unused dunder methods on ContextProvider 2024-02-01 16:17:07 +01:00
limiteinductive abe50076a4 add NoiseSchedule to solvers __init__ + simplify some import pathing
further improve import pathing
2024-01-31 17:03:52 +01:00
limiteinductive 73f6ccfc98 make Scheduler a fl.Module + Change name Scheduler -> Solver 2024-01-31 17:03:52 +01:00
Pierre Chapuis 7eb8eb4c68 add support for pytorch 2.2 (2.1 is still supported)
also bump all dev dependencies to their latest version
2024-01-31 15:03:06 +01:00