Pierre Chapuis
cce2a98fa6
add sanity check to auto_attach_loras
2024-03-08 15:43:57 +01:00
Pierre Chapuis
1eb71077aa
use same scale setter / getter interface for all controls
2024-03-08 11:29:28 +01:00
Laurent
5e7986ef08
adding more log messages in training_utils
2024-03-08 10:52:14 +01:00
Pierre Chapuis
be2368cf20
ruff 3 formatting (Rye 0.28)
2024-03-08 10:42:05 +01:00
Pierre Chapuis
91d1b46aa9
Add a note that we mitigate non-zero SNR in DDIM.
2024-02-26 12:14:02 +01:00
Pierre Chapuis
7f51d18045
clarify that add_lcm_lora can load SDXL-Lightning
2024-02-26 12:14:02 +01:00
Pierre Chapuis
7e4e0f0650
correctly scale init latents for Euler scheduler
2024-02-26 12:14:02 +01:00
Pierre Chapuis
bf0ba58541
refactor solver params, add sample prediction type
2024-02-26 12:14:02 +01:00
Pierre Chapuis
ddc1cf8ca7
refactor solvers to support different timestep spacings
2024-02-26 12:14:02 +01:00
Cédric Deltheil
176807740b
control_lora: fix adapter set scale
...
The adapter set scale did not propagate the scale to the underlying zero convolutions. The value set at constructor time was used instead.
Follow up of #285
2024-02-22 10:01:05 +01:00
Pierre Chapuis
03b79d6d34
rename ResidualBlock to ConditionScaleBlock in LCM
2024-02-21 16:37:27 +01:00
Pierre Chapuis
684e2b9a47
add docstrings for LCM / LCM-LoRA
2024-02-21 16:37:27 +01:00
Pierre Chapuis
12b6829a26
add support for LCM LoRA weights loading
2024-02-21 16:37:27 +01:00
Pierre Chapuis
fafe5f8f5a
Improve filtering when auto-attaching LoRAs.
...
Also support debug output to help diagnose bad mappings.
2024-02-21 16:37:27 +01:00
Pierre Chapuis
f8d55ccb20
add LcmAdapter
...
This adds support for the condition scale embedding.
Also updates the UNet converter to support LCM.
2024-02-21 16:37:27 +01:00
Pierre Chapuis
c8c6294550
add LCMSolver (Latent Consistency Models)
2024-02-21 16:37:27 +01:00
Pierre Chapuis
4a619e84f0
support disabling CFG in LatentDiffusionModel
2024-02-21 16:37:27 +01:00
Pierre Colle
d199cd4f24
batch sdxl + sd1 + compute_clip_text_embedding
...
Co-authored-by: Cédric Deltheil <355031+deltheil@users.noreply.github.com>
2024-02-21 15:17:11 +01:00
Cédric Deltheil
5ab5d7fd1c
import ControlLoraAdapter as part of latent_diffusion
2024-02-19 14:11:32 +01:00
Laurent
2a3e353f04
enable StyleAligned related docstrings in mkdocstrings
2024-02-15 15:22:47 +01:00
Laurent
efa3988638
implement StyleAlignedAdapter
2024-02-15 15:22:47 +01:00
limiteinductive
432e32f94f
rename Scheduler -> LRScheduler
2024-02-15 11:48:36 +01:00
Laurent
684303230d
export ControlLora and ControlLoraAdapter in refiners.foundationals.latent_diffusion.stable_diffusion_xl
2024-02-15 11:32:49 +01:00
Laurent
41a5ce2052
implement ControlLora and ControlLoraAdapter
2024-02-14 18:20:46 +01:00
Laurent
a54808e757
add context_key getter and setter to RangeAdapter2d
2024-02-14 18:20:46 +01:00
Laurent
35b6e2f7c5
add context_key getter and setter to TimestepEncoder
2024-02-14 18:20:46 +01:00
Laurent
0230971543
simplify is_compatible
in lora.py
2024-02-14 18:20:46 +01:00
Pierre Chapuis
35868ba34b
Move helper to attach several LoRAs from SD to Fluxion
2024-02-14 13:35:46 +01:00
limiteinductive
bec845553f
replace deprecated validator with field_validator
2024-02-13 18:35:51 +01:00
limiteinductive
ab506b4db2
fix bug that was causing double registration
2024-02-13 11:12:13 +01:00
limiteinductive
3488273f50
Enforce correct subtype for the config param in both decorators
...
Also add a custom ModelConfig for the MockTrainer test
Update src/refiners/training_utils/config.py
Co-authored-by: Cédric Deltheil <355031+deltheil@users.noreply.github.com>
2024-02-12 16:21:04 +01:00
limiteinductive
0caa72a082
remove deprecated on_checkpoint_save
2024-02-12 16:21:04 +01:00
limiteinductive
cef8a9936c
refactor register_model decorator
2024-02-12 16:21:04 +01:00
limiteinductive
d6546c9026
add @register_model and @register_callback decorators
...
Refactor ClockTrainer to include Callback
2024-02-12 10:24:19 +01:00
limiteinductive
f541badcb3
Allow optional train ModelConfig + forbid extra input for configs
2024-02-10 16:13:10 +01:00
Pierre Chapuis
402d3105b4
support multiple IP adapter inputs as tensor
2024-02-09 17:16:17 +01:00
Cédric Deltheil
5a7085bb3a
training_utils/config.py: inline type alias
...
Follow up of #227
2024-02-09 14:36:22 +01:00
Pierre Colle
25bfa78907
lr, betas, eps, weight_decay at model level
...
Co-authored-by: Cédric Deltheil <355031+deltheil@users.noreply.github.com>
2024-02-09 12:05:13 +01:00
Colle
f4aa0271b8
less than 1 epoch training duration
2024-02-08 19:20:31 +01:00
limiteinductive
41508e0865
change param name of abstract get_item method
2024-02-08 18:52:52 +01:00
Cédric Deltheil
e36dda63fd
fix miscellaneous typos
2024-02-07 17:51:25 +01:00
Pierre Chapuis
396d166564
make pad method private
2024-02-07 17:47:14 +01:00
Pierre Chapuis
4d85918336
Update src/refiners/foundationals/latent_diffusion/lora.py
...
Co-authored-by: Laureηt <laurent@lagon.tech>
2024-02-07 17:47:14 +01:00
Pierre Chapuis
b1c200c63a
Update src/refiners/foundationals/latent_diffusion/lora.py
...
Co-authored-by: Laureηt <laurent@lagon.tech>
2024-02-07 17:47:14 +01:00
Pierre Chapuis
eb9abefe07
add a few comments in SDLoraManager
2024-02-07 17:47:14 +01:00
Benjamin Trom
bbe0759151
fix docstring
2024-02-07 16:13:01 +01:00
limiteinductive
2e526d35d1
Make Dataset part of the trainer
2024-02-07 16:13:01 +01:00
Laurent
9883f24f9a
(fluxion/layers) remove View layer
...
+ replace existing `View` layers by `Reshape`
2024-02-07 12:06:07 +01:00
limiteinductive
2ef4982e04
remove wandb from base config
2024-02-07 11:06:59 +01:00
Pierre Chapuis
11da76f7df
fix sdxl structural copy
2024-02-07 10:51:26 +01:00