Commit graph

191 commits

Author SHA1 Message Date
hugojarkoff a93ceff752 Add HQ-SAM Adapter 2024-03-21 15:36:55 +01:00
hugojarkoff c6b5eb24a1 Add logits comparison for base SAM in single mask output prediction mode 2024-03-21 10:48:48 +01:00
limiteinductive 38c86f59f4 Switch gradient clipping to native torch torch.nn.utils.clip_grad_norm_ 2024-03-19 22:08:48 +01:00
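The native utility adopted in this commit is `torch.nn.utils.clip_grad_norm_`. A minimal pure-Python sketch of the total-norm clipping it performs (the function name and the toy gradients are illustrative, not Refiners code):

```python
import math

def clip_grad_norm(grads: list[list[float]], max_norm: float) -> float:
    """Scale all gradients in place so their global L2 norm is at most max_norm.

    Mirrors the behavior of torch.nn.utils.clip_grad_norm_: compute the norm
    over all parameters jointly, then rescale every gradient by the same factor.
    """
    total_norm = math.sqrt(sum(g * g for grad in grads for g in grad))
    if total_norm > max_norm:
        scale = max_norm / (total_norm + 1e-6)  # small eps, as in the torch utility
        for grad in grads:
            for i in range(len(grad)):
                grad[i] *= scale
    return total_norm

grads = [[3.0, 4.0], [0.0, 0.0]]  # global L2 norm = 5.0
norm = clip_grad_norm(grads, max_norm=1.0)
```

After the call, each gradient is scaled by the same factor, preserving gradient direction while bounding the global norm.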
Pierre Colle 68fe725767 Add multimask_output flag to SAM 2024-03-19 17:40:26 +01:00
limiteinductive 6a72943ff7 change TimeValue to a dataclass 2024-03-19 14:49:24 +01:00
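A dataclass in the spirit of this change might look like the following; the field names (`number`, `unit`) and the unit values are assumptions for illustration, not necessarily the exact Refiners definition:

```python
from dataclasses import dataclass
from enum import Enum

class TimeUnit(Enum):
    STEP = "step"
    EPOCH = "epoch"
    ITERATION = "iteration"

@dataclass(frozen=True)
class TimeValue:
    """A duration expressed as a count of some training-time unit."""
    number: int
    unit: TimeUnit

duration = TimeValue(number=10, unit=TimeUnit.EPOCH)
```

A frozen dataclass gives value semantics for free: equality by fields, hashability, and a generated `__repr__`, which a hand-rolled class would have to implement.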
Pierre Chapuis 5d784bedab add test for "Adapting SDXL" guide 2024-03-08 15:43:57 +01:00
Pierre Chapuis 72fa17df48 fix slider loras test 2024-03-08 15:43:57 +01:00
Pierre Chapuis 8c7fcbc00f LoRA manager: move exclude / include to add_loras call
Always exclude the TimestepEncoder by default.
This is because some keys include both e.g. `resnet` and `time_emb_proj`.

Preprocess blocks that tend to mix up with others in a separate
auto_attach call.
2024-03-08 15:43:57 +01:00
Pierre Chapuis 052a20b897 remove add_multiple_loras 2024-03-08 15:43:57 +01:00
Pierre Chapuis c383ff6cf4 fix DPO LoRA loading in tests 2024-03-08 15:43:57 +01:00
Pierre Chapuis 1eb71077aa use same scale setter / getter interface for all controls 2024-03-08 11:29:28 +01:00
Pierre Chapuis be2368cf20 ruff 3 formatting (Rye 0.28) 2024-03-08 10:42:05 +01:00
Pierre Chapuis a0be5458b9 snip long prompt in tests 2024-03-05 19:54:44 +01:00
Pierre Chapuis d5d199edc5 add tests for SDXL Lightning 2024-02-26 12:14:02 +01:00
Pierre Chapuis 7e4e0f0650 correctly scale init latents for Euler scheduler 2024-02-26 12:14:02 +01:00
Pierre Chapuis bf0ba58541 refactor solver params, add sample prediction type 2024-02-26 12:14:02 +01:00
Pierre Chapuis ddc1cf8ca7 refactor solvers to support different timestep spacings 2024-02-26 12:14:02 +01:00
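The timestep spacings referred to here (e.g. "linspace", "leading", "trailing", as popularized by Diffusers schedulers) can be sketched in plain Python. This is an illustrative sketch of the spacing math only, not the Refiners solver code:

```python
def spaced_timesteps(num_train_steps: int, num_inference_steps: int, spacing: str) -> list[int]:
    """Return descending training timesteps for a given inference step count."""
    if spacing == "linspace":
        # evenly spaced from 0 to T-1 inclusive, rounded to integers
        step = (num_train_steps - 1) / (num_inference_steps - 1)
        return [round(i * step) for i in range(num_inference_steps)][::-1]
    if spacing == "leading":
        # integer stride starting at 0; the tail of the schedule is unused
        ratio = num_train_steps // num_inference_steps
        return [i * ratio for i in range(num_inference_steps)][::-1]
    if spacing == "trailing":
        # stride back from T-1; the head of the schedule is unused
        ratio = num_train_steps / num_inference_steps
        return [round(num_train_steps - 1 - i * ratio) for i in range(num_inference_steps)]
    raise ValueError(f"unknown spacing: {spacing}")
```

For 1000 training steps and 10 inference steps, "trailing" yields 999, 899, …, 99 while "leading" yields 900, 800, …, 0 — the choice of which end of the schedule to sample matters for few-step inference quality.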
Pierre Chapuis 8f614e7647 check hash of downloaded LoRA weights, update DPO refs
(the DPO LoRA weights have changed: 2699b36e22)
2024-02-23 12:02:18 +01:00
Cédric Deltheil 176807740b control_lora: fix adapter set scale
The adapter set scale did not propagate the scale to the underlying
zero convolutions. The value set at construction time was used instead.

Follow up of #285
2024-02-22 10:01:05 +01:00
Pierre Chapuis 684e2b9a47 add docstrings for LCM / LCM-LoRA 2024-02-21 16:37:27 +01:00
Pierre Chapuis 383c3c8a04 add tests for LCM and LCM-LoRA
(As of now, LoRA with guidance > 1, and especially base, do not pass with those tolerances.)
2024-02-21 16:37:27 +01:00
Pierre Chapuis c8c6294550 add LCMSolver (Latent Consistency Models) 2024-02-21 16:37:27 +01:00
Cédric Deltheil 446967859d test_style_aligned: switch to CLIP text batch API
Added in #263
2024-02-21 16:33:03 +01:00
Pierre Colle d199cd4f24 batch sdxl + sd1 + compute_clip_text_embedding
Co-authored-by: Cédric Deltheil <355031+deltheil@users.noreply.github.com>
2024-02-21 15:17:11 +01:00
Cédric Deltheil 5ab5d7fd1c import ControlLoraAdapter part of latent_diffusion 2024-02-19 14:11:32 +01:00
Laurent da3c3602fb write StyleAligned e2e test 2024-02-15 15:22:47 +01:00
Laurent 60c0780fe7 write StyleAligned inject/eject tests 2024-02-15 15:22:47 +01:00
limiteinductive 432e32f94f rename Scheduler -> LRScheduler 2024-02-15 11:48:36 +01:00
Laurent 00270604ef fix conversion_script bug + rename control_lora e2e test 2024-02-14 18:20:46 +01:00
Laurent 7fe392298a write ControlLora e2e tests 2024-02-14 18:20:46 +01:00
limiteinductive bec845553f update deprecated validator for field_validator 2024-02-13 18:35:51 +01:00
limiteinductive ab506b4db2 fix bug that was causing double registration 2024-02-13 11:12:13 +01:00
limiteinductive 3488273f50 Enforce correct subtype for the config param in both decorators
Also add a custom ModelConfig for the MockTrainer test

Update src/refiners/training_utils/config.py

Co-authored-by: Cédric Deltheil <355031+deltheil@users.noreply.github.com>
2024-02-12 16:21:04 +01:00
limiteinductive cef8a9936c refactor register_model decorator 2024-02-12 16:21:04 +01:00
limiteinductive d6546c9026 add @register_model and @register_callback decorators
Refactor ClockTrainer to include Callback
2024-02-12 10:24:19 +01:00
limiteinductive f541badcb3 Allow optional train ModelConfig + forbid extra input for configs 2024-02-10 16:13:10 +01:00
Pierre Colle 25bfa78907 lr, betas, eps, weight_decay at model level
Co-authored-by: Cédric Deltheil <355031+deltheil@users.noreply.github.com>
2024-02-09 12:05:13 +01:00
Cédric Deltheil 9aefc9896c test_trainer: use model_copy instead of copy
The `copy` method has been deprecated.
2024-02-08 20:07:34 +01:00
Colle f4aa0271b8 less than 1 epoch training duration 2024-02-08 19:20:31 +01:00
limiteinductive 41508e0865 change param name of abstract get_item method 2024-02-08 18:52:52 +01:00
Laurent 6d599d53fd beautify EXPECTED_TREE in test_chain.py 2024-02-08 15:09:47 +01:00
limiteinductive 2e526d35d1 Make Dataset part of the trainer 2024-02-07 16:13:01 +01:00
limiteinductive 2ef4982e04 remove wandb from base config 2024-02-07 11:06:59 +01:00
Pierre Chapuis 11da76f7df fix sdxl structural copy 2024-02-07 10:51:26 +01:00
Pierre Chapuis ca9e89b22a cosmetics 2024-02-07 10:51:26 +01:00
limiteinductive ea05f3d327 make device and dtype work in Trainer class 2024-02-06 23:10:10 +01:00
Pierre Chapuis 37425fb609 make LoRA generic 2024-02-06 11:32:18 +01:00
Pierre Chapuis 471ef91d1c make __getattr__ on Module return object, not Any
PyTorch chose to make it Any because they expect its users' code
to be "highly dynamic": https://github.com/pytorch/pytorch/pull/104321

That is not the case for us: in Refiners, untyped code
goes against one of our core principles.

Note that there is currently an open PR in PyTorch to
return `Module | Tensor`, but in practice this is not always
correct either: https://github.com/pytorch/pytorch/pull/115074

I also moved Residuals-related code from SD1 to latent_diffusion
because SDXL should not depend on SD1.
2024-02-06 11:32:18 +01:00
Pierre Chapuis 3de1508b65 increase tolerance on Euler test 2024-02-04 08:58:22 +01:00
Pierre Chapuis 83b478c0ff fix test failure caused by Diffusers 0.26.0 2024-02-04 08:58:22 +01:00