Commit graph

669 commits

Author SHA1 Message Date
Laurent b8fae60d38 make LoRA's weight initialization overridable 2024-03-13 17:32:16 +01:00
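For context, a minimal sketch of what an overridable LoRA weight initialization can look like; the class and hook below are illustrative, not Refiners' actual API:

```python
import torch

class Lora(torch.nn.Module):
    """Toy LoRA pair with an overridable init hook (illustrative, not the library's class)."""

    def __init__(self, in_features: int, out_features: int, rank: int = 16) -> None:
        super().__init__()
        self.rank = rank
        self.down = torch.nn.Parameter(torch.empty(rank, in_features))
        self.up = torch.nn.Parameter(torch.empty(out_features, rank))
        self.reset_parameters()

    def reset_parameters(self) -> None:
        # One common LoRA default: random down projection, zeroed up projection,
        # so the adapter starts as a no-op. Subclasses can override this hook.
        torch.nn.init.normal_(self.down, std=1.0 / self.rank)
        torch.nn.init.zeros_(self.up)
```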
Pierre Chapuis c1b3a52141 set refine.rs home title 2024-03-13 16:34:42 +01:00
Pierre Chapuis e32d8d16f0 LoRA loading: forward exclusions when preprocessing parts of the UNet 2024-03-13 15:25:00 +01:00
limiteinductive ff5341c85c Change weight decay for Optimizer to normal PyTorch default 2024-03-12 15:20:21 +01:00
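For reference, PyTorch's AdamW defaults `weight_decay` to 1e-2, while plain Adam defaults to 0:

```python
import torch

param = torch.nn.Parameter(torch.zeros(1))
adamw = torch.optim.AdamW([param])  # weight_decay defaults to 1e-2
adam = torch.optim.Adam([param])    # weight_decay defaults to 0.0
print(adamw.defaults["weight_decay"], adam.defaults["weight_decay"])  # 0.01 0.0
```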
Laurent 46612a5138 fix stalebot message config 2024-03-11 17:05:14 +01:00
Pierre Chapuis 975560165c improve docstrings 2024-03-08 15:43:57 +01:00
Pierre Chapuis 5d784bedab add test for "Adapting SDXL" guide 2024-03-08 15:43:57 +01:00
Pierre Chapuis cd5fa97c20 ability to get LoRA weights in SDLoraManager 2024-03-08 15:43:57 +01:00
Pierre Chapuis fb90b00e75 add_loras_to_unet: add preprocess values as exclusions in last step 2024-03-08 15:43:57 +01:00
Pierre Chapuis 4259261f17 simplify LCM weights loader using new manager features 2024-03-08 15:43:57 +01:00
Pierre Chapuis ccd9414ff1 fix debug map when attaching two LoRAs
(in that case return the path of the LoraAdapter)
2024-03-08 15:43:57 +01:00
Pierre Chapuis 72fa17df48 fix slider loras test 2024-03-08 15:43:57 +01:00
Pierre Chapuis 8c7fcbc00f LoRA manager: move exclude / include to add_loras call
Always exclude the TimestepEncoder by default.
This is because some keys include both, e.g. `resnet` and `time_emb_proj`.

Preprocess blocks that tend to get mixed up with others in a separate
auto_attach call.
2024-03-08 15:43:57 +01:00
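A self-contained toy of the substring-based exclusion filtering described above (illustrative only, not the library's code); note how a single key can contain both `resnet` and `time_emb_proj`, which is why the TimestepEncoder is excluded by default:

```python
# Toy illustration (not the library's code) of substring-based key exclusion.
def filter_keys(keys: list[str], exclude: list[str]) -> list[str]:
    return [k for k in keys if not any(pattern in k for pattern in exclude)]

keys = [
    "down_blocks.0.resnets.0.conv1",
    "down_blocks.0.resnets.0.time_emb_proj",  # contains both "resnet" and "time_emb_proj"
]
print(filter_keys(keys, exclude=["time_emb_proj"]))
# ['down_blocks.0.resnets.0.conv1']
```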
Pierre Chapuis 052a20b897 remove add_multiple_loras 2024-03-08 15:43:57 +01:00
Pierre Chapuis c383ff6cf4 fix DPO LoRA loading in tests 2024-03-08 15:43:57 +01:00
Pierre Chapuis ed8ec26e63 allow passing inclusions and exclusions to SDLoraManager 2024-03-08 15:43:57 +01:00
Pierre Chapuis cce2a98fa6 add sanity check to auto_attach_loras 2024-03-08 15:43:57 +01:00
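The kind of sanity check this likely adds, as a self-contained toy (illustrative only, not the library's code): after auto-attaching, every LoRA key must have found a target, otherwise fail loudly instead of silently skipping layers:

```python
def attach_all(lora_keys: list[str], target_layers: set[str]) -> list[str]:
    # Toy attach-and-verify pass: "attaching" here is just a membership test.
    attached = [key for key in lora_keys if key in target_layers]
    missing = set(lora_keys) - set(attached)
    if missing:
        raise ValueError(f"unattached LoRA layers: {sorted(missing)}")
    return attached
```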
limiteinductive 5593b40073 fix training 101 guide inconsistencies 2024-03-08 14:24:34 +01:00
Pierre Chapuis 1eb71077aa use same scale setter / getter interface for all controls 2024-03-08 11:29:28 +01:00
Laurent 5e7986ef08 adding more log messages in training_utils 2024-03-08 10:52:14 +01:00
Pierre Chapuis be2368cf20 ruff 3 formatting (Rye 0.28) 2024-03-08 10:42:05 +01:00
Pierre Chapuis a0be5458b9 snip long prompt in tests 2024-03-05 19:54:44 +01:00
Pierre Chapuis defbb9eb3a update deps and use ruff in Rye to format 2024-03-05 19:40:52 +01:00
Laurent d26ec690e8 (CI) add bounty stale bot 2024-03-05 14:13:59 +01:00
Pierre Chapuis 98e2ab94c9 fix CI (again)
https://github.com/eifinger/setup-rye/releases/tag/v2.0.0
2024-03-04 11:28:40 +01:00
Pierre Chapuis 83ff8f7007 fix CI
https://github.com/eifinger/setup-rye/releases/tag/v1.16.1
2024-03-02 14:07:13 +01:00
Cédric Deltheil 7792cffea5 bump library version to v0.4.0 2024-02-26 14:54:53 +01:00
Benjamin Trom f1079fe8f1 write Training 101 guide 2024-02-26 14:44:02 +01:00
Pierre Chapuis 91d1b46aa9 Add a note that we mitigate non-zero SNR in DDIM. 2024-02-26 12:14:02 +01:00
Pierre Chapuis 87f798778f update README 2024-02-26 12:14:02 +01:00
Pierre Chapuis d5d199edc5 add tests for SDXL Lightning 2024-02-26 12:14:02 +01:00
Pierre Chapuis 7d8e3fc1db add SDXL-Lightning weights to conversion script + support safetensors 2024-02-26 12:14:02 +01:00
Pierre Chapuis 7f51d18045 clarify that add_lcm_lora can load SDXL-Lightning 2024-02-26 12:14:02 +01:00
Pierre Chapuis 7e4e0f0650 correctly scale init latents for Euler scheduler 2024-02-26 12:14:02 +01:00
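Conceptually, Euler-style solvers expect the initial latents to be scaled by the schedule's first (largest) sigma before the first step; a hedged sketch with illustrative values, not the library's code:

```python
import torch

sigmas = torch.tensor([14.6146, 7.0, 2.0, 0.5, 0.0])  # illustrative sigma schedule
noise = torch.randn(1, 4, 128, 128)
init_latents = noise * sigmas[0]  # scale x_T by the largest sigma, the fix in spirit
```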
Pierre Chapuis bf0ba58541 refactor solver params, add sample prediction type 2024-02-26 12:14:02 +01:00
Pierre Chapuis ddc1cf8ca7 refactor solvers to support different timestep spacings 2024-02-26 12:14:02 +01:00
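The usual spacing variants such a refactor has to support, shown for 4 steps over a 1000-step training schedule (illustrative computation, not the library's code):

```python
import torch

num_train, steps = 1000, 4
linspace = torch.linspace(0, num_train - 1, steps).round()        # tensor([  0., 333., 666., 999.])
leading = torch.arange(steps) * (num_train // steps)              # tensor([  0, 250, 500, 750])
trailing = torch.arange(num_train, 0, -(num_train // steps)) - 1  # tensor([999, 749, 499, 249])
```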
Pierre Chapuis d14c5bd5f8 add option to override unet weights for conversion 2024-02-26 12:14:02 +01:00
Pierre Chapuis 8f614e7647 check hash of downloaded LoRA weights, update DPO refs
(the DPO LoRA weights have changed: 2699b36e22)
2024-02-23 12:02:18 +01:00
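A minimal sketch of such a check using hashlib; the function name and the choice of SHA-256 are assumptions:

```python
import hashlib
from pathlib import Path

def check_weights_hash(path: Path, expected_prefix: str) -> None:
    # Refuse to run tests against weights whose content drifted upstream.
    digest = hashlib.sha256(path.read_bytes()).hexdigest()
    if not digest.startswith(expected_prefix):
        raise RuntimeError(f"unexpected hash for {path}: {digest}")
```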
Laurent 28f9368c93 (weight conversion) fix typo in ControlLora export folder 2024-02-22 17:59:36 +01:00
Cédric Deltheil 176807740b control_lora: fix adapter set scale
Setting the adapter scale did not propagate it to the underlying
zero convolutions; the value set at construction time was used instead.

Follow-up of #285
2024-02-22 10:01:05 +01:00
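In spirit, the fix makes the scale setter forward the value to the child zero convolutions instead of leaving them at their construction-time value; a self-contained toy, not the library's code:

```python
import torch

class ZeroConv(torch.nn.Conv2d):
    def __init__(self, channels: int) -> None:
        super().__init__(channels, channels, kernel_size=1)
        self.scale = 1.0  # construction-time value, stale until the parent updates it

class ControlAdapter(torch.nn.Module):
    def __init__(self, channels: int, scale: float = 1.0) -> None:
        super().__init__()
        self.zero_convs = torch.nn.ModuleList([ZeroConv(channels) for _ in range(3)])
        self.scale = scale  # goes through the setter below

    @property
    def scale(self) -> float:
        return self._scale

    @scale.setter
    def scale(self, value: float) -> None:
        self._scale = value
        for conv in self.zero_convs:  # the fix, in spirit: propagate to children
            conv.scale = value
```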
Pierre Chapuis 83960bdbb8 Add LCM and LCM-LoRA to README 2024-02-21 16:37:27 +01:00
Pierre Chapuis 03b79d6d34 rename ResidualBlock to ConditionScaleBlock in LCM 2024-02-21 16:37:27 +01:00
Pierre Chapuis 5f21922925 update disk space estimation in CONTRIBUTING.md 2024-02-21 16:37:27 +01:00
Pierre Chapuis 684e2b9a47 add docstrings for LCM / LCM-LoRA 2024-02-21 16:37:27 +01:00
Pierre Chapuis 383c3c8a04 add tests for LCM and LCM-LoRA
(As of now, LoRA with guidance > 1, and especially the base model, do not pass with those tolerances.)
2024-02-21 16:37:27 +01:00
Pierre Chapuis b55e9332fe add LCM and LCM-LoRA to tests weights conversion script 2024-02-21 16:37:27 +01:00
Pierre Chapuis 12b6829a26 add support for LCM LoRA weights loading 2024-02-21 16:37:27 +01:00
Pierre Chapuis fafe5f8f5a Improve filtering when auto-attaching LoRAs.
Also support debug output to help diagnose bad mappings.
2024-02-21 16:37:27 +01:00
Pierre Chapuis f8d55ccb20 add LcmAdapter
This adds support for the condition scale embedding.
Also updates the UNet converter to support LCM.
2024-02-21 16:37:27 +01:00
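The condition scale embedding encodes the guidance scale w with sinusoidal features, much like a timestep embedding; a hedged sketch where the dimension and frequency constants are illustrative:

```python
import math
import torch

def condition_scale_embedding(w: torch.Tensor, dim: int = 256) -> torch.Tensor:
    # Sinusoidal embedding of the guidance scale, LCM-style (illustrative sketch).
    half = dim // 2
    freqs = torch.exp(-math.log(10_000.0) * torch.arange(half) / half)
    args = w[:, None] * freqs[None, :]
    return torch.cat([torch.cos(args), torch.sin(args)], dim=-1)  # shape (len(w), dim)
```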
Pierre Chapuis c8c6294550 add LCMSolver (Latent Consistency Models) 2024-02-21 16:37:27 +01:00
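Conceptually, each LCM step predicts the denoised sample directly, then re-injects fresh noise at the next point of the schedule; a toy step, not the library's implementation:

```python
import torch

def lcm_step(denoised: torch.Tensor, alpha_next: float, sigma_next: float) -> torch.Tensor:
    # Multi-step LCM sampling, conceptually: re-noise the clean prediction
    # toward the next timestep; on the final step, return it as-is.
    if sigma_next == 0.0:
        return denoised
    return alpha_next * denoised + sigma_next * torch.randn_like(denoised)
```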