Commit graph

321 commits

Author SHA1 Message Date
Laurent 17246708b9 Add sample_noise staticmethod and modify add_noise to support batched steps 2024-04-18 12:55:49 +02:00
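Note: the batched add_noise change above follows the standard DDPM forward process; a minimal generic sketch with one step per sample (names and shapes are illustrative, not the refiners signature):

    import torch

    def add_noise(x: torch.Tensor, noise: torch.Tensor, steps: torch.Tensor, alphas_cumprod: torch.Tensor) -> torch.Tensor:
        # index the cumulative alphas with one timestep per sample, then broadcast
        a = alphas_cumprod[steps].view(-1, 1, 1, 1)
        # x_t = sqrt(alpha_bar_t) * x_0 + sqrt(1 - alpha_bar_t) * noise
        return a.sqrt() * x + (1.0 - a).sqrt() * noise
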
limiteinductive 7427c171f6 fix training_utils requirements check 2024-04-17 18:10:28 +02:00
Pierre Colle bf7852b88e SAM: support gray images in image_to_scaled_tensor 2024-04-16 18:45:17 +02:00
Laurent eb4bb34f8b (training_utils) add new ForceCommit callback 2024-04-16 14:43:10 +02:00
limiteinductive be7d065a33 Add DataloaderConfig to Trainer 2024-04-15 20:56:19 +02:00
limiteinductive b9b999ccfe turn scoped_seed into a context manager 2024-04-13 15:03:35 +02:00
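Note: a minimal sketch of a seed-scoping context manager as described in the commit above (the actual refiners implementation may differ):

    import random
    from contextlib import contextmanager

    import torch

    @contextmanager
    def scoped_seed(seed: int):
        # save RNG states, seed deterministically, restore on exit
        py_state = random.getstate()
        torch_state = torch.get_rng_state()
        random.seed(seed)
        torch.manual_seed(seed)
        try:
            yield
        finally:
            random.setstate(py_state)
            torch.set_rng_state(torch_state)

    with scoped_seed(42):
        noise = torch.randn(1, 4)  # reproducible; outer RNG state is untouched
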
Pierre Colle 64692c3b5b TrainerClock: assert dataset_length >= batch_size 2024-04-12 15:05:52 +02:00
Pierre Colle 0ac290f67d SAM: expose sizing helpers 2024-04-12 08:56:23 +02:00
Laurent 06ff2f0a5f add support for dinov2 giant flavors 2024-04-11 14:48:33 +02:00
Laurent 04e59bf3d9 fix GLU Activation docstrings 2024-04-11 14:48:33 +02:00
limiteinductive f26b6ee00a add static typing to __call__ method for latent_diffusion models; fix multi_diffusion bug that wasn't taking guidance_scale into account 2024-04-11 12:13:30 +02:00
Cédric Deltheil a2ee705783 hq sam: add constructor args to docstring
Additionally, mark `register_adapter_module` for internal use.
2024-04-08 11:46:37 +02:00
Pierre Colle d05ebb8dd3 SAM/HQSAMAdapter: docstring examples 2024-04-08 07:12:57 +02:00
hugojarkoff bbb46e3fc7 Fix clock step inconsistencies on batch end 2024-04-05 15:52:43 +02:00
Pierre Chapuis 09af570b23 add DINOv2-FD metric 2024-04-03 16:45:00 +02:00
Laurent 5f07fa9c21 fix dinov2 interpolation, support batching 2024-04-02 18:57:25 +02:00
Pierre Chapuis fd5a15c7e0 update pyright and fix Pillow 10.3 typing issues 2024-04-02 18:15:52 +02:00
Laurent 4f94dfb494 implement dinov2 positional embedding interpolation 2024-04-02 10:02:43 +02:00
Laurent 0336bc78b5 simplify interpolate function and layer 2024-04-02 10:02:43 +02:00
Pierre Colle 6c37e3f933 hq-sam: weights/load_weights 2024-03-29 11:25:43 +01:00
Pierre Chapuis 404a15aad2 tweak auto_attach_loras so debugging is easier when it fails 2024-03-26 16:12:48 +01:00
Laurent a0715806d2 modify ip_adapter's ImageCrossAttention scale getter and setter
this new version makes it robust in case multiple Multiply-s are inside the Chain (e.g. if the Linear layers are LoRA-ified)
2024-03-26 11:15:04 +01:00
Laurent 7e64ba4011 modify ip_adapter's CrossAttentionAdapters injection logic 2024-03-26 11:15:04 +01:00
Cédric Deltheil df0cc2aeb8 do not call __getattr__ with keyword argument
Same for __setattr__. Use positional arguments instead. E.g.:

    import torch
    import refiners.fluxion.layers as fl
    m = torch.compile(fl.Linear(1,1))
    m(torch.zeros(1))
    # TypeError: Module.__getattr__() got an unexpected keyword argument 'name'
2024-03-25 21:46:13 +01:00
Pierre Colle cba83b0558 SAM init with mask_decoder after #325 2024-03-24 20:18:57 +01:00
Pierre Chapuis 364e196874 support no CFG in compute_clip_text_embedding 2024-03-22 17:06:51 +01:00
Pierre Colle 94e8b9c23f SAM MaskDecoder token slicing 2024-03-22 13:11:40 +01:00
hugojarkoff a93ceff752 Add HQ-SAM Adapter 2024-03-21 15:36:55 +01:00
limiteinductive 38c86f59f4 Switch gradient clipping to native torch.nn.utils.clip_grad_norm_ 2024-03-19 22:08:48 +01:00
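Note: this switches to the built-in PyTorch utility; a minimal usage sketch (the model and max_norm value are illustrative):

    import torch

    model = torch.nn.Linear(8, 8)
    loss = model(torch.randn(2, 8)).sum()
    loss.backward()
    # clips gradients in place so their global L2 norm is at most 1.0,
    # and returns the total norm measured before clipping
    total_norm = torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)
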
Pierre Colle 68fe725767 Add multimask_output flag to SAM 2024-03-19 17:40:26 +01:00
limiteinductive 6a72943ff7 change TimeValue to a dataclass 2024-03-19 14:49:24 +01:00
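Note: a hedged sketch of what a TimeValue dataclass could look like; the field and unit names here are assumptions, not the refiners definitions:

    from dataclasses import dataclass
    from enum import Enum, auto

    class TimeUnit(Enum):  # hypothetical unit enum, for illustration only
        STEP = auto()
        EPOCH = auto()

    @dataclass(frozen=True)
    class TimeValue:
        number: int
        unit: TimeUnit

    warmup = TimeValue(number=500, unit=TimeUnit.STEP)
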
Laurent b8fae60d38 make LoRA's weight initialization overridable 2024-03-13 17:32:16 +01:00
Pierre Chapuis e32d8d16f0 LoRA loading: forward exclusions when preprocessing parts of the UNet 2024-03-13 15:25:00 +01:00
limiteinductive ff5341c85c Change weight decay for Optimizer to normal PyTorch default 2024-03-12 15:20:21 +01:00
Pierre Chapuis 975560165c improve docstrings 2024-03-08 15:43:57 +01:00
Pierre Chapuis cd5fa97c20 ability to get LoRA weights in SDLoraManager 2024-03-08 15:43:57 +01:00
Pierre Chapuis fb90b00e75 add_loras_to_unet: add preprocess values as exclusions in last step 2024-03-08 15:43:57 +01:00
Pierre Chapuis 4259261f17 simplify LCM weights loader using new manager features 2024-03-08 15:43:57 +01:00
Pierre Chapuis ccd9414ff1 fix debug map when attaching two LoRAs
(in that case return the path of the LoraAdapter)
2024-03-08 15:43:57 +01:00
Pierre Chapuis 8c7fcbc00f LoRA manager: move exclude / include to add_loras call
Always exclude the TimestepEncoder by default.
This is because some keys include both, e.g. `resnet` and `time_emb_proj`.

Preprocess blocks that tend to mix up with others in a separate
auto_attach call.
2024-03-08 15:43:57 +01:00
Pierre Chapuis 052a20b897 remove add_multiple_loras 2024-03-08 15:43:57 +01:00
Pierre Chapuis ed8ec26e63 allow passing inclusions and exclusions to SDLoraManager 2024-03-08 15:43:57 +01:00
Pierre Chapuis cce2a98fa6 add sanity check to auto_attach_loras 2024-03-08 15:43:57 +01:00
Pierre Chapuis 1eb71077aa use same scale setter / getter interface for all controls 2024-03-08 11:29:28 +01:00
Laurent 5e7986ef08 adding more log messages in training_utils 2024-03-08 10:52:14 +01:00
Pierre Chapuis be2368cf20 ruff 3 formatting (Rye 0.28) 2024-03-08 10:42:05 +01:00
Pierre Chapuis 91d1b46aa9 Add a note that we mitigate non-zero SNR in DDIM. 2024-02-26 12:14:02 +01:00
Pierre Chapuis 7f51d18045 clarify that add_lcm_lora can load SDXL-Lightning 2024-02-26 12:14:02 +01:00
Pierre Chapuis 7e4e0f0650 correctly scale init latents for Euler scheduler 2024-02-26 12:14:02 +01:00
Pierre Chapuis bf0ba58541 refactor solver params, add sample prediction type 2024-02-26 12:14:02 +01:00
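Note on "sample prediction type" in the commit above: a diffusion solver can be parameterized to predict either the noise (epsilon) or the denoised sample (x0) directly; the two are related by the standard DDPM identity, sketched here (illustrative, not the refiners API):

    import torch

    def epsilon_to_sample(x_t: torch.Tensor, eps: torch.Tensor, alpha_bar_t: torch.Tensor) -> torch.Tensor:
        # x_t = sqrt(alpha_bar_t) * x0 + sqrt(1 - alpha_bar_t) * eps  =>  solve for x0
        return (x_t - (1.0 - alpha_bar_t).sqrt() * eps) / alpha_bar_t.sqrt()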