Laurent
7e64ba4011
modify ip_adapter's CrossAttentionAdapters injection logic
2024-03-26 11:15:04 +01:00
Cédric Deltheil
df0cc2aeb8
do not call __getattr__ with keyword argument
...
Same for __setattr__. Use positional arguments instead. E.g.:
import torch
import refiners.fluxion.layers as fl
m = torch.compile(fl.Linear(1,1))
m(torch.zeros(1))
# TypeError: Module.__getattr__() got an unexpected keyword argument 'name'
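The same class of failure can be reproduced without torch.compile; a minimal illustration (not the Refiners code) of why positional dunder calls are robust while keyword calls are not:

```python
class Plain:
    def __getattr__(self, name: str) -> str:  # parameter named "name"
        return name.upper()

class Wrapped:
    def __getattr__(self, item: str) -> str:  # a wrapper may pick another name
        return item.upper()

# positional calls work regardless of how the parameter is named
for obj in (Plain(), Wrapped()):
    assert type(obj).__getattr__(obj, "x") == "X"

# a keyword call breaks as soon as the parameter is not named "name",
# the same error class as in the torch.compile repro above
try:
    type(Wrapped()).__getattr__(Wrapped(), name="x")
    raise AssertionError("expected a TypeError")
except TypeError:
    pass
```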
2024-03-25 21:46:13 +01:00
Pierre Colle
cba83b0558
SAM init with mask_decoder after #325
2024-03-24 20:18:57 +01:00
Pierre Chapuis
364e196874
support no CFG in compute_clip_text_embedding
2024-03-22 17:06:51 +01:00
Pierre Colle
94e8b9c23f
SAM MaskDecoder token slicing
2024-03-22 13:11:40 +01:00
hugojarkoff
a93ceff752
Add HQ-SAM Adapter
2024-03-21 15:36:55 +01:00
limiteinductive
38c86f59f4
Switch gradient clipping to native torch.nn.utils.clip_grad_norm_
2024-03-19 22:08:48 +01:00
Pierre Colle
68fe725767
Add multimask_output flag to SAM
2024-03-19 17:40:26 +01:00
limiteinductive
6a72943ff7
change TimeValue to a dataclass
2024-03-19 14:49:24 +01:00
Laurent
b8fae60d38
make LoRA's weight initialization overridable
2024-03-13 17:32:16 +01:00
Pierre Chapuis
e32d8d16f0
LoRA loading: forward exclusions when preprocessing parts of the UNet
2024-03-13 15:25:00 +01:00
limiteinductive
ff5341c85c
Change weight decay for Optimizer to normal PyTorch default
2024-03-12 15:20:21 +01:00
Pierre Chapuis
975560165c
improve docstrings
2024-03-08 15:43:57 +01:00
Pierre Chapuis
cd5fa97c20
ability to get LoRA weights in SDLoraManager
2024-03-08 15:43:57 +01:00
Pierre Chapuis
fb90b00e75
add_loras_to_unet: add preprocess values as exclusions in last step
2024-03-08 15:43:57 +01:00
Pierre Chapuis
4259261f17
simplify LCM weights loader using new manager features
2024-03-08 15:43:57 +01:00
Pierre Chapuis
ccd9414ff1
fix debug map when attaching two LoRAs
...
(in that case return the path of the LoraAdapter)
2024-03-08 15:43:57 +01:00
Pierre Chapuis
8c7fcbc00f
LoRA manager: move exclude / include to add_loras call
...
Always exclude the TimestepEncoder by default.
This is because some keys include both e.g. `resnet` and `time_emb_proj`.
Preprocess blocks that tend to mix up with others in a separate
auto_attach call.
2024-03-08 15:43:57 +01:00
Pierre Chapuis
052a20b897
remove add_multiple_loras
2024-03-08 15:43:57 +01:00
Pierre Chapuis
ed8ec26e63
allow passing inclusions and exclusions to SDLoraManager
2024-03-08 15:43:57 +01:00
Pierre Chapuis
cce2a98fa6
add sanity check to auto_attach_loras
2024-03-08 15:43:57 +01:00
Pierre Chapuis
1eb71077aa
use same scale setter / getter interface for all controls
2024-03-08 11:29:28 +01:00
Laurent
5e7986ef08
adding more log messages in training_utils
2024-03-08 10:52:14 +01:00
Pierre Chapuis
be2368cf20
ruff 3 formatting (Rye 0.28)
2024-03-08 10:42:05 +01:00
Pierre Chapuis
91d1b46aa9
Add a note that we mitigate non-zero SNR in DDIM.
2024-02-26 12:14:02 +01:00
Pierre Chapuis
7f51d18045
clarify that add_lcm_lora can load SDXL-Lightning
2024-02-26 12:14:02 +01:00
Pierre Chapuis
7e4e0f0650
correctly scale init latents for Euler scheduler
2024-02-26 12:14:02 +01:00
Pierre Chapuis
bf0ba58541
refactor solver params, add sample prediction type
2024-02-26 12:14:02 +01:00
Pierre Chapuis
ddc1cf8ca7
refactor solvers to support different timestep spacings
2024-02-26 12:14:02 +01:00
Cédric Deltheil
176807740b
control_lora: fix adapter set scale
...
The adapter set scale did not propagate the scale to the underlying
zero convolutions. The value set at CTOR time was used instead.
Follow up of #285
2024-02-22 10:01:05 +01:00
Pierre Chapuis
03b79d6d34
rename ResidualBlock to ConditionScaleBlock in LCM
2024-02-21 16:37:27 +01:00
Pierre Chapuis
684e2b9a47
add docstrings for LCM / LCM-LoRA
2024-02-21 16:37:27 +01:00
Pierre Chapuis
12b6829a26
add support for LCM LoRA weights loading
2024-02-21 16:37:27 +01:00
Pierre Chapuis
fafe5f8f5a
Improve filtering when auto-attaching LoRAs.
...
Also support debug output to help diagnose bad mappings.
2024-02-21 16:37:27 +01:00
Pierre Chapuis
f8d55ccb20
add LcmAdapter
...
This adds support for the condition scale embedding.
Also updates the UNet converter to support LCM.
2024-02-21 16:37:27 +01:00
Pierre Chapuis
c8c6294550
add LCMSolver (Latent Consistency Models)
2024-02-21 16:37:27 +01:00
Pierre Chapuis
4a619e84f0
support disabling CFG in LatentDiffusionModel
2024-02-21 16:37:27 +01:00
Pierre Colle
d199cd4f24
batch sdxl + sd1 + compute_clip_text_embedding
...
Co-authored-by: Cédric Deltheil <355031+deltheil@users.noreply.github.com>
2024-02-21 15:17:11 +01:00
Cédric Deltheil
5ab5d7fd1c
import ControlLoraAdapter part of latent_diffusion
2024-02-19 14:11:32 +01:00
Laurent
2a3e353f04
enable StyleAligned related docstrings in mkdocstrings
2024-02-15 15:22:47 +01:00
Laurent
efa3988638
implement StyleAlignedAdapter
2024-02-15 15:22:47 +01:00
limiteinductive
432e32f94f
rename Scheduler -> LRScheduler
2024-02-15 11:48:36 +01:00
Laurent
684303230d
export ControlLora and ControlLoraAdapter in refiners.foundationals.latent_diffusion.stable_diffusion_xl
2024-02-15 11:32:49 +01:00
Laurent
41a5ce2052
implement ControlLora and ControlLoraAdapter
2024-02-14 18:20:46 +01:00
Laurent
a54808e757
add context_key getter and setter to RangeAdapter2d
2024-02-14 18:20:46 +01:00
Laurent
35b6e2f7c5
add context_key getter and setter to TimestepEncoder
2024-02-14 18:20:46 +01:00
Laurent
0230971543
simplify is_compatible in lora.py
2024-02-14 18:20:46 +01:00
Pierre Chapuis
35868ba34b
Move helper to attach several LoRAs from SD to Fluxion
2024-02-14 13:35:46 +01:00
limiteinductive
bec845553f
update deprecated validator for field_validator
2024-02-13 18:35:51 +01:00
limiteinductive
ab506b4db2
fix bug that was causing double registration
2024-02-13 11:12:13 +01:00
limiteinductive
3488273f50
Enforce correct subtype for the config param in both decorators
...
Also add a custom ModelConfig for the MockTrainer test
Update src/refiners/training_utils/config.py
Co-authored-by: Cédric Deltheil <355031+deltheil@users.noreply.github.com>
2024-02-12 16:21:04 +01:00
limiteinductive
0caa72a082
remove deprecated on_checkpoint_save
2024-02-12 16:21:04 +01:00
limiteinductive
cef8a9936c
refactor register_model decorator
2024-02-12 16:21:04 +01:00
limiteinductive
d6546c9026
add @register_model and @register_callback decorators
...
Refactor ClockTrainer to include Callback
2024-02-12 10:24:19 +01:00
limiteinductive
f541badcb3
Allow optional train ModelConfig + forbid extra input for configs
2024-02-10 16:13:10 +01:00
Pierre Chapuis
402d3105b4
support multiple IP adapter inputs as tensor
2024-02-09 17:16:17 +01:00
Cédric Deltheil
5a7085bb3a
training_utils/config.py: inline type alias
...
Follow up of #227
2024-02-09 14:36:22 +01:00
Pierre Colle
25bfa78907
lr, betas, eps, weight_decay at model level
...
Co-authored-by: Cédric Deltheil <355031+deltheil@users.noreply.github.com>
2024-02-09 12:05:13 +01:00
Colle
f4aa0271b8
less than 1 epoch training duration
2024-02-08 19:20:31 +01:00
limiteinductive
41508e0865
change param name of abstract get_item method
2024-02-08 18:52:52 +01:00
Cédric Deltheil
e36dda63fd
fix miscellaneous typos
2024-02-07 17:51:25 +01:00
Pierre Chapuis
396d166564
make pad method private
2024-02-07 17:47:14 +01:00
Pierre Chapuis
4d85918336
Update src/refiners/foundationals/latent_diffusion/lora.py
...
Co-authored-by: Laureηt <laurent@lagon.tech>
2024-02-07 17:47:14 +01:00
Pierre Chapuis
b1c200c63a
Update src/refiners/foundationals/latent_diffusion/lora.py
...
Co-authored-by: Laureηt <laurent@lagon.tech>
2024-02-07 17:47:14 +01:00
Pierre Chapuis
eb9abefe07
add a few comments in SDLoraManager
2024-02-07 17:47:14 +01:00
Benjamin Trom
bbe0759151
fix docstring
2024-02-07 16:13:01 +01:00
limiteinductive
2e526d35d1
Make Dataset part of the trainer
2024-02-07 16:13:01 +01:00
Laurent
9883f24f9a
(fluxion/layers) remove View layer
...
+ replace existing `View` layers by `Reshape`
2024-02-07 12:06:07 +01:00
limiteinductive
2ef4982e04
remove wandb from base config
2024-02-07 11:06:59 +01:00
Pierre Chapuis
11da76f7df
fix sdxl structural copy
2024-02-07 10:51:26 +01:00
Pierre Chapuis
ca9e89b22a
cosmetics
2024-02-07 10:51:26 +01:00
limiteinductive
ea05f3d327
make device and dtype work in Trainer class
2024-02-06 23:10:10 +01:00
Pierre Chapuis
98fce82853
fix 37425fb609
...
Things to understand:
- subscripted generic basic types (e.g. `list[int]`) are types.GenericAlias;
- subscripted generic classes are `typing._GenericAlias`;
- neither can be used with `isinstance()`;
- get_origin is the cleanest way to check for this.
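The four points above can be verified with the standard library alone; a short demonstration (the `Box` class is an illustrative stand-in for any user-defined generic):

```python
import types
from typing import Generic, TypeVar, get_origin

T = TypeVar("T")

class Box(Generic[T]):  # illustrative generic class
    pass

# subscripted generic builtins are types.GenericAlias
assert isinstance(list[int], types.GenericAlias)

# subscripted generic classes come from typing (typing._GenericAlias)
assert type(Box[int]).__module__ == "typing"

# neither can be used with isinstance()
try:
    isinstance([], list[int])
except TypeError:
    pass  # "isinstance() argument 2 cannot be a parameterized generic"

# get_origin is the cleanest way to detect subscripted generics
assert get_origin(list[int]) is list
assert get_origin(Box[int]) is Box
assert get_origin(int) is None
```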
2024-02-06 13:49:37 +01:00
Pierre Chapuis
37425fb609
make LoRA generic
2024-02-06 11:32:18 +01:00
Pierre Chapuis
471ef91d1c
make __getattr__ on Module return object, not Any
...
PyTorch chose to make it Any because they expect its users' code
to be "highly dynamic": https://github.com/pytorch/pytorch/pull/104321
It is not the case for us, in Refiners having untyped code
goes contrary to one of our core principles.
Note that there is currently an open PR in PyTorch to
return `Module | Tensor`, but in practice this is not always
correct either: https://github.com/pytorch/pytorch/pull/115074
I also moved Residuals-related code from SD1 to latent_diffusion
because SDXL should not depend on SD1.
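A minimal sketch (not the actual Refiners Module) of what typing `__getattr__` as returning `object` buys: callers must narrow the type explicitly instead of silently receiving `Any`:

```python
from typing import cast

class Container:
    """Illustrative stand-in for a module whose __getattr__ returns object."""

    def __getattr__(self, name: str) -> object:
        return {"scale": 0.5}[name]

c = Container()
value = c.scale                  # static type is `object`, not Any
assert isinstance(value, float)  # runtime narrowing keeps the code typed
scale = cast(float, c.scale)     # or an explicit static cast
assert scale == 0.5
```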
2024-02-06 11:32:18 +01:00
Laurent
8d190e4256
(fluxion/layers/activations) replace ApproximateGeLU by GeLUApproximation
2024-02-02 19:41:18 +01:00
Pierre Chapuis
fbb1fcb8ff
Chain#pop does not return tuples
2024-02-02 18:11:51 +01:00
Laurent
1dcb36e1e0
(doc/foundationals) add IPAdapter, related docstrings
2024-02-02 17:35:03 +01:00
Laurent
6b35f1cc84
(doc/foundationals) add SDLoraManager, related docstrings
2024-02-02 17:35:03 +01:00
Laurent
7406d8e01f
(mkdocs) fix cross-reference typo
2024-02-02 17:35:03 +01:00
Laurent
093527a7de
apply @deltheil suggestions
2024-02-02 17:35:03 +01:00
Laurent
f62e71da1c
(doc/foundationals) add SegmentAnything, related docstrings
2024-02-02 17:35:03 +01:00
Laurent
a926696141
(doc/foundationals) add CLIP, related docstrings
2024-02-02 17:35:03 +01:00
Laurent
3910845e29
(doc/foundationals) add DINOv2, related docstrings
2024-02-02 17:35:03 +01:00
Laurent
fc7b4dd62d
(doc/fluxion/ld) add DDPM, DDIM, DPM++ and Euler docstrings
2024-02-02 17:35:03 +01:00
Laurent
a1a00998ea
(doc/fluxion/ld) add StableDiffusion_1 docstrings
2024-02-02 17:35:03 +01:00
Laurent
f2bcb7f45e
(mkdocstrings) export SDXLAutoencoder in src/refiners/foundationals/latent_diffusion/stable_diffusion_xl/__init__.py
2024-02-02 17:35:03 +01:00
Laurent
2a7b86ac02
(doc/fluxion/ld) add LatentDiffusionAutoencoder docstrings
2024-02-02 17:35:03 +01:00
Laurent
effd95a1bd
(doc/fluxion/ld) add SDXLAutoencoder docstrings
2024-02-02 17:35:03 +01:00
Laurent
0c5a7a8269
(doc/fluxion/ld) add Solver docstrings
2024-02-02 17:35:03 +01:00
Laurent
289261f2fb
(doc/fluxion/ld) add SD1UNet docstrings
2024-02-02 17:35:03 +01:00
Laurent
fae08c058e
(doc/fluxion/ld) add SDXLUNet docstrings
2024-02-02 17:35:03 +01:00
Laurent
7307a3686e
(docstrings) apply @deltheil suggestions
2024-02-02 11:08:21 +01:00
Laurent
fe53cda5e2
(doc/fluxion/adapter) add/convert docstrings to mkdocstrings format
2024-02-02 11:08:21 +01:00
Laurent
4054421854
(doc/fluxion/lora) add/convert docstrings to mkdocstrings format
2024-02-02 11:08:21 +01:00
Laurent
9fb9df5f91
(doc/fluxion) export Activation and ScaledDotProductAttention
2024-02-02 11:08:21 +01:00
Laurent
c7fd1496b5
(doc/fluxion/chain) add/convert docstrings to mkdocstrings format
2024-02-02 11:08:21 +01:00
Laurent
77b97b3c8e
(doc/fluxion/basic) add/convert docstrings to mkdocstrings format
2024-02-02 11:08:21 +01:00
Laurent
a7c048f5fb
(doc/fluxion/activations) add/convert docstrings to mkdocstrings format
2024-02-02 11:08:21 +01:00
Laurent
0fc3264fae
(doc/fluxion/attention) add/convert docstrings to mkdocstrings format
2024-02-02 11:08:21 +01:00
Laurent
e3238a6af5
(doc/fluxion/module) add/convert docstrings to mkdocstrings format
2024-02-02 11:08:21 +01:00
Laurent
c31da03bad
(doc/fluxion/model_converter) add/convert docstrings to mkdocstrings format
2024-02-02 11:08:21 +01:00
Laurent
9590703f99
(doc/fluxion/context) add/convert docstrings to mkdocstrings format
2024-02-02 11:08:21 +01:00
Laurent
e79c2bdde5
(doc/fluxion/utils) add/convert docstrings to mkdocstrings format
2024-02-02 11:08:21 +01:00
Laurent
12a8dd6c85
(doc/fluxion/linear) add/convert docstrings to mkdocstrings format
2024-02-02 11:08:21 +01:00
Laurent
cf20621894
(doc/fluxion/conv) add/convert docstrings to mkdocstrings format
2024-02-02 11:08:21 +01:00
Laurent
beb6dfb1c4
(doc/fluxion/embedding) add/convert docstrings to mkdocstrings format
2024-02-02 11:08:21 +01:00
Laurent
fc824bd53d
(doc/fluxion/converter) add/convert docstrings to mkdocstrings format
2024-02-02 11:08:21 +01:00
Laurent
6d09974b8d
(doc/fluxion/padding) add/convert docstrings to mkdocstrings format
2024-02-02 11:08:21 +01:00
Laurent
08349d97d7
(doc/fluxion/sampling) add/convert docstrings to mkdocstrings format
2024-02-02 11:08:21 +01:00
Laurent
be75f68893
(doc/fluxion/pixelshuffle) add/convert docstrings to mkdocstrings format
2024-02-02 11:08:21 +01:00
Laurent
18682f8155
(doc/fluxion/maxpool) add/convert docstrings to mkdocstrings format
2024-02-02 11:08:21 +01:00
Laurent
49847658e9
(doc/fluxion/norm) add/convert docstrings to mkdocstrings format
2024-02-02 11:08:21 +01:00
Colle
4a6146bb6c
clip text, lda encode batch inputs
...
* text_encoder([str1, str2])
* lda decode_latents/encode_image image_to_latent/latent_to_image
* images_to_tensor, tensor_to_images
Co-authored-by: Cédric Deltheil <355031+deltheil@users.noreply.github.com>
2024-02-01 17:05:28 +01:00
Pierre Chapuis
12aa0b23f6
remove stochastic Euler
...
It was untested and likely doesn't work.
We will re-introduce it later if needed.
2024-02-01 16:17:07 +01:00
Pierre Chapuis
5ac5373310
add a test for SDXL with sliced attention
2024-02-01 16:17:07 +01:00
Pierre Chapuis
df843f5226
test SAG setter
2024-02-01 16:17:07 +01:00
Pierre Chapuis
0e77ef1720
add inject / eject test for concept extender (+ better errors)
2024-02-01 16:17:07 +01:00
Pierre Chapuis
bca50b71f2
test (and fix) basic_attributes
2024-02-01 16:17:07 +01:00
Pierre Chapuis
be961af4d9
remove Chain.__add__
2024-02-01 16:17:07 +01:00
Pierre Chapuis
07954a55ab
remove unused Conv1d
2024-02-01 16:17:07 +01:00
Pierre Chapuis
86867e9318
remove unused class CrossAttention in SAM
2024-02-01 16:17:07 +01:00
Pierre Chapuis
a1ad317b00
remove Buffer
2024-02-01 16:17:07 +01:00
Pierre Chapuis
e6be1394ff
remove unused Chunk and Unbind layers
2024-02-01 16:17:07 +01:00
Pierre Chapuis
c57f2228f8
remove unused helper (since LoRA refactoring)
2024-02-01 16:17:07 +01:00
Pierre Chapuis
ae19892d1d
remove unused ViT variations
2024-02-01 16:17:07 +01:00
Pierre Chapuis
849c0058df
remove unused dunder methods on ContextProvider
2024-02-01 16:17:07 +01:00
limiteinductive
abe50076a4
add NoiseSchedule to solvers __init__ + simplify some import pathing
...
further improve import pathing
2024-01-31 17:03:52 +01:00
limiteinductive
73f6ccfc98
make Scheduler a fl.Module + Change name Scheduler -> Solver
2024-01-31 17:03:52 +01:00
Pierre Chapuis
7eb8eb4c68
add support for pytorch 2.2 (2.1 is still supported)
...
also bump all dev dependencies to their latest version
2024-01-31 15:03:06 +01:00
Cédric Deltheil
ca5c5a7ca5
add helper for multiple image prompts
2024-01-31 11:03:49 +01:00
Cédric Deltheil
fd01ba910e
fix minor typos in code and docs
2024-01-30 09:52:40 +01:00
Cédric Deltheil
feff4c78ae
segment-anything: fix class name typo
...
Note: weights are impacted
2024-01-30 09:52:40 +01:00
Pierre Chapuis
deb5e930ae
fix exclusions for Downsample and Upsample
...
(also simplify list comprehension for exclusion list)
2024-01-29 11:11:14 +01:00
Pierre Chapuis
83c95fcf44
fix sorting method for LoRA keys
...
- support _out_0
- sort _in before _out
- avoid false positives by only considering suffixes
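A hypothetical sketch (not the actual SDLoraManager code) of a sort key following the three rules above:

```python
def suffix_rank(key: str) -> int:
    """Hypothetical sort helper: match markers only as suffixes."""
    if key.endswith("_in"):
        return 0
    # "_out_0" does not end with "_out", so it needs its own check
    if key.endswith("_out") or key.endswith("_out_0"):
        return 1
    return 2

keys = ["block_out_0", "block_out", "block_in", "layers_without_marker"]
ordered = sorted(keys, key=suffix_rank)  # "_in" keys sort first
```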
2024-01-29 11:11:14 +01:00
Pierre Chapuis
ce22c8f51b
fix detection of unet-only LoRAs
2024-01-29 11:11:14 +01:00
Pierre Chapuis
ce0339b4cc
add a get_path helper to modules
2024-01-26 19:31:13 +01:00
limiteinductive
0ee2d5e075
Fix warmup steps calculation when gradient_accumulation is used
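The pitfall can be sketched with a hypothetical helper (not the Trainer code): a warmup expressed in training (micro-batch) steps must be converted to optimizer steps when gradients are accumulated:

```python
def warmup_optimizer_steps(warmup_training_steps: int, gradient_accumulation: int) -> int:
    # hypothetical conversion: one optimizer step happens every
    # `gradient_accumulation` micro-batch steps
    return warmup_training_steps // gradient_accumulation

# with an accumulation factor of 4, a warmup of 1000 training steps
# corresponds to 250 optimizer steps, not 1000
assert warmup_optimizer_steps(1000, 4) == 250
```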
2024-01-25 12:20:36 +01:00
Bryce
12a5439fc4
refactor: rename noise => predicted_noise
...
and in euler, `alt_noise` can now be simply `noise`
2024-01-24 18:15:10 +01:00
Cédric Deltheil
695c24dd3a
image_prompt: remove obsolete comment
...
Not needed anymore since #168 (CrossAttentionAdapter refactoring)
2024-01-24 09:44:00 +01:00
limiteinductive
421da6a3b6
Load Multiple LoRAs with SDLoraManager
2024-01-23 14:12:03 +01:00
Pierre Chapuis
fb2f0e28d4
add rebuild() to Scheduler interface
...
for use in `set_inference_steps()`
2024-01-23 11:11:50 +01:00
Pierre Chapuis
a5c665462a
add missing constructor arguments to DDPM scheduler
2024-01-23 11:11:50 +01:00
limiteinductive
ed3621362f
Add load_tensors utils in fluxion
2024-01-21 12:34:33 +01:00
Pierre Colle
91aea9b7ff
fix: summarize_tensor(tensor) when tensor.numel() == 0
2024-01-20 14:32:35 +01:00
Pierre Chapuis
f6beee8388
tweak docstring for DDPM
2024-01-19 19:01:02 +01:00
Pierre Chapuis
8a36c8c279
make the first diffusion step a first class property of LDM & Schedulers
2024-01-19 18:52:45 +01:00
Pierre Chapuis
de6266010d
fix typo (sinuosoidal -> sinusoidal)
2024-01-19 14:37:45 +01:00
Pierre Chapuis
d34e36797b
fix typo (sinuosidal -> sinusoidal)
2024-01-19 14:27:37 +01:00
Cédric Deltheil
fde61757fb
summarize_tensor: fix minor warning
...
Calling `tensor.float()` on a complex tensor raises a warning:
UserWarning: Casting complex values to real discards the imaginary
part (Triggered internally at ../aten/src/ATen/native/Copy.cpp:299.)
Follow up of #171
2024-01-19 11:34:47 +01:00