Pierre Chapuis
eb9abefe07
add a few comments in SDLoraManager
2024-02-07 17:47:14 +01:00
Benjamin Trom
bbe0759151
fix docstring
2024-02-07 16:13:01 +01:00
limiteinductive
2e526d35d1
Make Dataset part of the trainer
2024-02-07 16:13:01 +01:00
Laurent
9883f24f9a
(fluxion/layers) remove View layer
...
+ replace existing `View` layers by `Reshape`
2024-02-07 12:06:07 +01:00
limiteinductive
2ef4982e04
remove wandb from base config
2024-02-07 11:06:59 +01:00
Pierre Chapuis
11da76f7df
fix sdxl structural copy
2024-02-07 10:51:26 +01:00
Pierre Chapuis
ca9e89b22a
cosmetics
2024-02-07 10:51:26 +01:00
limiteinductive
ea05f3d327
make device and dtype work in Trainer class
2024-02-06 23:10:10 +01:00
Pierre Chapuis
98fce82853
fix 37425fb609
...
Things to understand:
- subscripted generic basic types (e.g. `list[int]`) are `types.GenericAlias`;
- subscripted generic classes are `typing._GenericAlias`;
- neither can be used with `isinstance()`;
- `get_origin` is the cleanest way to check for this.
2024-02-06 13:49:37 +01:00
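A minimal illustration of the notes in the commit body above (plain stdlib typing, not code from this repository):

```python
from typing import Generic, TypeVar, get_origin

T = TypeVar("T")

class Box(Generic[T]):  # a user-defined generic class, for illustration only
    pass

assert get_origin(list[int]) is list  # list[int] is a types.GenericAlias
assert get_origin(Box[int]) is Box    # Box[int] is a typing._GenericAlias

try:
    isinstance([], list[int])
except TypeError as error:
    print(error)  # isinstance() argument 2 cannot be a parameterized generic
```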
Pierre Chapuis
37425fb609
make LoRA generic
2024-02-06 11:32:18 +01:00
Pierre Chapuis
471ef91d1c
make __getattr__ on Module return object, not Any
...
PyTorch chose to make it Any because they expect their users' code
to be "highly dynamic": https://github.com/pytorch/pytorch/pull/104321
This is not the case for us: in Refiners, untyped code goes against
one of our core principles.
Note that there is currently an open PR in PyTorch to
return `Module | Tensor`, but in practice this is not always
correct either: https://github.com/pytorch/pytorch/pull/115074
I also moved Residuals-related code from SD1 to latent_diffusion
because SDXL should not depend on SD1.
2024-02-06 11:32:18 +01:00
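A hedged sketch of the typing decision described above; `StrictModule` is a hypothetical stand-in for Refiners' own Module, and depending on the installed torch stubs a type checker may require an ignore on the override:

```python
from typing import cast

import torch
from torch import nn

class StrictModule(nn.Module):
    # Returning `object` instead of `Any` forces call sites to narrow explicitly.
    def __getattr__(self, name: str) -> object:
        return super().__getattr__(name)

module = StrictModule()
module.register_buffer("scale", torch.ones(1))
scale = cast(torch.Tensor, module.scale)  # explicit narrowing, no silent Any
print(scale.shape)  # torch.Size([1])
```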
Laurent
8d190e4256
(fluxion/layers/activations) replace ApproximateGeLU by GeLUApproximation
2024-02-02 19:41:18 +01:00
Pierre Chapuis
fbb1fcb8ff
Chain#pop does not return tuples
2024-02-02 18:11:51 +01:00
Laurent
1dcb36e1e0
(doc/foundationals) add IPAdapter, related docstrings
2024-02-02 17:35:03 +01:00
Laurent
6b35f1cc84
(doc/foundationals) add SDLoraManager, related docstrings
2024-02-02 17:35:03 +01:00
Laurent
7406d8e01f
(mkdocs) fix cross-reference typo
2024-02-02 17:35:03 +01:00
Laurent
093527a7de
apply @deltheil suggestions
2024-02-02 17:35:03 +01:00
Laurent
f62e71da1c
(doc/foundationals) add SegmentAnything, related docstrings
2024-02-02 17:35:03 +01:00
Laurent
a926696141
(doc/foundationals) add CLIP, related docstrings
2024-02-02 17:35:03 +01:00
Laurent
3910845e29
(doc/foundationals) add DINOv2, related docstrings
2024-02-02 17:35:03 +01:00
Laurent
fc7b4dd62d
(doc/fluxion/ld) add DDPM, DDIM, DPM++ and Euler docstrings
2024-02-02 17:35:03 +01:00
Laurent
a1a00998ea
(doc/fluxion/ld) add StableDiffusion_1 docstrings
2024-02-02 17:35:03 +01:00
Laurent
f2bcb7f45e
(mkdocstrings) export SDXLAutoencoder in src/refiners/foundationals/latent_diffusion/stable_diffusion_xl/__init__.py
2024-02-02 17:35:03 +01:00
Laurent
2a7b86ac02
(doc/fluxion/ld) add LatentDiffusionAutoencoder docstrings
2024-02-02 17:35:03 +01:00
Laurent
effd95a1bd
(doc/fluxion/ld) add SDXLAutoencoder docstrings
2024-02-02 17:35:03 +01:00
Laurent
0c5a7a8269
(doc/fluxion/ld) add Solver docstrings
2024-02-02 17:35:03 +01:00
Laurent
289261f2fb
(doc/fluxion/ld) add SD1UNet docstrings
2024-02-02 17:35:03 +01:00
Laurent
fae08c058e
(doc/fluxion/ld) add SDXLUNet docstrings
2024-02-02 17:35:03 +01:00
Laurent
7307a3686e
(docstrings) apply @deltheil suggestions
2024-02-02 11:08:21 +01:00
Laurent
fe53cda5e2
(doc/fluxion/adapter) add/convert docstrings to mkdocstrings format
2024-02-02 11:08:21 +01:00
Laurent
4054421854
(doc/fluxion/lora) add/convert docstrings to mkdocstrings format
2024-02-02 11:08:21 +01:00
Laurent
9fb9df5f91
(doc/fluxion) export Activation and ScaledDotProductAttention
2024-02-02 11:08:21 +01:00
Laurent
c7fd1496b5
(doc/fluxion/chain) add/convert docstrings to mkdocstrings format
2024-02-02 11:08:21 +01:00
Laurent
77b97b3c8e
(doc/fluxion/basic) add/convert docstrings to mkdocstrings format
2024-02-02 11:08:21 +01:00
Laurent
a7c048f5fb
(doc/fluxion/activations) add/convert docstrings to mkdocstrings format
2024-02-02 11:08:21 +01:00
Laurent
0fc3264fae
(doc/fluxion/attention) add/convert docstrings to mkdocstrings format
2024-02-02 11:08:21 +01:00
Laurent
e3238a6af5
(doc/fluxion/module) add/convert docstrings to mkdocstrings format
2024-02-02 11:08:21 +01:00
Laurent
c31da03bad
(doc/fluxion/model_converter) add/convert docstrings to mkdocstrings format
2024-02-02 11:08:21 +01:00
Laurent
9590703f99
(doc/fluxion/context) add/convert docstrings to mkdocstrings format
2024-02-02 11:08:21 +01:00
Laurent
e79c2bdde5
(doc/fluxion/utils) add/convert docstrings to mkdocstrings format
2024-02-02 11:08:21 +01:00
Laurent
12a8dd6c85
(doc/fluxion/linear) add/convert docstrings to mkdocstrings format
2024-02-02 11:08:21 +01:00
Laurent
cf20621894
(doc/fluxion/conv) add/convert docstrings to mkdocstrings format
2024-02-02 11:08:21 +01:00
Laurent
beb6dfb1c4
(doc/fluxion/embedding) add/convert docstrings to mkdocstrings format
2024-02-02 11:08:21 +01:00
Laurent
fc824bd53d
(doc/fluxion/converter) add/convert docstrings to mkdocstrings format
2024-02-02 11:08:21 +01:00
Laurent
6d09974b8d
(doc/fluxion/padding) add/convert docstrings to mkdocstrings format
2024-02-02 11:08:21 +01:00
Laurent
08349d97d7
(doc/fluxion/sampling) add/convert docstrings to mkdocstrings format
2024-02-02 11:08:21 +01:00
Laurent
be75f68893
(doc/fluxion/pixelshuffle) add/convert docstrings to mkdocstrings format
2024-02-02 11:08:21 +01:00
Laurent
18682f8155
(doc/fluxion/maxpool) add/convert docstrings to mkdocstrings format
2024-02-02 11:08:21 +01:00
Laurent
49847658e9
(doc/fluxion/norm) add/convert docstrings to mkdocstrings format
2024-02-02 11:08:21 +01:00
Colle
4a6146bb6c
clip text, lda encode batch inputs
...
* text_encoder([str1, str2])
* lda decode_latents/encode_image image_to_latent/latent_to_image
* images_to_tensor, tensor_to_images
---------
Co-authored-by: Cédric Deltheil <355031+deltheil@users.noreply.github.com>
2024-02-01 17:05:28 +01:00
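A small usage sketch of the batched helpers named in the commit body; the import path `refiners.fluxion.utils` and the returned shape are assumptions, and the `text_encoder` / `lda` calls are omitted because they require model weights:

```python
from PIL import Image
from refiners.fluxion.utils import images_to_tensor, tensor_to_images  # assumed location

images = [Image.new("RGB", (64, 64), color) for color in ("red", "blue")]
batch = images_to_tensor(images)    # assumed to stack into a (2, 3, 64, 64) tensor
restored = tensor_to_images(batch)  # back to a list of PIL images
print(batch.shape, len(restored))
```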
Pierre Chapuis
12aa0b23f6
remove stochastic Euler
...
It was untested and likely doesn't work.
We will re-introduce it later if needed.
2024-02-01 16:17:07 +01:00
Pierre Chapuis
5ac5373310
add a test for SDXL with sliced attention
2024-02-01 16:17:07 +01:00
Pierre Chapuis
df843f5226
test SAG setter
2024-02-01 16:17:07 +01:00
Pierre Chapuis
0e77ef1720
add inject / eject test for concept extender (+ better errors)
2024-02-01 16:17:07 +01:00
Pierre Chapuis
bca50b71f2
test (and fix) basic_attributes
2024-02-01 16:17:07 +01:00
Pierre Chapuis
be961af4d9
remove Chain.__add__
2024-02-01 16:17:07 +01:00
Pierre Chapuis
07954a55ab
remove unused Conv1d
2024-02-01 16:17:07 +01:00
Pierre Chapuis
86867e9318
remove unused class CrossAttention in SAM
2024-02-01 16:17:07 +01:00
Pierre Chapuis
a1ad317b00
remove Buffer
2024-02-01 16:17:07 +01:00
Pierre Chapuis
e6be1394ff
remove unused Chunk and Unbind layers
2024-02-01 16:17:07 +01:00
Pierre Chapuis
c57f2228f8
remove unused helper (since LoRA refactoring)
2024-02-01 16:17:07 +01:00
Pierre Chapuis
ae19892d1d
remove unused ViT variations
2024-02-01 16:17:07 +01:00
Pierre Chapuis
849c0058df
remove unused dunder methods on ContextProvider
2024-02-01 16:17:07 +01:00
limiteinductive
abe50076a4
add NoiseSchedule to solvers __init__ + simplify some import pathing
...
further improve import pathing
2024-01-31 17:03:52 +01:00
limiteinductive
73f6ccfc98
make Scheduler a fl.Module + Change name Scheduler -> Solver
2024-01-31 17:03:52 +01:00
Pierre Chapuis
7eb8eb4c68
add support for pytorch 2.2 (2.1 is still supported)
...
also bump all dev dependencies to their latest version
2024-01-31 15:03:06 +01:00
Cédric Deltheil
ca5c5a7ca5
add helper for multiple image prompts
2024-01-31 11:03:49 +01:00
Cédric Deltheil
fd01ba910e
fix minor typos in code and docs
2024-01-30 09:52:40 +01:00
Cédric Deltheil
feff4c78ae
segment-anything: fix class name typo
...
Note: weights are impacted
2024-01-30 09:52:40 +01:00
Pierre Chapuis
deb5e930ae
fix exclusions for Downsample and Upsample
...
(also simplify list comprehension for exclusion list)
2024-01-29 11:11:14 +01:00
Pierre Chapuis
83c95fcf44
fix sorting method for LoRA keys
...
- support _out_0
- sort _in before _out
- avoid false positives by only considering suffixes
2024-01-29 11:11:14 +01:00
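A hedged sketch of the sorting rule described above, not the actual SDLoraManager code: only the `_in_<i>` / `_out_<i>` suffix is inspected (avoiding false positives elsewhere in the key), and `_in` sorts before `_out`:

```python
import re

_SUFFIX = re.compile(r"_(in|out)_(\d+)$")  # only consider the suffix

def lora_sort_key(key: str) -> tuple[str, int, int]:
    match = _SUFFIX.search(key)
    if match is None:
        return (key, 0, 0)
    direction = 0 if match.group(1) == "in" else 1  # "_in" before "_out"
    return (key[: match.start()], direction, int(match.group(2)))

keys = ["block1_out_1", "block1_out_0", "block1_in_0"]
print(sorted(keys, key=lora_sort_key))
# ['block1_in_0', 'block1_out_0', 'block1_out_1']
```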
Pierre Chapuis
ce22c8f51b
fix detection of unet-only LoRAs
2024-01-29 11:11:14 +01:00
Pierre Chapuis
ce0339b4cc
add a get_path helper to modules
2024-01-26 19:31:13 +01:00
limiteinductive
0ee2d5e075
Fix warmup steps calculation when gradient_accumulation is used
2024-01-25 12:20:36 +01:00
Bryce
12a5439fc4
refactor: rename noise => predicted_noise
...
and in euler, `alt_noise` can now be simply `noise`
2024-01-24 18:15:10 +01:00
Cédric Deltheil
695c24dd3a
image_prompt: remove obsolete comment
...
Not needed anymore since #168 (CrossAttentionAdapter refactoring)
2024-01-24 09:44:00 +01:00
limiteinductive
421da6a3b6
Load Multiple LoRAs with SDLoraManager
2024-01-23 14:12:03 +01:00
Pierre Chapuis
fb2f0e28d4
add rebuild() to Scheduler interface
...
for use in `set_inference_steps()`
2024-01-23 11:11:50 +01:00
Pierre Chapuis
a5c665462a
add missing constructor arguments to DDPM scheduler
2024-01-23 11:11:50 +01:00
limiteinductive
ed3621362f
Add load_tensors utils in fluxion
2024-01-21 12:34:33 +01:00
Pierre Colle
91aea9b7ff
fix: summarize_tensor(tensor) when tensor.numel() == 0
2024-01-20 14:32:35 +01:00
Pierre Chapuis
f6beee8388
tweak docstring for DDPM
2024-01-19 19:01:02 +01:00
Pierre Chapuis
8a36c8c279
make the first diffusion step a first class property of LDM & Schedulers
2024-01-19 18:52:45 +01:00
Pierre Chapuis
de6266010d
fix typo (sinuosoidal -> sinusoidal)
2024-01-19 14:37:45 +01:00
Pierre Chapuis
d34e36797b
fix typo (sinuosidal -> sinusoidal)
2024-01-19 14:27:37 +01:00
Cédric Deltheil
fde61757fb
summarize_tensor: fix minor warning
...
Calling `tensor.float()` on a complex tensor raises a warning:
UserWarning: Casting complex values to real discards the imaginary
part (Triggered internally at ../aten/src/ATen/native/Copy.cpp:299.)
Follow up of #171
2024-01-19 11:34:47 +01:00
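A minimal reproduction of the warning quoted above, plus the kind of guard that avoids it; this is an illustration, not necessarily the exact fix applied to summarize_tensor:

```python
import torch

tensor = torch.randn(4, dtype=torch.complex64)
# tensor.float() would warn: "Casting complex values to real discards the imaginary part"
stats_input = tensor.abs().float() if tensor.is_complex() else tensor.float()
print(stats_input.mean(), stats_input.std())
```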
Pierre Chapuis
999e429697
fix bug in dpm_solver_first_order_update
2024-01-18 19:23:11 +01:00
Pierre Chapuis
59db1f0bd5
style
...
- avoid useless multiple assignments
- use coherent variable names
2024-01-18 19:23:11 +01:00
Pierre Chapuis
aaddead17d
DPM: add a mode to use first order for last step
2024-01-18 19:23:11 +01:00
hugojarkoff
17d9701dde
Remove additional noise in final sample of DDIM inference process
2024-01-18 18:43:13 +01:00
limiteinductive
a1f50f3f9d
refactor Lora LoraAdapter and the latent_diffusion/lora file
2024-01-18 16:27:38 +01:00
hugojarkoff
a6a9c8b972
Fix Value dimension in ImageCrossAttention
2024-01-17 16:46:24 +01:00
limiteinductive
2b977bc69e
fix broken self-attention guidance with ip-adapter
...
The #168 and #177 refactorings caused this regression. A new end-to-end
test has been added for proper coverage.
(This fix will be revisited at some point)
2024-01-16 17:21:24 +01:00
limiteinductive
d9ae7ca6a5
cast to float32 before converting to image in tensor_to_image to fix bfloat16 conversion
2024-01-16 11:50:58 +01:00
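A hedged sketch of the conversion path described in the commit subject: NumPy has no bfloat16 dtype, so the tensor is cast to float32 before building the PIL image. This is an illustration, not the actual tensor_to_image implementation:

```python
import torch
from PIL import Image

def to_image(tensor: torch.Tensor) -> Image.Image:
    t = tensor.clamp(0, 1)
    if t.dtype == torch.bfloat16:
        t = t.to(torch.float32)  # calling .numpy() on bfloat16 would fail
    array = (t.permute(1, 2, 0).cpu().numpy() * 255).astype("uint8")
    return Image.fromarray(array)

print(to_image(torch.rand(3, 8, 8, dtype=torch.bfloat16)).size)  # (8, 8)
```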
Colle
457c3f5cbd
display weighted module dtype and device (#173)
...
Co-authored-by: Benjamin Trom <benjamintrom@gmail.com>
2024-01-11 22:37:35 +01:00
limiteinductive
14ce2f50f9
make trainer an abstract class
2024-01-11 18:19:18 +01:00
limiteinductive
deed703617
simplify even more CrossAttentionAdapter
...
Following Laurent2916's idea: see #167
2024-01-11 14:47:12 +01:00
limiteinductive
3ab8ed2989
remove unused script field from training BaseConfig
2024-01-11 12:28:47 +01:00
Colle
c141091afc
Make summarize_tensor robust to non-float dtypes (#171)
2024-01-11 09:57:58 +01:00
Cédric Deltheil
2b2b6740b7
fix or silence pyright issues
2024-01-10 16:53:06 +01:00
Cédric Deltheil
65f19d192f
ruff fix
2024-01-10 16:53:06 +01:00
Cédric Deltheil
ad143b0867
ruff format
2024-01-10 16:53:06 +01:00
Israfel Salazar
8423c5efa7
feature: Euler scheduler (#138)
2024-01-10 11:32:40 +01:00
limiteinductive
c9e973ba41
refactor CrossAttentionAdapter to work with context.
2024-01-08 15:20:23 +01:00
hugojarkoff
00f494efe2
SegmentAnything: add dense mask prompt support
2024-01-05 18:53:25 +01:00
limiteinductive
20c229903f
upgrade pyright to 1.1.342 ; improve no_grad typing
2023-12-29 15:09:02 +01:00
limiteinductive
12eef9cca5
remove default hf_repo from config
2023-12-20 16:58:12 +01:00
limiteinductive
6a1fac876b
remove huggingface datasets from default config
2023-12-20 16:58:12 +01:00
Cédric Deltheil
22ce3fd033
sam: wrap high-level methods with no_grad
2023-12-19 21:45:23 +01:00
Cédric Deltheil
68cc346905
add minimal unit tests for DINOv2
...
To be completed with tests using image preprocessing, e.g. test cosine
similarity on a relevant pair of images
2023-12-18 10:29:28 +01:00
Laureηt
9337d65e0e
feature: add DINOv2
...
Co-authored-by: Benjamin Trom <benjamintrom@gmail.com>
2023-12-14 17:27:32 +01:00
limiteinductive
7d9ceae274
change default behavior of end to None
2023-12-13 17:03:28 +01:00
Cédric Deltheil
82a2aa1ec4
deprecate DDPM step which is unused for now
2023-12-13 15:51:42 +01:00
limiteinductive
a7551e0392
Change fl.Slicing API
2023-12-13 09:38:13 +01:00
Cédric Deltheil
11b0ff6f8c
ddim: remove unused attribute
2023-12-12 17:26:14 +01:00
limiteinductive
7992258dd2
add before/after init callback to trainer
2023-12-12 10:22:55 +01:00
Pierre Chapuis
42a0fc4aa0
fix circular imports
2023-12-11 15:27:11 +01:00
Cédric Deltheil
792a0fc3d9
run lint rules using latest isort settings
2023-12-11 11:58:43 +01:00
Cédric Deltheil
4fc5e427b8
training_utils: fix extra detection
...
Requirements could be, e.g.:
wandb (>=0.15.7,<0.16.0) ; extra == "training"
Or:
wandb>=0.16.0; extra == 'training'
Follow up of 86c5497
2023-12-08 19:09:16 +01:00
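A hedged sketch of matching both requirement formats quoted above; the real training_utils code may do this differently:

```python
import re

def mentions_extra(requirement: str, extra: str) -> bool:
    # Accepts both `; extra == "training"` and `; extra == 'training'`.
    return re.search(rf"""extra\s*==\s*['"]{re.escape(extra)}['"]""", requirement) is not None

assert mentions_extra('wandb (>=0.15.7,<0.16.0) ; extra == "training"', "training")
assert mentions_extra("wandb>=0.16.0; extra == 'training'", "training")
```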
limiteinductive
86c54977b9
replace poetry with rye for Python dependency management
...
Co-authored-by: Cédric Deltheil <cedric@deltheil.me>
Co-authored-by: Pierre Chapuis <git@catwell.info>
2023-12-08 17:40:10 +01:00
limiteinductive
807ef5551c
refactor fl.Parameter basic layer
...
Co-authored-by: Cédric Deltheil <cedric@deltheil.me>
2023-12-08 10:20:34 +01:00
Cédric Deltheil
46b4b4b462
training_utils: fix naming issue timestep->step
2023-12-05 10:05:34 +01:00
limiteinductive
1075ea4a62
fix ddpm and ddim __init__
2023-12-04 15:27:06 +01:00
limiteinductive
ad8f02e555
add Karras sampling to the Scheduler abstract class; the default is quadratic
2023-12-04 15:27:06 +01:00
Pierre Chapuis
f22f969d65
remove Black preview mode
...
also fix multiline logs in training
2023-12-04 14:15:56 +01:00
Bryce
4176868e79
feature: add sliced-attention for memory efficiency
...
This allowed me to produce HD images on an M1 (32 GB) and 7000x5000 images on an Nvidia 4090.
I saw no visual difference in the generated images.
Some data points on slice_size:
# 4096 max needed for SD 1.5 512x512
# 9216 max needed for SD 1.5 768x768
# 16384 max needed for SD 1.5 1024x1024
# 32400 max needed for SD 1.5 1920x1080 (HD)
# 129600 max needed for SD 1.5 3840x2160 (4k)
# 234375 max needed for SD 1.5 5000x3000
2023-12-01 15:30:23 +01:00
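A hedged sketch of the memory-saving idea behind slice_size (slicing the attention computation along the query dimension); it is not the code from this commit, only an illustration of why the peak attention matrix stays bounded:

```python
import torch
import torch.nn.functional as F

def sliced_attention(q: torch.Tensor, k: torch.Tensor, v: torch.Tensor, slice_size: int) -> torch.Tensor:
    # q, k, v: (batch, heads, sequence, dim)
    chunks = []
    for start in range(0, q.shape[2], slice_size):
        q_slice = q[:, :, start : start + slice_size]
        # each slice only materializes a (slice_size, sequence) attention matrix
        chunks.append(F.scaled_dot_product_attention(q_slice, k, v))
    return torch.cat(chunks, dim=2)

q = k = v = torch.randn(1, 8, 4096, 64)
print(sliced_attention(q, k, v, slice_size=1024).shape)  # torch.Size([1, 8, 4096, 64])
```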
Benjamin Trom
2d4c4774f4
add maxpool to refiners layers
2023-11-20 10:58:53 +01:00
Bryce
f666bc82f5
feature: support self-attention guidance with SD1 inpainting model
2023-11-20 10:17:15 +01:00
Cédric Deltheil
ab0915d052
add tests for FreeU
2023-11-18 16:15:44 +01:00
Benjamin Trom
6eeb01137d
Add Adapter in refiners.fluxion.adapters init
2023-11-18 13:54:40 +01:00
isamu-isozaki
770879a6df
Free U
2023-11-17 17:22:20 +01:00
Cédric Deltheil
fc71e900a0
black
2023-10-21 13:51:06 +02:00
Benjamin Trom
ea44262a39
unnest Residual subchain by modifying its forward
...
Also replaced the remaining Sum-Identity layers with Residual.
The tolerance used to compare SAM's ViT models has been tweaked: for
some reason there is a small difference (in float32) in the neck layer
(first Conv2d).
Co-authored-by: Cédric Deltheil <cedric@deltheil.me>
2023-10-19 10:34:51 +02:00
Benjamin Trom
6ddd901767
improve image_to_tensor and tensor_to_image utils
2023-10-17 18:08:58 +02:00
limiteinductive
585c7ad55a
improve consistency of the dpm scheduler
2023-10-12 15:48:43 +02:00
limiteinductive
7a62049d54
implement Restart method for latent diffusion
2023-10-12 15:48:43 +02:00
Cédric Deltheil
ac631bfb2e
make self attention guidance idempotent
...
Follow up of d3365d6
2023-10-11 10:47:22 +02:00
Benjamin Trom
0024191c58
improve debug print for chains
2023-10-10 15:25:09 +02:00
Benjamin Trom
a663375dc7
prevent setattr pytorch module to register on the Chain class
2023-10-10 14:46:15 +02:00
Cédric Deltheil
455be5a4be
remove TODO related to older pyright version
2023-10-10 14:19:47 +02:00
Cédric Deltheil
b80769939d
add support for self-attention guidance
...
See https://arxiv.org/abs/2210.00939
2023-10-09 17:33:15 +02:00
Cédric Deltheil
05126c8f4d
make gaussian_blur work with float16
2023-10-07 21:48:38 +02:00
Cédric Deltheil
7d2abf6fbc
scheduler: add remove noise
...
aka original sample prediction (or predict x0)
E.g. useful for methods like self-attention guidance (see equation (2)
in https://arxiv.org/pdf/2210.00939.pdf )
2023-10-05 17:05:15 +02:00
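A hedged sketch of the predict-x0 relation mentioned above (equation (2) of the referenced paper), with illustrative variable names rather than the scheduler's actual attributes:

```python
import torch

def remove_noise(x_t: torch.Tensor, predicted_noise: torch.Tensor, alpha_bar_t: torch.Tensor) -> torch.Tensor:
    # x0 = (x_t - sqrt(1 - alpha_bar_t) * eps) / sqrt(alpha_bar_t)
    return (x_t - (1 - alpha_bar_t).sqrt() * predicted_noise) / alpha_bar_t.sqrt()

x_t = torch.randn(1, 4, 64, 64)
eps = torch.randn_like(x_t)
print(remove_noise(x_t, eps, torch.tensor(0.5)).shape)  # torch.Size([1, 4, 64, 64])
```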
Cédric Deltheil
0dfa23fa53
fluxion: add gaussian_blur to utils
2023-10-05 16:30:27 +02:00
Cédric Deltheil
f4298f87d2
pad: add optional padding mode
2023-10-05 11:10:37 +02:00
Cédric Deltheil
9b1e25e682
t2i_adapter: minor type annotation fix
2023-10-04 16:28:18 +02:00
Cédric Deltheil
9fbe86fbc9
make set_scale for T2I-Adapter really dynamic
...
Before this change, `set_scale` only had an impact on the condition
encoder, so calling `set_scale` after `set_condition_features` had no
effect at runtime.
2023-10-04 11:30:09 +02:00
Cédric Deltheil
694661ee10
ip-adapter: add set_scale
2023-10-02 11:49:12 +02:00
Cédric Deltheil
5fc6767a4a
add IP-Adapter plus (aka fine-grained features)
2023-09-29 15:23:43 +02:00
Cédric Deltheil
88e454f1cb
Distribute: improve sanity check error message
...
E.g.:
AssertionError: Number of positional arguments (1) must match number of sub-modules (2).
2023-09-28 14:06:06 +02:00
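A hedged sketch of the kind of sanity check behind the quoted message; Fluxion's actual Distribute is chain-based and is not reproduced here:

```python
import torch
from torch import nn

class Distribute(nn.ModuleList):
    def forward(self, *args: torch.Tensor) -> tuple[torch.Tensor, ...]:
        assert len(args) == len(self), (
            f"Number of positional arguments ({len(args)}) "
            f"must match number of sub-modules ({len(self)})."
        )
        return tuple(module(arg) for module, arg in zip(self, args))

distribute = Distribute([nn.Linear(2, 2), nn.Linear(3, 3)])
try:
    distribute(torch.ones(2))  # one argument, two sub-modules
except AssertionError as error:
    print(error)
```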
Cédric Deltheil
14864857b1
add T2I-Adapter to foundationals/latent_diffusion
2023-09-25 13:54:26 +02:00
Cédric Deltheil
d72e1d3478
chain: add insert_before_type
2023-09-25 13:54:26 +02:00
Cédric Deltheil
4352e78483
add pixel unshuffle to fluxion's layers
2023-09-25 13:54:26 +02:00
Doryan Kaced
251277a0a8
Fix module registration in IP-Adapter
2023-09-22 17:34:55 +02:00
Pierre Chapuis
72854de669
fix device in DDPM / DDIM timesteps
2023-09-21 17:42:49 +02:00
Pierre Chapuis
fad4f371ea
correctly initialize context in structural_copy
...
fixes a regression introduced in 1cb798e8ae
2023-09-21 12:02:37 +02:00
Pierre Chapuis
cd1fdb5585
fix scheduler device choice
2023-09-21 12:00:19 +02:00
Benjamin Trom
282578ddc0
add Segment Anything (SAM) to foundational models
...
Note: dense prompts (i.e. masks) support is still partial (see MaskEncoder)
Co-authored-by: Cédric Deltheil <cedric@deltheil.me>
2023-09-21 11:44:30 +02:00
Cédric Deltheil
2faff9f57a
ldm: properly resize non-square init image
2023-09-20 10:27:22 +02:00
Benjamin Trom
85095418aa
implement multi_diffusion for SD1 and SDXL
2023-09-19 15:30:50 +02:00
Benjamin Trom
b86521da2f
implement abstract MultiDiffusion class
2023-09-19 15:30:50 +02:00
Cédric Deltheil
e319f13d05
utils: remove inplace opt-in from normalize
2023-09-18 18:07:20 +02:00
Cédric Deltheil
bce3910383
utils: simplify normalize a bit
2023-09-18 18:07:20 +02:00
Cédric Deltheil
d6046e1fbf
move image tensor normalize under fluxion's utils
2023-09-18 18:07:20 +02:00
Benjamin Trom
dc1fc239aa
show an ellipsis when a chain has been shortened because of depth, and count siblings with the same class name
2023-09-15 02:08:50 +02:00
Benjamin Trom
1cb798e8ae
remove structural_attrs
2023-09-14 14:49:06 +02:00
Benjamin Trom
121ef4df39
add is_optimized option for attention
2023-09-14 14:12:27 +02:00
Pierre Chapuis
0e0c39b4b5
black
2023-09-13 17:02:47 +02:00
Cédric Deltheil
eea340c6c4
add support for SDXL IP-Adapter
...
This only supports the latest SDXL IP-Adapter release (2023.9.8) which
builds upon the ViT-H/14 CLIP image encoder.
2023-09-12 18:00:39 +02:00
Cédric Deltheil
1b4dcebe06
make scheduler an actual abstract base class
2023-09-12 16:47:47 +02:00
Cédric Deltheil
12e37f5d85
controlnet: replace Lambda w/ Slicing basic layer
2023-09-12 15:37:33 +02:00
Pierre Chapuis
7a32699cc6
add ensure_find and ensure_find_parent helpers
2023-09-12 14:19:10 +02:00
Pierre Chapuis
dc2c3e0163
implement CrossAttentionAdapter using chain operations
2023-09-12 11:58:24 +02:00
Pierre Chapuis
3c056e2231
expose lookup_top_adapter
2023-09-12 11:58:24 +02:00
Benjamin Trom
b515c02867
add new basic layers and Matmul chain
2023-09-12 10:55:34 +02:00
Doryan Kaced
2f2510a9b1
Use bias correction on Prodigy
2023-09-12 10:44:05 +02:00
Pierre Chapuis
be54cfc016
fix weight loading for float16 LoRAs
2023-09-11 16:14:19 +02:00
Cédric Deltheil
e5425e2968
make IP-Adapter generic for SD1 and SDXL
2023-09-08 16:38:01 +02:00
Cédric Deltheil
61858d9371
add CLIPImageEncoderG
2023-09-08 12:00:21 +02:00
Cédric Deltheil
c6fadd1c81
deprecate bidirectional_mapping util
2023-09-07 18:43:20 +02:00
limiteinductive
2786117469
implement SDXL + e2e test on random init
2023-09-07 18:34:42 +02:00
limiteinductive
02af8e9f0b
improve typing of ldm and sd1, introducing SD1Autoencoder class
2023-09-07 18:34:42 +02:00
Benjamin Trom
cf43cb191f
Add better tree representation for fluxion Module
2023-09-07 16:33:24 +02:00
Cédric Deltheil
c55917e293
add IP-Adapter support for SD 1.5
...
Official repo: https://github.com/tencent-ailab/IP-Adapter
2023-09-06 15:12:48 +02:00
Pierre Chapuis
864937a776
support injecting several LoRAs simultaneously
2023-09-06 11:49:55 +02:00
limiteinductive
88efa117bf
fix model comparison with custom layers
2023-09-05 12:34:38 +02:00
Pierre Chapuis
566656a539
fix text encoder LoRAs
2023-09-04 15:51:39 +02:00
limiteinductive
ebfa51f662
Make breakpoint a ContextModule
2023-09-04 12:22:10 +02:00
limiteinductive
9d2fbf6dbd
Fix tuple annotation for pyright 1.1.325
2023-09-04 10:41:06 +02:00
Doryan Kaced
44e184d4d5
Init dtype and device correctly for OutputBlock
2023-09-01 19:44:06 +02:00
Cédric Deltheil
3a10baa9f8
cross-attn 2d: record use_bias attribute
2023-09-01 19:23:33 +02:00
Cédric Deltheil
b933fabf31
unet: get rid of clip_embedding attribute for SD1
...
It is implicitly defined by the underlying cross-attention layer. This
also makes it consistent with SDXL.
2023-09-01 19:23:33 +02:00
Cédric Deltheil
134ee7b754
sdxl: remove wrong structural_attrs in cross-attn
2023-09-01 19:23:33 +02:00
Pierre Chapuis
e91e31ebd2
check no two controlnets have the same name
2023-09-01 17:47:29 +02:00
Pierre Chapuis
bd59790e08
always respect _can_refresh_parent
2023-09-01 17:44:16 +02:00
Pierre Chapuis
d389d11a06
make basic adapters a part of Fluxion
2023-09-01 17:29:48 +02:00
Pierre Chapuis
31785f2059
scope range adapter in latent diffusion
2023-09-01 17:29:48 +02:00
Pierre Chapuis
73813310d0
rename SelfAttentionInjection to ReferenceOnlyControl and vice-versa
2023-09-01 17:29:48 +02:00
Pierre Chapuis
eba0c33001
allow lora_targets to take a list of targets as input
2023-09-01 11:52:39 +02:00
Cédric Deltheil
92cdf19eae
add Distribute to fluxion layers's __init__.py
2023-09-01 11:20:48 +02:00