Pierre Chapuis
ae5ef4f14c
make sure adapters can access the context of their parent chain
https://github.com/finegrain-ai/refiners/issues/456
2024-10-03 10:51:11 +02:00
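The commit above is about context resolution through the parent chain. Below is a minimal sketch of the pattern it enables, assuming Refiners' public Adapter API (setup_adapter / inject / eject) and context API (set_context / use_context on fl.ContextModule); the ScaleFromContext module, ScalingAdapter class, and "adapter" context name are illustrative, not taken from the commit.

    import torch
    import refiners.fluxion.layers as fl
    from refiners.fluxion.adapters import Adapter

    class ScaleFromContext(fl.ContextModule):
        # At call time, look up a scale stored in an ancestor chain's context.
        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return x * self.use_context("adapter")["scale"]

    class ScalingAdapter(fl.Chain, Adapter[fl.Linear]):
        def __init__(self, target: fl.Linear) -> None:
            with self.setup_adapter(target):
                super().__init__(target, ScaleFromContext())

    root = fl.Chain(fl.Linear(8, 8))
    root.set_context("adapter", {"scale": 0.5})
    adapter = ScalingAdapter(root.ensure_find(fl.Linear)).inject(root)
    # ScaleFromContext can now resolve the "adapter" context via its parents.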
Laurent
444882a734
move some tests into the adapters test folder
2024-09-09 17:44:26 +02:00
ily-R
277b0fd837
ELLA adapter implementation, tested with the SD 1.5 model
2024-09-04 11:38:22 +02:00
Pierre Chapuis
2ba83f575e
switch pytest import mode to importlib
see:
https://docs.pytest.org/en/7.1.x/explanation/goodpractices.html#choosing-an-import-mode
https://docs.pytest.org/en/7.1.x/explanation/pythonpath.html#import-modes
This change fixes the SAM tests.
2024-06-24 17:19:05 +02:00
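For reference, the import mode is a documented pytest setting. One common way to pin it project-wide (whether this repo sets it in pyproject.toml is an assumption) is:

    [tool.pytest.ini_options]
    addopts = ["--import-mode=importlib"]

With importlib, pytest imports test files without requiring unique, sys.path-importable module names; the linked docs explain the trade-offs between the modes.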
Laurent
7e64ba4011
modify ip_adapter's CrossAttentionAdapters injection logic
2024-03-26 11:15:04 +01:00
Pierre Chapuis
be2368cf20
ruff 3 formatting (Rye 0.28)
2024-03-08 10:42:05 +01:00
Cédric Deltheil
176807740b
control_lora: fix adapter set scale
The adapter scale setter did not propagate the scale to the underlying
zero convolutions; the value set at construction time was used instead.
Follow-up of #285
2024-02-22 10:01:05 +01:00
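The bug pattern described above is easy to show in isolation. A hypothetical sketch (invented names, not the actual ControlLora code) of a scale setter that propagates to its child layers instead of only storing the value:

    class ZeroConv:
        # Stand-in for a zero-convolution layer exposing a scale attribute.
        def __init__(self, scale: float = 1.0) -> None:
            self.scale = scale

    class ControlLoraLike:
        def __init__(self, zero_convs: list[ZeroConv], scale: float = 1.0) -> None:
            self.zero_convs = zero_convs
            self._scale = scale

        @property
        def scale(self) -> float:
            return self._scale

        @scale.setter
        def scale(self, value: float) -> None:
            self._scale = value
            for conv in self.zero_convs:  # the propagation the fix adds
                conv.scale = value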
Laurent
60c0780fe7
write StyleAligned inject/eject tests
2024-02-15 15:22:47 +01:00
Pierre Chapuis
37425fb609
make LoRA generic
2024-02-06 11:32:18 +01:00
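The idea of the change above, sketched with plain typing (hypothetical class, not the actual Refiners Lora): parameterize the LoRA over the type of layer it wraps, so the Linear and Conv2d variants share one implementation.

    from typing import Generic, TypeVar

    from torch import nn

    T = TypeVar("T", bound=nn.Module)

    class GenericLora(Generic[T]):  # hypothetical name
        # Wraps a target layer of type T; the low-rank weights are omitted.
        def __init__(self, target: T, rank: int = 16) -> None:
            self.target = target
            self.rank = rank

    linear_lora: GenericLora[nn.Linear] = GenericLora(nn.Linear(8, 8))
    conv_lora: GenericLora[nn.Conv2d] = GenericLora(nn.Conv2d(3, 3, 1))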
Pierre Chapuis
471ef91d1c
make __getattr__ on Module return object, not Any
PyTorch chose to make it Any because they expect its users' code
to be "highly dynamic": https://github.com/pytorch/pytorch/pull/104321
That is not the case for us: in Refiners, untyped code goes
against one of our core principles.
Note that there is currently an open PR in PyTorch to
return `Module | Tensor`, but in practice this is not always
correct either: https://github.com/pytorch/pytorch/pull/115074
I also moved Residuals-related code from SD1 to latent_diffusion
because SDXL should not depend on SD1.
2024-02-06 11:32:18 +01:00
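The type-checking consequence is easy to demonstrate outside Refiners. A minimal illustration (not the Refiners code) of annotating __getattr__ as returning object rather than Any:

    from typing import Any

    class DynamicAny:
        def __getattr__(self, name: str) -> Any:
            return name  # placeholder body so the example runs

    class DynamicObject:
        def __getattr__(self, name: str) -> object:
            return name  # placeholder body so the example runs

    DynamicAny().whatever.lower()    # type-checks silently, correct or not
    attr = DynamicObject().whatever  # inferred as `object`
    # attr.lower()                   # rejected by the type checker
    text = str(attr)                 # callers must narrow or convert explicitly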
Pierre Chapuis
f4ed7254fa
test IP adapter scale setter
2024-02-01 16:17:07 +01:00
Pierre Chapuis
8341d3a74b
use float16 to save memory
2024-02-01 16:17:07 +01:00
Pierre Chapuis
d185711bc5
add tests based on repr for inject / eject
2024-02-01 16:17:07 +01:00
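A sketch of the repr-based round trip these tests rely on (illustrative, not the actual test code; inject/eject follow Refiners' Adapter API):

    def assert_inject_eject_roundtrip(chain, make_adapter) -> None:
        # Injecting then ejecting must leave repr(chain) exactly unchanged.
        before = repr(chain)
        adapter = make_adapter(chain)
        adapter.inject(chain)
        assert repr(chain) != before  # the adapter now appears in the structure
        adapter.eject()
        assert repr(chain) == before  # ejection restores the original structure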
Pierre Chapuis
0e77ef1720
add inject / eject test for concept extender (+ better errors)
2024-02-01 16:17:07 +01:00
Pierre Chapuis
93270ec2d7
add inject / eject test for t2i adapter
2024-02-01 16:17:07 +01:00
limiteinductive
421da6a3b6
Load Multiple LoRAs with SDLoraManager
2024-01-23 14:12:03 +01:00
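Hedged usage sketch for the feature above; SDLoraManager lives in refiners.foundationals.latent_diffusion.lora, but the exact signatures and the file names below are assumptions:

    from refiners.fluxion.utils import load_from_safetensors
    from refiners.foundationals.latent_diffusion import StableDiffusion_1
    from refiners.foundationals.latent_diffusion.lora import SDLoraManager

    manager = SDLoraManager(StableDiffusion_1())
    # Each LoRA is registered under its own name and scale (paths made up).
    manager.add_loras("pixel-art", load_from_safetensors("pixel-art.safetensors"), scale=1.0)
    manager.add_loras("detail", load_from_safetensors("add-detail.safetensors"), scale=0.6)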
limiteinductive
a1f50f3f9d
refactor Lora, LoraAdapter and the latent_diffusion/lora file
2024-01-18 16:27:38 +01:00
Cédric Deltheil
792a0fc3d9
run lint rules using latest isort settings
2023-12-11 11:58:43 +01:00
Pierre Chapuis
547a73e67a
clarify the "adapting when a LoRA is injected" issue in tests
2023-09-06 11:49:55 +02:00
Pierre Chapuis
864937a776
support injecting several LoRAs simultaneously
2023-09-06 11:49:55 +02:00
Pierre Chapuis
d389d11a06
make basic adapters a part of Fluxion
2023-09-01 17:29:48 +02:00
Pierre Chapuis
31785f2059
scope range adapter in latent diffusion
2023-09-01 17:29:48 +02:00
Pierre Chapuis
0f476ea18b
make high-level adapters Adapters
This generalizes the Adapter abstraction to higher-level
constructs such as high-level LoRA (targeting e.g. the
SD UNet), ControlNet and Reference-Only Control.
Some adapters now work by adapting child models with
"sub-adapters" that they inject / eject when needed.
2023-08-31 10:57:18 +02:00
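The "sub-adapter" mechanism described above can be sketched abstractly (hypothetical names, not the Refiners implementation):

    class HighLevelAdapter:
        # Owns (sub_adapter, parent) pairs and toggles them as one unit.
        def __init__(self, sub_adapters):
            self.sub_adapters = sub_adapters

        def inject(self) -> "HighLevelAdapter":
            for sub, parent in self.sub_adapters:
                sub.inject(parent)  # patch one child model each
            return self

        def eject(self) -> None:
            for sub, _ in self.sub_adapters:
                sub.eject()  # restore every child model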
Pierre Chapuis
1065dfe10b
add empty __init__.py files to make pytest happy
(otherwise it wants unique file basenames)
2023-08-23 17:49:59 +02:00
Cédric Deltheil
48f674c433
initial commit
2023-08-04 15:28:41 +02:00