hugojarkoff
a93ceff752
Add HQ-SAM Adapter
2024-03-21 15:36:55 +01:00
Pierre Chapuis
5d784bedab
add test for "Adapting SDXL" guide
2024-03-08 15:43:57 +01:00
Pierre Chapuis
cce2a98fa6
add sanity check to auto_attach_loras
2024-03-08 15:43:57 +01:00
Pierre Chapuis
be2368cf20
ruff 3 formatting (Rye 0.28)
2024-03-08 10:42:05 +01:00
Pierre Chapuis
7d8e3fc1db
add SDXL-Lightning weights to conversion script + support safetensors
2024-02-26 12:14:02 +01:00
Pierre Chapuis
d14c5bd5f8
add option to override unet weights for conversion
2024-02-26 12:14:02 +01:00
Pierre Chapuis
8f614e7647
check hash of downloaded LoRA weights, update DPO refs
(the DPO LoRA weights have changed: 2699b36e22)
2024-02-23 12:02:18 +01:00
Laurent
28f9368c93
(weight conversion) fix typo in ControlLora export folder
2024-02-22 17:59:36 +01:00
Pierre Chapuis
03b79d6d34
rename ResidualBlock to ConditionScaleBlock in LCM
2024-02-21 16:37:27 +01:00
Pierre Chapuis
684e2b9a47
add docstrings for LCM / LCM-LoRA
2024-02-21 16:37:27 +01:00
Pierre Chapuis
b55e9332fe
add LCM and LCM-LoRA to tests weights conversion script
2024-02-21 16:37:27 +01:00
Pierre Chapuis
f8d55ccb20
add LcmAdapter
This adds support for the condition scale embedding.
Also updates the UNet converter to support LCM.
2024-02-21 16:37:27 +01:00
Pierre Chapuis
8139b2dd91
fix IP-Adapter weights conversion
2024-02-21 15:03:48 +01:00
Laurent
00270604ef
fix conversion_script bug + rename control_lora e2e test
2024-02-14 18:20:46 +01:00
Laurent
5fee723cd1
write ControlLora weight conversion script
2024-02-14 18:20:46 +01:00
limiteinductive
2e526d35d1
Make Dataset part of the trainer
2024-02-07 16:13:01 +01:00
limiteinductive
2ef4982e04
remove wandb from base config
2024-02-07 11:06:59 +01:00
Pierre Chapuis
471ef91d1c
make __getattr__ on Module return object, not Any
PyTorch chose to make it Any because they expect their users' code
to be "highly dynamic": https://github.com/pytorch/pytorch/pull/104321
That is not the case for us: in Refiners, untyped code goes against
one of our core principles.
Note that there is currently an open PR in PyTorch to
return `Module | Tensor`, but in practice this is not always
correct either: https://github.com/pytorch/pytorch/pull/115074
I also moved Residuals-related code from SD1 to latent_diffusion
because SDXL should not depend on SD1.
2024-02-06 11:32:18 +01:00
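The typing rationale above can be illustrated outside of Refiners with a minimal stand-in for a module with dynamic attribute lookup (the class and attribute names below are hypothetical, not Refiners' actual code):

```python
class DynamicModule:
    """Minimal stand-in for a Module with dynamic attribute lookup."""

    def __init__(self) -> None:
        self._buffers: dict[str, object] = {"scale": 2.0}

    # Returning `object` instead of `Any` means a type checker rejects
    # `m.scale * 3` until the caller narrows the type explicitly,
    # whereas `Any` would silently type-check any usage, even typos.
    def __getattr__(self, name: str) -> object:
        try:
            return self._buffers[name]
        except KeyError as e:
            raise AttributeError(name) from e


m = DynamicModule()
value = m.scale  # static type: object, not Any
assert isinstance(value, float)  # explicit narrowing required before use
print(value * 3)  # OK only after the isinstance check
```

At runtime nothing changes; the stricter return type only moves errors from runtime to type-checking time.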
limiteinductive
73f6ccfc98
make Scheduler a fl.Module + Change name Scheduler -> Solver
2024-01-31 17:03:52 +01:00
Cédric Deltheil
feff4c78ae
segment-anything: fix class name typo
Note: weights are impacted
2024-01-30 09:52:40 +01:00
limiteinductive
421da6a3b6
Load Multiple LoRAs with SDLoraManager
2024-01-23 14:12:03 +01:00
limiteinductive
ed3621362f
Add load_tensors utils in fluxion
2024-01-21 12:34:33 +01:00
Cédric Deltheil
2b4bc77534
prepare_test_weights: update segment-anything hash
Follow up of 00f494e
2024-01-20 14:05:45 +01:00
limiteinductive
a1f50f3f9d
refactor Lora LoraAdapter and the latent_diffusion/lora file
2024-01-18 16:27:38 +01:00
Cédric Deltheil
dd87b9706e
pick the right class for CLIP text converter
i.e. CLIPTextModel by default, or CLIPTextModelWithProjection for SDXL's
so-called text_encoder_2.
This silences false positive warnings like:
Some weights of CLIPTextModelWithProjection were not initialized
from the model checkpoint [...]
2024-01-18 11:17:41 +01:00
Pierre Chapuis
7839c54ae8
unet conversion: add option to skip init check
2024-01-16 19:10:59 +01:00
Pierre Chapuis
d2f38871fd
add a way to specify the subfolder of the unet
(no subfolder -> pass an empty string)
2024-01-16 19:10:59 +01:00
Pierre Chapuis
94a918a474
fix invalid default value for --half in help
2024-01-16 19:10:59 +01:00
limiteinductive
2b977bc69e
fix broken self-attention guidance with ip-adapter
The #168 and #177 refactorings caused this regression. A new end-to-end
test has been added for proper coverage.
(This fix will be revisited at some point)
2024-01-16 17:21:24 +01:00
Cédric Deltheil
eafbc8a99a
prepare_test_weights: refresh IP-Adapter hashes
2024-01-11 14:47:12 +01:00
limiteinductive
c9e973ba41
refactor CrossAttentionAdapter to work with context.
2024-01-08 15:20:23 +01:00
hugojarkoff
00f494efe2
SegmentAnything: add dense mask prompt support
2024-01-05 18:53:25 +01:00
limiteinductive
20c229903f
upgrade pyright to 1.1.342; improve no_grad typing
2023-12-29 15:09:02 +01:00
limiteinductive
6a1fac876b
remove huggingface datasets from default config
2023-12-20 16:58:12 +01:00
Cédric Deltheil
f0ea1a2509
prepare_test_weights: add DINOv2
2023-12-18 10:29:28 +01:00
Cédric Deltheil
832f012fe4
convert_dinov2: tweak command-line args
i.e. mimic the other conversion scripts
2023-12-18 10:29:28 +01:00
Bryce
5ca1549c96
refactor: convert bash script to python
Ran successfully to completion. But on a repeat run `convert_unclip` didn't pass the hash check for some reason.
- fix inpainting model download urls
- shows a progress bar for downloads
- skips downloading existing files
- uses a temporary file to prevent partial downloads
- can do a dry run to check if url is valid `DRY_RUN=1 python scripts/prepare_test_weights.py`
- displays the downloaded file hash
2023-12-15 09:55:59 +01:00
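The partial-download protection described above can be sketched with the standard library alone (the function name and structure are illustrative, not the actual script's code):

```python
import os
import shutil
import tempfile
import urllib.request


def download(url: str, dest: str) -> None:
    """Illustrative downloader: skips existing files and streams into a
    temporary file, so an interrupted run never leaves a partial `dest`."""
    if os.path.exists(dest):
        return  # skip downloading existing files
    if os.environ.get("DRY_RUN") == "1":
        # dry run: just check that the URL can be opened
        urllib.request.urlopen(url).close()
        return
    dest_dir = os.path.dirname(dest) or "."
    fd, tmp_path = tempfile.mkstemp(dir=dest_dir)
    try:
        with os.fdopen(fd, "wb") as tmp, urllib.request.urlopen(url) as resp:
            shutil.copyfileobj(resp, tmp)
        os.replace(tmp_path, dest)  # atomic rename: all-or-nothing
    except BaseException:
        os.unlink(tmp_path)
        raise
```

Creating the temporary file in the destination directory matters: `os.replace` is only atomic when source and destination are on the same filesystem.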
Cédric Deltheil
e978b3665d
convert_dinov2: ignore pyright errors
And save converted weights into safetensors instead of pickle
2023-12-14 17:50:41 +01:00
Laureηt
9337d65e0e
feature: add DINOv2
Co-authored-by: Benjamin Trom <benjamintrom@gmail.com>
2023-12-14 17:27:32 +01:00
Cédric Deltheil
792a0fc3d9
run lint rules using latest isort settings
2023-12-11 11:58:43 +01:00
limiteinductive
86c54977b9
replace poetry by rye for python dependency management
Co-authored-by: Cédric Deltheil <cedric@deltheil.me>
Co-authored-by: Pierre Chapuis <git@catwell.info>
2023-12-08 17:40:10 +01:00
limiteinductive
807ef5551c
refactor fl.Parameter basic layer
Co-authored-by: Cédric Deltheil <cedric@deltheil.me>
2023-12-08 10:20:34 +01:00
Benjamin Trom
ea44262a39
unnest Residual subchain by modifying its forward
And replaced the remaining Sum-Identity layers by Residual.
The tolerance used to compare SAM's ViT models has been tweaked: for
some reason there is a small difference (in float32) in the neck layer
(first Conv2d)
Co-authored-by: Cédric Deltheil <cedric@deltheil.me>
2023-10-19 10:34:51 +02:00
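The unnested forward can be sketched in plain Python (a conceptual sketch, not Refiners' actual fl.Residual implementation):

```python
class Residual:
    """Conceptual sketch: instead of nesting a Sum(Identity, subchain),
    the residual addition lives directly in the forward itself."""

    def __init__(self, *layers) -> None:
        self.layers = layers

    def __call__(self, x):
        y = x
        for layer in self.layers:  # run the wrapped sub-chain
            y = layer(y)
        return x + y  # add the original input: the residual connection


double = lambda v: 2 * v
residual = Residual(double)
print(residual(3))  # 3 + double(3) = 9
```

The observable behavior is identical to the Sum-Identity nesting; only the chain structure is flatter, which makes model inspection and adaptation simpler.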
Pierre Chapuis
976b55aea5
add test weights conversion script
2023-10-09 14:18:40 +02:00
Cédric Deltheil
7f7e129bb6
convert autoencoder: add an option for subfolder
2023-09-29 18:54:24 +02:00
Cédric Deltheil
5fc6767a4a
add IP-Adapter plus (aka fine-grained features)
2023-09-29 15:23:43 +02:00
Cédric Deltheil
2106c237d9
add T2I-Adapter conversion script
2023-09-25 13:54:26 +02:00
Benjamin Trom
282578ddc0
add Segment Anything (SAM) to foundational models
Note: support for dense prompts (i.e. masks) is still partial (see MaskEncoder)
Co-authored-by: Cédric Deltheil <cedric@deltheil.me>
2023-09-21 11:44:30 +02:00
Cédric Deltheil
eea340c6c4
add support for SDXL IP-Adapter
This only supports the latest SDXL IP-Adapter release (2023.9.8) which
builds upon the ViT-H/14 CLIP image encoder.
2023-09-12 18:00:39 +02:00
Pierre Chapuis
7a32699cc6
add ensure_find and ensure_find_parent helpers
2023-09-12 14:19:10 +02:00
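The contract of such helpers can be sketched generically (the signatures below are illustrative; Refiners' actual helpers search Chain layers):

```python
from typing import Callable, Iterable, Optional, TypeVar

T = TypeVar("T")


def find(items: Iterable[T], predicate: Callable[[T], bool]) -> Optional[T]:
    """Return the first matching item, or None if there is none."""
    return next((item for item in items if predicate(item)), None)


def ensure_find(items: Iterable[T], predicate: Callable[[T], bool]) -> T:
    """Like find, but raises instead of returning None, so the caller
    gets a non-Optional result without a manual None check."""
    result = find(items, predicate)
    if result is None:
        raise ValueError("no matching item found")
    return result


print(ensure_find([1, 2, 3], lambda n: n % 2 == 0))  # first even number: 2
```

The point of the `ensure_` variant is ergonomic: call sites that know the item must exist get a precise type and a loud failure, instead of propagating an Optional.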