Pierre Colle
0ac290f67d
SAM: expose sizing helpers
2024-04-12 08:56:23 +02:00
Laurent
06ff2f0a5f
add support for dinov2 giant flavors
2024-04-11 14:48:33 +02:00
limiteinductive
f26b6ee00a
add static typing to the __call__ method of latent_diffusion models; fix a multi_diffusion bug that ignored guidance_scale
2024-04-11 12:13:30 +02:00
Pierre Chapuis
09af570b23
add DINOv2-FD metric
2024-04-03 16:45:00 +02:00
Laurent
2ecf7e4b8c
skip dinov2 float16 test on cpu + test dinov2 when batch_size>1
2024-04-02 18:57:25 +02:00
Pierre Chapuis
fd5a15c7e0
update pyright and fix Pillow 10.3 typing issues
2024-04-02 18:15:52 +02:00
Laurent
1a8ea9180f
refactor dinov2 tests, check against official implementation
2024-04-02 10:02:43 +02:00
Pierre Colle
6c37e3f933
hq-sam: weights/load_weights
2024-03-29 11:25:43 +01:00
Laurent
7e64ba4011
modify ip_adapter's CrossAttentionAdapters injection logic
2024-03-26 11:15:04 +01:00
Pierre Colle
cba83b0558
SAM init with mask_decoder after #325
2024-03-24 20:18:57 +01:00
Pierre Colle
5c937b184a
HQ-SAM logit equal test, following #331
2024-03-23 21:58:32 +01:00
Pierre Colle
2763db960e
SAM e2e test tolerance explained
2024-03-22 21:31:28 +01:00
Pierre Chapuis
364e196874
support no CFG in compute_clip_text_embedding
2024-03-22 17:06:51 +01:00
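As a hedged aside on the math behind the commit above: classifier-free guidance extrapolates from an unconditional prediction toward a conditional one, and with a guidance scale of 1 the unconditional term cancels out, which is why "no CFG" can skip computing the negative-prompt embedding entirely. A minimal sketch in plain Python (the function name `apply_cfg` and the list-based "tensors" are illustrative, not the Refiners API):

```python
def apply_cfg(uncond: list[float], cond: list[float], guidance_scale: float) -> list[float]:
    """Classifier-free guidance: extrapolate from the unconditional
    prediction toward the conditional one by guidance_scale."""
    return [u + guidance_scale * (c - u) for u, c in zip(uncond, cond)]

# guidance_scale > 1 amplifies the conditional direction...
print(apply_cfg([0.0, 2.0], [1.0, 4.0], 7.5))  # [7.5, 17.0]
# ...while guidance_scale == 1 returns the conditional prediction
# unchanged, so the unconditional pass is unnecessary.
print(apply_cfg([0.0, 2.0], [1.0, 4.0], 1.0))  # [1.0, 4.0]
```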
hugojarkoff
a93ceff752
Add HQ-SAM Adapter
2024-03-21 15:36:55 +01:00
hugojarkoff
c6b5eb24a1
Add logits comparison for base SAM in single mask output prediction mode
2024-03-21 10:48:48 +01:00
limiteinductive
38c86f59f4
Switch gradient clipping to the native torch.nn.utils.clip_grad_norm_
2024-03-19 22:08:48 +01:00
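For context on the commit above: torch.nn.utils.clip_grad_norm_ computes one global L2 norm over all gradients and rescales them in place when that norm exceeds max_norm. A simplified plain-Python sketch of the same idea, using nested lists in place of tensors and omitting details of the torch implementation such as the small epsilon in the denominator:

```python
import math

def clip_grad_norm(grads: list[list[float]], max_norm: float) -> float:
    """Scale gradients in place so their global L2 norm is at most max_norm.

    Mirrors the core behavior of torch.nn.utils.clip_grad_norm_: the norm is
    computed over all gradients together, and every gradient is scaled by the
    same factor when it exceeds max_norm. Returns the norm before clipping.
    """
    total_norm = math.sqrt(sum(g * g for grad in grads for g in grad))
    if total_norm > max_norm:
        scale = max_norm / total_norm
        for grad in grads:
            for i, g in enumerate(grad):
                grad[i] = g * scale
    return total_norm

grads = [[3.0, 4.0]]            # global L2 norm = 5.0
norm = clip_grad_norm(grads, max_norm=1.0)
print(norm)                     # 5.0 (norm before clipping)
print(grads)                    # [[0.6, 0.8]] (rescaled to unit norm)
```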
Pierre Colle
68fe725767
Add multimask_output flag to SAM
2024-03-19 17:40:26 +01:00
limiteinductive
6a72943ff7
change TimeValue to a dataclass
2024-03-19 14:49:24 +01:00
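The TimeValue refactor above converts a plain value into a dataclass. A minimal sketch of what such a dataclass might look like; the field names `number` and `unit` are assumptions for illustration, not the actual Refiners definition:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TimeValue:
    """A training duration expressed as a count of units, e.g. 10 epochs.

    Hypothetical sketch: field names and types are illustrative only.
    """
    number: int
    unit: str  # e.g. "step", "epoch", "iteration"

interval = TimeValue(number=10, unit="epoch")
print(interval)  # TimeValue(number=10, unit='epoch')
# Dataclasses get value-based equality and a readable repr for free,
# which is a common motivation for this kind of refactor.
assert interval == TimeValue(10, "epoch")
```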
Pierre Chapuis
5d784bedab
add test for "Adapting SDXL" guide
2024-03-08 15:43:57 +01:00
Pierre Chapuis
72fa17df48
fix slider loras test
2024-03-08 15:43:57 +01:00
Pierre Chapuis
8c7fcbc00f
LoRA manager: move exclude / include to add_loras call
...
Always exclude the TimestepEncoder by default.
This is because some keys include both e.g. `resnet` and `time_emb_proj`.
Preprocess blocks that tend to mix up with others in a separate auto_attach call.
2024-03-08 15:43:57 +01:00
Pierre Chapuis
052a20b897
remove add_multiple_loras
2024-03-08 15:43:57 +01:00
Pierre Chapuis
c383ff6cf4
fix DPO LoRA loading in tests
2024-03-08 15:43:57 +01:00
Pierre Chapuis
1eb71077aa
use same scale setter / getter interface for all controls
2024-03-08 11:29:28 +01:00
Pierre Chapuis
be2368cf20
ruff 3 formatting (Rye 0.28)
2024-03-08 10:42:05 +01:00
Pierre Chapuis
a0be5458b9
snip long prompt in tests
2024-03-05 19:54:44 +01:00
Pierre Chapuis
d5d199edc5
add tests for SDXL Lightning
2024-02-26 12:14:02 +01:00
Pierre Chapuis
7e4e0f0650
correctly scale init latents for Euler scheduler
2024-02-26 12:14:02 +01:00
Pierre Chapuis
bf0ba58541
refactor solver params, add sample prediction type
2024-02-26 12:14:02 +01:00
Pierre Chapuis
ddc1cf8ca7
refactor solvers to support different timestep spacings
2024-02-26 12:14:02 +01:00
Pierre Chapuis
8f614e7647
check hash of downloaded LoRA weights, update DPO refs
...
(the DPO LoRA weights have changed: 2699b36e22)
2024-02-23 12:02:18 +01:00
Cédric Deltheil
176807740b
control_lora: fix adapter set scale
...
The adapter set scale did not propagate the scale to the underlying
zero convolutions. The value set at CTOR time was used instead.
Follow up of #285
2024-02-22 10:01:05 +01:00
Pierre Chapuis
684e2b9a47
add docstrings for LCM / LCM-LoRA
2024-02-21 16:37:27 +01:00
Pierre Chapuis
383c3c8a04
add tests for LCM and LCM-LoRA
...
(As of now LoRA with guidance > 1 and especially base do not pass with those tolerances.)
2024-02-21 16:37:27 +01:00
Pierre Chapuis
c8c6294550
add LCMSolver (Latent Consistency Models)
2024-02-21 16:37:27 +01:00
Cédric Deltheil
446967859d
test_style_aligned: switch to CLIP text batch API
...
Added in #263
2024-02-21 16:33:03 +01:00
Pierre Colle
d199cd4f24
batch sdxl + sd1 + compute_clip_text_embedding
...
Co-authored-by: Cédric Deltheil <355031+deltheil@users.noreply.github.com>
2024-02-21 15:17:11 +01:00
Cédric Deltheil
5ab5d7fd1c
import ControlLoraAdapter part of latent_diffusion
2024-02-19 14:11:32 +01:00
Laurent
da3c3602fb
write StyleAligned e2e test
2024-02-15 15:22:47 +01:00
Laurent
60c0780fe7
write StyleAligned inject/eject tests
2024-02-15 15:22:47 +01:00
limiteinductive
432e32f94f
rename Scheduler -> LRScheduler
2024-02-15 11:48:36 +01:00
Laurent
00270604ef
fix conversion_script bug + rename control_lora e2e test
2024-02-14 18:20:46 +01:00
Laurent
7fe392298a
write ControlLora e2e tests
2024-02-14 18:20:46 +01:00
limiteinductive
bec845553f
replace deprecated validator with field_validator
2024-02-13 18:35:51 +01:00
limiteinductive
ab506b4db2
fix bug that was causing double registration
2024-02-13 11:12:13 +01:00
limiteinductive
3488273f50
Enforce correct subtype for the config param in both decorators
...
Also add a custom ModelConfig for the MockTrainer test
Co-authored-by: Cédric Deltheil <355031+deltheil@users.noreply.github.com>
2024-02-12 16:21:04 +01:00
limiteinductive
cef8a9936c
refactor register_model decorator
2024-02-12 16:21:04 +01:00
limiteinductive
d6546c9026
add @register_model and @register_callback decorators
...
Refactor ClockTrainer to include Callback
2024-02-12 10:24:19 +01:00
limiteinductive
f541badcb3
Allow optional train ModelConfig + forbid extra input for configs
2024-02-10 16:13:10 +01:00
Pierre Colle
25bfa78907
lr, betas, eps, weight_decay at model level
...
Co-authored-by: Cédric Deltheil <355031+deltheil@users.noreply.github.com>
2024-02-09 12:05:13 +01:00