Commit graph

349 commits

Author SHA1 Message Date
Laurent 6ddd5435b8 fix broken dtypes in tiled autoencoders 2024-07-11 15:23:02 +02:00
Laurent f3b5c8d3e1 create SD1.5 MultiUpscaler pipeline 2024-07-11 15:23:02 +02:00
    Co-authored-by: limiteinductive <benjamin@lagon.tech>
    Co-authored-by: Cédric Deltheil <355031+deltheil@users.noreply.github.com>
Laurent 66cd0d57a1 improve MultiDiffusion pipelines 2024-07-11 15:23:02 +02:00
    Co-authored-by: limiteinductive <benjamin@lagon.tech>
    Co-authored-by: Cédric Deltheil <355031+deltheil@users.noreply.github.com>
Pierre Chapuis 9e8c2a3753 add FrankenSolver 2024-07-10 19:31:34 +02:00
    This solver is designed to use Diffusers Schedulers as Refiners Solvers.
Cédric Deltheil 245f51393e auto_encoder: swap x/y names when generating tiles 2024-06-26 14:17:28 +02:00
    Cosmetic change
limiteinductive b42881e54e Implement Tiled Autoencoder inference to save VRAM 2024-06-26 11:59:18 +02:00
limiteinductive 15ccdb38f3 Add scale_decay parameter for SD1 ControlNet 2024-06-24 13:21:27 +02:00
Pierre Chapuis 98de9d13d3 fix typing problems 2024-06-24 11:12:29 +02:00
Pierre Chapuis 1886e6456c add a docstring for set_inference_steps 2024-05-29 16:20:50 +02:00
    This explains the relation between first_step and strength,
    as shown by @holwech here: https://github.com/finegrain-ai/refiners/discussions/374
Cédric Deltheil ddcd84c740 training_utils: export neptune config and mixin 2024-05-29 10:29:34 +02:00
    Like W&B. With this, #371 would have broken `test_import_training_utils`.

    Follow-up to #371 and #372.
Laurent 7dba1c8034 (training_utils) neptune callback pydantic fix 2024-05-28 22:42:28 +02:00
Laurent 3ad7f592db (training_utils) add NeptuneCallback 2024-05-28 16:35:11 +02:00
limiteinductive 3a7f14e4dc Fix clock log order; fix the first iteration being skipped 2024-05-21 17:57:14 +02:00
limiteinductive 0bec9a855d annotated validators for TimeValue 2024-05-09 10:53:58 +02:00
limiteinductive 22f4f4faf1 DataLoader validation 2024-05-09 10:53:58 +02:00
limiteinductive 38bddc49bd implement data_iterable 2024-05-09 10:53:58 +02:00
Benjamin Trom 05a63ef44e apply suggestions from code review 2024-05-09 10:53:58 +02:00
limiteinductive d6c225a112 implement data_iterable (bis) 2024-05-09 10:53:58 +02:00
limiteinductive de8334b6fc remove dataset length 2024-05-09 10:53:58 +02:00
limiteinductive b497b27cd3 remove dataset length (bis) 2024-05-09 10:53:58 +02:00
limiteinductive 603c8abb1e fix clock 2024-05-09 10:53:58 +02:00
limiteinductive 44760ac19f deprecate evaluation 2024-05-09 10:53:58 +02:00
limiteinductive 061d44888f batch to step 2024-05-09 10:53:58 +02:00
limiteinductive b7bb8bba80 remove EventConfig 2024-05-09 10:53:58 +02:00
    This is a partial rollback of commit 5dde281.
Laurent 7aff743019 initialize StableDiffusion_1_Inpainting with a 9 channel SD1Unet if not provided 2024-04-23 16:58:22 +02:00
limiteinductive f32ccc3474 Remove seed_everything logging because it is too verbose 2024-04-22 18:14:33 +02:00
limiteinductive 5dde281ada Implement EventConfig 2024-04-22 18:14:33 +02:00
limiteinductive 446796da57 Refactor TimeValue 2024-04-18 20:58:47 +02:00
Laurent 17246708b9 Add sample_noise staticmethod and modify add_noise to support batched steps 2024-04-18 12:55:49 +02:00
limiteinductive 7427c171f6 fix training_utils requirements check 2024-04-17 18:10:28 +02:00
Pierre Colle bf7852b88e SAM: image_to_scaled_tensor gray images 2024-04-16 18:45:17 +02:00
Laurent eb4bb34f8b (training_utils) add new ForceCommit callback 2024-04-16 14:43:10 +02:00
limiteinductive be7d065a33 Add DataloaderConfig to Trainer 2024-04-15 20:56:19 +02:00
limiteinductive b9b999ccfe turn scoped_seed into a context manager 2024-04-13 15:03:35 +02:00
Pierre Colle 64692c3b5b TrainerClock: assert dataset_length >= batch_size 2024-04-12 15:05:52 +02:00
Pierre Colle 0ac290f67d SAM: expose sizing helpers 2024-04-12 08:56:23 +02:00
Laurent 06ff2f0a5f add support for dinov2 giant flavors 2024-04-11 14:48:33 +02:00
Laurent 04e59bf3d9 fix GLU Activation docstrings 2024-04-11 14:48:33 +02:00
limiteinductive f26b6ee00a add static typing to __call__ method for latent_diffusion models; fix multi_diffusion bug where guidance_scale was not taken into account 2024-04-11 12:13:30 +02:00
Cédric Deltheil a2ee705783 hq sam: add constructor args to docstring 2024-04-08 11:46:37 +02:00
    Additionally, mark `register_adapter_module` for internal use.
Pierre Colle d05ebb8dd3 SAM/HQSAMAdapter: docstring examples 2024-04-08 07:12:57 +02:00
hugojarkoff bbb46e3fc7 Fix clock step inconsistencies on batch end 2024-04-05 15:52:43 +02:00
Pierre Chapuis 09af570b23 add DINOv2-FD metric 2024-04-03 16:45:00 +02:00
Laurent 5f07fa9c21 fix dinov2 interpolation, support batching 2024-04-02 18:57:25 +02:00
Pierre Chapuis fd5a15c7e0 update pyright and fix Pillow 10.3 typing issues 2024-04-02 18:15:52 +02:00
Laurent 4f94dfb494 implement dinov2 positional embedding interpolation 2024-04-02 10:02:43 +02:00
Laurent 0336bc78b5 simplify interpolate function and layer 2024-04-02 10:02:43 +02:00
Pierre Colle 6c37e3f933 hq-sam: weights/load_weights 2024-03-29 11:25:43 +01:00
Pierre Chapuis 404a15aad2 tweak auto_attach_loras so debugging is easier when it fails 2024-03-26 16:12:48 +01:00
Laurent a0715806d2 modify ip_adapter's ImageCrossAttention scale getter and setter 2024-03-26 11:15:04 +01:00
    This new version makes it robust in case multiple Multiply-s are inside the Chain (e.g. if the Linear layers are LoRA-ified)