Commit graph

644 commits

Author SHA1 Message Date
Pierre Chapuis 98de9d13d3 fix typing problems 2024-06-24 11:12:29 +02:00
Pierre Chapuis d1fc845bc2 generate doc for LatentDiffusionModel 2024-05-29 16:51:58 +02:00
Pierre Chapuis 1886e6456c add a docstring for set_inference_steps
This explains the relation between first_step and strength,
as shown by @holwech here: https://github.com/finegrain-ai/refiners/discussions/374
2024-05-29 16:20:50 +02:00
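The `first_step`/`strength` relation that this docstring documents typically follows the usual image-to-image convention: `strength` controls what fraction of the denoising schedule is actually run. A hypothetical sketch of that convention (not refiners' actual code; the helper name and exact rounding here are assumptions):

```python
def first_step_from_strength(num_inference_steps: int, strength: float) -> int:
    """Hypothetical helper: with strength in [0, 1], an image-to-image run
    skips the first (1 - strength) fraction of the schedule.

    strength=1.0 starts from pure noise (first_step=0); lower strength
    starts later in the schedule, preserving more of the input image.
    """
    assert 0.0 <= strength <= 1.0
    return int(num_inference_steps * (1.0 - strength))
```

See the linked discussion for how refiners itself relates the two parameters.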
Cédric Deltheil ddcd84c740 training_utils: export neptune config and mixin
Like W&B. With this, #371 would have broken `test_import_training_utils`.

Follow up of #371 and #372.
2024-05-29 10:29:34 +02:00
Laurent 7dba1c8034 (training_utils) neptune callback pydantic fix 2024-05-28 22:42:28 +02:00
Laurent 3ad7f592db (training_utils) add NeptuneCallback 2024-05-28 16:35:11 +02:00
limiteinductive 3a7f14e4dc Fix clock log order; fix that the first iteration was skipped 2024-05-21 17:57:14 +02:00
Benjamin Trom cc7b62f090 apply suggestions from code review 2024-05-09 10:53:58 +02:00
limiteinductive 76a6ce8641 update Training 101 2024-05-09 10:53:58 +02:00
limiteinductive 0bec9a855d annotated validators for TimeValue 2024-05-09 10:53:58 +02:00
limiteinductive 22f4f4faf1 DataLoader validation 2024-05-09 10:53:58 +02:00
limiteinductive 38bddc49bd implement data_iterable 2024-05-09 10:53:58 +02:00
Benjamin Trom 05a63ef44e apply suggestions from code review 2024-05-09 10:53:58 +02:00
limiteinductive d6c225a112 implement data_iterable (bis) 2024-05-09 10:53:58 +02:00
limiteinductive de8334b6fc remove dataset length 2024-05-09 10:53:58 +02:00
limiteinductive b497b27cd3 remove dataset length (bis) 2024-05-09 10:53:58 +02:00
Benjamin Trom 1db0845db2 update test_trainer.py 2024-05-09 10:53:58 +02:00
limiteinductive 603c8abb1e fix clock 2024-05-09 10:53:58 +02:00
limiteinductive 44760ac19f deprecate evaluation 2024-05-09 10:53:58 +02:00
limiteinductive 061d44888f batch to step 2024-05-09 10:53:58 +02:00
limiteinductive b7bb8bba80 remove EventConfig
This is a partial rollback of commit 5dde281
2024-05-09 10:53:58 +02:00
Laurent 7aff743019 initialize StableDiffusion_1_Inpainting with a 9 channel SD1Unet if not provided 2024-04-23 16:58:22 +02:00
limiteinductive f32ccc3474 Remove seed_everything logging because it is too verbose 2024-04-22 18:14:33 +02:00
limiteinductive 5dde281ada Implement EventConfig 2024-04-22 18:14:33 +02:00
limiteinductive 07985694ed fix training_101 import 2024-04-18 20:58:47 +02:00
limiteinductive 446796da57 Refactor TimeValue 2024-04-18 20:58:47 +02:00
Laurent 17246708b9 Add sample_noise staticmethod and modify add_noise to support batched steps 2024-04-18 12:55:49 +02:00
limiteinductive 7427c171f6 fix training_utils requirements check 2024-04-17 18:10:28 +02:00
Pierre Colle bf7852b88e SAM: image_to_scaled_tensor gray images 2024-04-16 18:45:17 +02:00
limiteinductive f48712ee29 lint 2024-04-16 16:24:33 +02:00
limiteinductive 347fdbc794 add init file for segment_anything tests 2024-04-16 16:24:33 +02:00
Laurent eb4bb34f8b (training_utils) add new ForceCommit callback 2024-04-16 14:43:10 +02:00
limiteinductive be7d065a33 Add DataloaderConfig to Trainer 2024-04-15 20:56:19 +02:00
limiteinductive b9b999ccfe turn scoped_seed into a context manager 2024-04-13 15:03:35 +02:00
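The shape of a "scoped seed" context manager, as the commit above describes, can be sketched like this: seed on entry, restore the previous RNG state on exit, so seeding does not leak past the `with` block. This is a minimal stdlib-only sketch, not refiners' implementation (which presumably also handles torch and numpy generators):

```python
import contextlib
import random

@contextlib.contextmanager
def scoped_seed(seed: int):
    """Seed the RNG for the duration of the block, then restore the
    RNG state that was active before entering."""
    state = random.getstate()
    random.seed(seed)
    try:
        yield
    finally:
        random.setstate(state)
```

Usage: two `with scoped_seed(42): ...` blocks draw identical values, and code after the block continues from the pre-existing RNG state.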
Pierre Colle 64692c3b5b TrainerClock: assert dataset_length >= batch_size 2024-04-12 15:05:52 +02:00
Pierre Colle 0ac290f67d SAM: expose sizing helpers 2024-04-12 08:56:23 +02:00
Laurent 06ff2f0a5f add support for dinov2 giant flavors 2024-04-11 14:48:33 +02:00
Laurent 04e59bf3d9 fix GLU Activation docstrings 2024-04-11 14:48:33 +02:00
limiteinductive f26b6ee00a add static typing to __call__ method for latent_diffusion models ; fix multi_diffusion bug that wasn't taking guidance_scale into account 2024-04-11 12:13:30 +02:00
Cédric Deltheil a2ee705783 hq sam: add constructor args to docstring
Additionally, mark `register_adapter_module` for internal use.
2024-04-08 11:46:37 +02:00
Pierre Colle d05ebb8dd3 SAM/HQSAMAdapter: docstring examples 2024-04-08 07:12:57 +02:00
Pierre Chapuis e033306f60 use factories in context example
Using the same instance multiple times is a bad idea
because PyTorch memorizes things internally. Among
other things this breaks Chain's `__repr__`.
2024-04-05 18:06:54 +02:00
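The pitfall this commit fixes in the docs can be illustrated without PyTorch: putting the same instance in a chain twice means shared state, whereas a factory yields independent instances. A plain-Python sketch (the `Layer` class is a hypothetical stand-in for a module):

```python
class Layer:
    """Stand-in for a PyTorch module with mutable state."""
    def __init__(self) -> None:
        self.weight = 0.0  # stand-in for a learnable parameter

shared = Layer()
chain_shared = [shared, shared]            # same object in both slots
chain_fresh = [Layer() for _ in range(2)]  # factory: a new instance per slot

chain_shared[0].weight = 1.0  # silently mutates chain_shared[1] too
```

With real modules the shared instance also confuses PyTorch's internal bookkeeping, which is what breaks `Chain`'s `__repr__` as described above.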
hugojarkoff bbb46e3fc7 Fix clock step inconsistencies on batch end 2024-04-05 15:52:43 +02:00
Pierre Chapuis 09af570b23 add DINOv2-FD metric 2024-04-03 16:45:00 +02:00
Cédric Deltheil c529006d13 get rid of invisible-watermark test dependency
Was needed originally for diffusers' StableDiffusionXLPipeline. It has
been relaxed in the meanwhile (see `add_watermarker` for details).
2024-04-03 14:48:56 +02:00
Laurent 2ecf7e4b8c skip dinov2 float16 test on cpu + test dinov2 when batch_size>1 2024-04-02 18:57:25 +02:00
Laurent 5f07fa9c21 fix dinov2 interpolation, support batching 2024-04-02 18:57:25 +02:00
Laurent ef427538a6 revert "arange" typo ignore 2024-04-02 18:18:22 +02:00
Pierre Chapuis 6c40f56c3f make numpy dependency explicit 2024-04-02 18:15:52 +02:00
Pierre Chapuis fd5a15c7e0 update pyright and fix Pillow 10.3 typing issues 2024-04-02 18:15:52 +02:00