Commit graph

199 commits

Cédric Deltheil 4fc5e427b8 training_utils: fix extra detection
Requirements could be, e.g.:

    wandb (>=0.15.7,<0.16.0) ; extra == "training"

Or:

    wandb>=0.16.0; extra == 'training'

Follow up of 86c5497
2023-12-08 19:09:16 +01:00
limiteinductive 86c54977b9 replace poetry by rye for python dependency management
Co-authored-by: Cédric Deltheil <cedric@deltheil.me>
Co-authored-by: Pierre Chapuis <git@catwell.info>
2023-12-08 17:40:10 +01:00
limiteinductive 807ef5551c refactor fl.Parameter basic layer
Co-authored-by: Cédric Deltheil <cedric@deltheil.me>
2023-12-08 10:20:34 +01:00
Cédric Deltheil 11422a3faf fix typo in finetune-ldm config 2023-12-07 18:19:45 +01:00
Cédric Deltheil 46b4b4b462 training_utils: fix naming issue timestep->step 2023-12-05 10:05:34 +01:00
limiteinductive 0dc3a17fbf remove unnecessary test 2023-12-04 15:27:06 +01:00
limiteinductive 8bbcf2048d add README bullet point 2023-12-04 15:27:06 +01:00
limiteinductive 37a74bd549 format test_scheduler file 2023-12-04 15:27:06 +01:00
limiteinductive 90db6ef59d add e2e test for sd15 with karras noise schedule 2023-12-04 15:27:06 +01:00
limiteinductive 6f110ee2b2 fix test_scheduler_utils 2023-12-04 15:27:06 +01:00
limiteinductive 1075ea4a62 fix ddpm and ddim __init__ 2023-12-04 15:27:06 +01:00
limiteinductive ad8f02e555 add karras sampling to the Scheduler abstract class, default is quadratic 2023-12-04 15:27:06 +01:00
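The Karras sampling referenced here follows the sigma schedule of Karras et al. (2022), which interpolates in sigma^(1/rho) space to concentrate steps near the low-noise end. A standalone sketch of that formula (function name and default values are illustrative, not the Scheduler class's actual API):

```python
import math

def karras_sigmas(
    n: int, sigma_min: float = 0.02, sigma_max: float = 80.0, rho: float = 7.0
) -> list[float]:
    # Karras et al. (2022), eq. (5): interpolate linearly in
    # sigma^(1/rho) space, then raise back to the rho-th power.
    inv_rho = 1.0 / rho
    lo, hi = sigma_min**inv_rho, sigma_max**inv_rho
    return [(hi + i / (n - 1) * (lo - hi)) ** rho for i in range(n)]
```

The result is a strictly decreasing noise schedule running from `sigma_max` down to `sigma_min`.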
Pierre Chapuis f22f969d65 remove Black preview mode
also fix multiline logs in training
2023-12-04 14:15:56 +01:00
Bryce 4176868e79 feature: add sliced-attention for memory efficiency
This allowed me to produce HD images on an M1 with 32 GB of RAM, and 7000x5000 images on an Nvidia 4090.

I saw no visual difference in images generated.

Some datapoints on slice_size:
# 4096 max needed for SD 1.5 512x512
# 9216 max needed for SD 1.5 768x768
# 16384 max needed for SD 1.5 1024x1024
# 32400 max needed for SD 1.5 1920x1080 (HD)
# 129600 max needed for SD 1.5 3840x2160 (4k)
# 234375 max needed for SD 1.5 5000x3000
2023-12-01 15:30:23 +01:00
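The slice_size datapoints above match the SD 1.5 latent spatial size exactly: the VAE downsamples by 8x, so the longest self-attention sequence is (width // 8) * (height // 8). A sketch reproducing the table (the function name is illustrative):

```python
def sd15_max_slice_size(width: int, height: int) -> int:
    # SD 1.5's VAE downsamples images by 8x, so the largest attention
    # sequence length equals the latent spatial size (W/8) * (H/8).
    return (width // 8) * (height // 8)
```

For example, 1920x1080 gives 240 * 135 = 32400, matching the HD datapoint.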
Cédric Deltheil b306c7db1b freeu: add one more test for identity scales
It should act as a no-op when [1.0, 1.0] is used for the backbone and skip
scales.
2023-12-01 12:48:19 +01:00
Cédric Deltheil 761678d9a5 README: add yet another badge for discord 2023-11-30 17:33:32 +01:00
Cédric Deltheil 01cf4efba2 README: add code bounties badge 2023-11-28 11:41:11 +01:00
Cédric Deltheil cbb13ed032 README: add a link to imaginAIry 2023-11-24 12:18:55 +01:00
Cédric Deltheil dde47318da README: add a link to the intro blog post 2023-11-21 11:19:23 +01:00
Benjamin Trom 2d4c4774f4 add maxpool to refiners layer 2023-11-20 10:58:53 +01:00
Bryce f666bc82f5 feature: support self-attention guidance with SD1 inpainting model 2023-11-20 10:17:15 +01:00
Cédric Deltheil ab0915d052 add tests for FreeU 2023-11-18 16:15:44 +01:00
Benjamin Trom 6eeb01137d Add Adapter in refiners.fluxion.adapters init 2023-11-18 13:54:40 +01:00
Cédric Deltheil 86e7dfe0c7 add FreeU to latest news 2023-11-17 18:26:11 +01:00
isamu-isozaki 770879a6df Free U 2023-11-17 17:22:20 +01:00
Bryce 92e8166c83 docs: fix markdown link to lora adapter code 2023-10-28 22:42:17 +02:00
Pierre Chapuis 02f3c46e2e update pyright 2023-10-25 14:56:07 +02:00
Cédric Deltheil fc71e900a0 black 2023-10-21 13:51:06 +02:00
Cédric Deltheil e70dee987e use segment-anything unofficial Python package
Via:

    poetry remove segment-anything
    poetry add --optional segment-anything-py==1.0

Needed to publish on PyPI, which otherwise fails (error: can't have a
direct dependency).
2023-10-21 13:51:06 +02:00
Cédric Deltheil e74454473d bump library version to v0.2.0 2023-10-20 18:28:31 +02:00
Cédric Deltheil 5d19d14e51 README: upgrade hello world 2023-10-20 18:28:31 +02:00
Cédric Deltheil 3f54494e04 README: add latest news section 2023-10-19 17:48:01 +02:00
Benjamin Trom ea44262a39 unnest Residual subchain by modifying its forward
Also replaced the remaining Sum-Identity layers with Residual.

The tolerance used to compare SAM's ViT models has been tweaked: for
some reason there is a small difference (in float32) in the neck layer
(first Conv2d).

Co-authored-by: Cédric Deltheil <cedric@deltheil.me>
2023-10-19 10:34:51 +02:00
Cédric Deltheil 46dd710076 test_converter: use proper exception type
Follow up of #102
2023-10-18 14:39:24 +02:00
Benjamin Trom 6ddd901767 improve image_to_tensor and tensor_to_image utils 2023-10-17 18:08:58 +02:00
limiteinductive 585c7ad55a improve consistency of the dpm scheduler 2023-10-12 15:48:43 +02:00
limiteinductive 7a62049d54 implement Restart method for latent diffusion 2023-10-12 15:48:43 +02:00
Cédric Deltheil e35dce825f pyproject.toml: add a note about scipy + bitsandbytes 2023-10-11 15:43:04 +02:00
Cédric Deltheil ac631bfb2e make self attention guidance idempotent
Follow up of d3365d6
2023-10-11 10:47:22 +02:00
Benjamin Trom 0024191c58 improve debug print for chains 2023-10-10 15:25:09 +02:00
Benjamin Trom a663375dc7 prevent setattr pytorch module to register on the Chain class 2023-10-10 14:46:15 +02:00
Cédric Deltheil d02be0d10e tests: update ref image for SDXL IP-Adapter plus
Note: https://pytorch.org/docs/stable/notes/randomness.html

> Completely reproducible results are not guaranteed across PyTorch
> releases [...]
2023-10-10 14:19:47 +02:00
Cédric Deltheil 455be5a4be remove TODO related to older pyright version 2023-10-10 14:19:47 +02:00
Cédric Deltheil 5aa8e11eb7 upgraded pyright to 1.1.330.post0
Via:

    poetry update
2023-10-10 14:19:47 +02:00
Cédric Deltheil 5158187e96 poetry add torch@^2.1.0 2023-10-10 14:19:47 +02:00
Cédric Deltheil 9f3d064d14 poetry add --optional torchvision@^0.16.0 2023-10-10 14:19:47 +02:00
Cédric Deltheil b80769939d add support for self-attention guidance
See https://arxiv.org/abs/2210.00939
2023-10-09 17:33:15 +02:00
Pierre Chapuis 976b55aea5 add test weights conversion script 2023-10-09 14:18:40 +02:00
Cédric Deltheil 05126c8f4d make gaussian_blur work with float16 2023-10-07 21:48:38 +02:00
Cédric Deltheil 7d2abf6fbc scheduler: add remove noise
aka original sample prediction (or predict x0)

E.g. useful for methods like self-attention guidance (see equation (2)
in https://arxiv.org/pdf/2210.00939.pdf)
2023-10-05 17:05:15 +02:00
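The "remove noise" operation above is the standard inversion of the DDPM forward process: given the noise prediction, solve x_t = sqrt(alpha_bar_t) * x0 + sqrt(1 - alpha_bar_t) * eps for x0 (equation (2) of the SAG paper). A minimal scalar sketch; the function name is illustrative and the real scheduler code operates on tensors:

```python
import math

def predict_original_sample(x_t: float, noise: float, alpha_bar_t: float) -> float:
    # Invert x_t = sqrt(a) * x0 + sqrt(1 - a) * eps to recover the
    # predicted clean sample x0 (scalars for clarity).
    return (x_t - math.sqrt(1.0 - alpha_bar_t) * noise) / math.sqrt(alpha_bar_t)
```

Applying the forward process and then this inversion round-trips back to the original sample.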