Commit graph

710 commits

Author SHA1 Message Date
limiteinductive 7d9ceae274 change default behavior of end to None 2023-12-13 17:03:28 +01:00
Cédric Deltheil 82a2aa1ec4 deprecate DDPM step which is unused for now 2023-12-13 15:51:42 +01:00
limiteinductive a7551e0392 Change fl.Slicing API 2023-12-13 09:38:13 +01:00
Cédric Deltheil 11b0ff6f8c ddim: remove unused attribute 2023-12-12 17:26:14 +01:00
Cédric Deltheil 315b4ed2e4 test_schedulers: enforce manual seed 2023-12-12 17:26:14 +01:00
limiteinductive 7992258dd2 add before/after init callback to trainer 2023-12-12 10:22:55 +01:00
Pierre Chapuis 42a0fc4aa0 fix circular imports 2023-12-11 15:27:11 +01:00
Cédric Deltheil c8d5faff9b pyproject.toml: add example for combine-as-imports
b44d612 was misleading (s/avoid/allow/)
2023-12-11 13:57:57 +01:00
Cédric Deltheil 792a0fc3d9 run lint rules using latest isort settings 2023-12-11 11:58:43 +01:00
Cédric Deltheil b44d6122c4 pyproject.toml: enable isort rule in Ruff
Use `combine-as-imports = true` [1] to allow such kinds of imports on a
single line:

    from torch import Tensor, device as Device, dtype as DType

[1]: https://docs.astral.sh/ruff/settings/#isort-combine-as-imports
2023-12-11 11:58:43 +01:00
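At the time of this commit, that setting lives in `pyproject.toml`. A minimal sketch of the relevant section (the table name follows Ruff's settings reference; the exact layout of the project's file is assumed):

```toml
[tool.ruff.isort]
# Combine "as" imports from the same module onto a single line.
combine-as-imports = true
```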
Cédric Deltheil 4c07225d68 pyproject.toml: remove tool.isort section
Ruff is going to be used instead.

Follow up of #141
2023-12-11 11:58:43 +01:00
Cédric Deltheil 4fc5e427b8 training_utils: fix extra detection
Requirements could be, e.g.:

    wandb (>=0.15.7,<0.16.0) ; extra == "training"

Or:

    wandb>=0.16.0; extra == 'training'

Follow up of 86c5497
2023-12-08 19:09:16 +01:00
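The two requirement spellings quoted above differ in spacing and quote style, which is presumably what the detection fix has to tolerate. A hypothetical sketch (not the actual `training_utils` code) of matching an `extra` marker in both forms:

```python
import re

def has_extra(requirement: str, extra: str) -> bool:
    """Return True if a PEP 508 requirement string carries the given extra marker.

    Accepts optional whitespace around "==" and either single or double quotes,
    covering both spellings shown in the commit message above.
    """
    pattern = rf"""extra\s*==\s*['"]{re.escape(extra)}['"]"""
    return re.search(pattern, requirement) is not None
```

Both example requirements from the commit message then match `has_extra(..., "training")`, while a plain requirement such as `torch>=2.0` does not.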
limiteinductive 86c54977b9 replace poetry with rye for python dependency management
Co-authored-by: Cédric Deltheil <cedric@deltheil.me>
Co-authored-by: Pierre Chapuis <git@catwell.info>
2023-12-08 17:40:10 +01:00
limiteinductive 807ef5551c refactor fl.Parameter basic layer
Co-authored-by: Cédric Deltheil <cedric@deltheil.me>
2023-12-08 10:20:34 +01:00
Cédric Deltheil 11422a3faf fix typo in finetune-ldm config 2023-12-07 18:19:45 +01:00
Cédric Deltheil 46b4b4b462 training_utils: fix naming issue timestep->step 2023-12-05 10:05:34 +01:00
limiteinductive 0dc3a17fbf remove unnecessary test 2023-12-04 15:27:06 +01:00
limiteinductive 8bbcf2048d add README bullet point 2023-12-04 15:27:06 +01:00
limiteinductive 37a74bd549 format test_scheduler file 2023-12-04 15:27:06 +01:00
limiteinductive 90db6ef59d add e2e test for sd15 with karras noise schedule 2023-12-04 15:27:06 +01:00
limiteinductive 6f110ee2b2 fix test_scheduler_utils 2023-12-04 15:27:06 +01:00
limiteinductive 1075ea4a62 fix ddpm and ddim __init__ 2023-12-04 15:27:06 +01:00
limiteinductive ad8f02e555 add karras sampling to the Scheduler abstract class, default is quadratic 2023-12-04 15:27:06 +01:00
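For context, the Karras et al. (2022) schedule interpolates noise levels in sigma^(1/rho) space. A hedged sketch in plain Python (names and default values are illustrative, not the refiners `Scheduler` API):

```python
def karras_sigmas(n, sigma_min=0.1, sigma_max=10.0, rho=7.0):
    """Return n decreasing noise levels, interpolated in sigma^(1/rho) space."""
    min_inv = sigma_min ** (1 / rho)
    max_inv = sigma_max ** (1 / rho)
    return [
        (max_inv + i / (n - 1) * (min_inv - max_inv)) ** rho
        for i in range(n)
    ]
```

The first value equals sigma_max, the last equals sigma_min, and rho controls how strongly the schedule is skewed toward low noise levels.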
Pierre Chapuis f22f969d65 remove Black preview mode
also fix multiline logs in training
2023-12-04 14:15:56 +01:00
Bryce 4176868e79 feature: add sliced-attention for memory efficiency
This allowed me to produce HD images on an M1 with 32 GB of RAM, and 7000x5000 images on an Nvidia 4090.

I saw no visual difference in images generated.

Some datapoints on slice_size:
# 4096 max needed for SD 1.5 512x512
# 9216 max needed for SD 1.5 768x768
# 16384 max needed for SD 1.5 1024x1024
# 32400 max needed for SD 1.5 1920x1080 (HD)
# 129600 max needed for SD 1.5 3840x2160 (4k)
# 234375 max needed for SD 1.5 5000x3000
2023-12-01 15:30:23 +01:00
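The idea behind sliced attention is to process query rows in chunks so the full (n_q x n_k) attention matrix never materializes at once, trading speed for peak memory. A minimal sketch (not the refiners implementation; function and parameter names are illustrative):

```python
import torch

def sliced_attention(q: torch.Tensor, k: torch.Tensor, v: torch.Tensor, slice_size: int) -> torch.Tensor:
    # Compute attention over query slices of at most slice_size rows, so only
    # a (slice_size x n_k) score matrix exists at any point in time.
    scale = q.shape[-1] ** -0.5
    out = torch.empty_like(q)
    for start in range(0, q.shape[0], slice_size):
        end = start + slice_size
        scores = (q[start:end] @ k.transpose(-1, -2)) * scale
        out[start:end] = scores.softmax(dim=-1) @ v
    return out
```

Because softmax is applied per query row, slicing the query dimension yields exactly the same result as unsliced attention.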
Cédric Deltheil b306c7db1b freeu: add one more test for identity scales
It should act as a NOP when [1.0, 1.0] is used for backbone and skip
scales.
2023-12-01 12:48:19 +01:00
Cédric Deltheil 761678d9a5 README: add yet another badge for discord 2023-11-30 17:33:32 +01:00
Cédric Deltheil 01cf4efba2 README: add code bounties badge 2023-11-28 11:41:11 +01:00
Cédric Deltheil cbb13ed032 README: add a link to imaginAIry 2023-11-24 12:18:55 +01:00
Cédric Deltheil dde47318da README: add a link to the intro blog post 2023-11-21 11:19:23 +01:00
Benjamin Trom 2d4c4774f4 add maxpool to refiners layer 2023-11-20 10:58:53 +01:00
Bryce f666bc82f5 feature: support self-attention guidance with SD1 inpainting model 2023-11-20 10:17:15 +01:00
Cédric Deltheil ab0915d052 add tests for FreeU 2023-11-18 16:15:44 +01:00
Benjamin Trom 6eeb01137d Add Adapter in refiners.fluxion.adapters init 2023-11-18 13:54:40 +01:00
Cédric Deltheil 86e7dfe0c7 add FreeU to latest news 2023-11-17 18:26:11 +01:00
isamu-isozaki 770879a6df Free U 2023-11-17 17:22:20 +01:00
Bryce 92e8166c83 docs: fix markdown link to lora adapter code 2023-10-28 22:42:17 +02:00
Pierre Chapuis 02f3c46e2e update pyright 2023-10-25 14:56:07 +02:00
Cédric Deltheil fc71e900a0 black 2023-10-21 13:51:06 +02:00
Cédric Deltheil e70dee987e use segment-anything unofficial Python package
Via:

    poetry remove segment-anything
    poetry add --optional segment-anything-py==1.0

Needed to publish on PyPI, which otherwise fails (error: Can't have direct
dependency)
2023-10-21 13:51:06 +02:00
Cédric Deltheil e74454473d bump library version to v0.2.0 2023-10-20 18:28:31 +02:00
Cédric Deltheil 5d19d14e51 README: upgrade hello world 2023-10-20 18:28:31 +02:00
Cédric Deltheil 3f54494e04 README: add latest news section 2023-10-19 17:48:01 +02:00
Benjamin Trom ea44262a39 unnest Residual subchain by modifying its forward
Also replaced the remaining Sum-Identity layers with Residual.

The tolerance used to compare SAM's ViT models has been tweaked: for
some reason there is a small difference (in float32) in the neck layer
(first conv2D).

Co-authored-by: Cédric Deltheil <cedric@deltheil.me>
2023-10-19 10:34:51 +02:00
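The equivalence behind this refactor is that a residual block computes `x + f(x)` directly, with no nested Sum(Identity, f) subchain. A minimal sketch, illustrative only and not the refiners `fl.Residual` implementation:

```python
class Residual:
    """Wrap a callable f so that calling the block returns x + f(x)."""

    def __init__(self, fn):
        self.fn = fn

    def __call__(self, x):
        # Equivalent to Sum(Identity(), fn), but computed in a single
        # flat forward instead of a nested subchain.
        return x + self.fn(x)
```

For example, `Residual(lambda x: 2 * x)(3)` returns `3 + 2 * 3`.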
Cédric Deltheil 46dd710076 test_converter: use proper exception type
Follow up of #102
2023-10-18 14:39:24 +02:00
Benjamin Trom 6ddd901767 improve image_to_tensor and tensor_to_image utils 2023-10-17 18:08:58 +02:00
limiteinductive 585c7ad55a improve consistency of the dpm scheduler 2023-10-12 15:48:43 +02:00
limiteinductive 7a62049d54 implement Restart method for latent diffusion 2023-10-12 15:48:43 +02:00
Cédric Deltheil e35dce825f pyproject.toml: add a note about scipy + bitsandbytes 2023-10-11 15:43:04 +02:00
Cédric Deltheil ac631bfb2e make self attention guidance idempotent
Follow up of d3365d6
2023-10-11 10:47:22 +02:00