Commit graph

11 commits

Author SHA1 Message Date
Pierre Chapuis 471ef91d1c make __getattr__ on Module return object, not Any
PyTorch chose to make it Any because they expect their users' code
to be "highly dynamic": https://github.com/pytorch/pytorch/pull/104321

That is not the case for us: in Refiners, having untyped code
goes against one of our core principles.

Note that there is currently an open PR in PyTorch to
return `Module | Tensor`, but in practice this is not always
correct either: https://github.com/pytorch/pytorch/pull/115074

I also moved Residuals-related code from SD1 to latent_diffusion
because SDXL should not depend on SD1.
2024-02-06 11:32:18 +01:00
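
A minimal sketch of the typing pattern this commit opts into, assuming a plain `torch.nn.Module` subclass (hypothetical code, not the actual Refiners implementation):

```python
from typing import cast

import torch
from torch import nn


class TypedModule(nn.Module):
    # Returning `object` instead of `Any` forces call sites to narrow the
    # type (with `cast` or `isinstance`) rather than silently going untyped.
    def __getattr__(self, name: str) -> object:
        return super().__getattr__(name)


module = TypedModule()
module.register_buffer("scale", torch.ones(1))
# A type checker now sees `object` here, so we narrow explicitly:
scale = cast(torch.Tensor, module.scale)
print(scale.shape)  # torch.Size([1])
```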
limiteinductive 73f6ccfc98 make Scheduler a fl.Module + Change name Scheduler -> Solver 2024-01-31 17:03:52 +01:00
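
A hedged sketch of what making a solver a Module buys, with `torch.nn.Module` standing in for `fl.Module` and illustrative names throughout:

```python
import torch
from torch import nn  # standing in for refiners.fluxion.layers (fl) here


class Solver(nn.Module):
    """Base class for diffusion solvers (formerly `Scheduler`).

    Being a Module means tensors registered as buffers follow
    `.to(device=..., dtype=...)` automatically.
    """

    def __init__(self, num_inference_steps: int) -> None:
        super().__init__()
        self.num_inference_steps = num_inference_steps
        self.register_buffer("timesteps", torch.arange(num_inference_steps))


# A backward-compatibility alias for the old name could look like this:
Scheduler = Solver

solver = Solver(num_inference_steps=30)
print(solver.timesteps)
```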
limiteinductive 20c229903f upgrade pyright to 1.1.342 ; improve no_grad typing 2023-12-29 15:09:02 +01:00
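
One way to give `no_grad` tighter typing is to subclass it; a sketch under that assumption (the actual Refiners change may differ):

```python
import torch


class no_grad(torch.no_grad):
    """Subclass of torch.no_grad with an explicitly typed constructor.

    Runtime behavior is unchanged; the subclass exists so the project can
    attach precise annotations instead of relying on upstream stubs.
    """

    def __init__(self) -> None:
        super().__init__()


x = torch.ones(2, requires_grad=True)
with no_grad():
    y = x * 2
print(y.requires_grad)  # False: gradients are disabled inside the block
```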
Bryce 5ca1549c96 refactor: convert bash script to python
Ran successfully to completion, but on a repeat run `convert_unclip` didn't pass the hash check for some reason.

- fix inpainting model download urls
- shows a progress bar for downloads
- skips downloading existing files
- uses a temporary file to prevent partial downloads
- can do a dry run to check if url is valid `DRY_RUN=1 python scripts/prepare_test_weights.py`
- displays the downloaded file hash
2023-12-15 09:55:59 +01:00
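
A condensed, hypothetical sketch of the download behaviors listed above (progress, skipping, temporary file, dry run, hash display); not the actual scripts/prepare_test_weights.py:

```python
import hashlib
import os
import shutil
import tempfile
import urllib.request
from pathlib import Path


def download(url: str, dest: Path) -> None:
    # Dry run: only check that the URL is valid.
    if os.environ.get("DRY_RUN") == "1":
        request = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(request) as response:
            print(f"{url}: HTTP {response.status}")
        return

    # Skip downloading existing files.
    if dest.exists():
        print(f"skipping {dest}: already exists")
        return

    sha256 = hashlib.sha256()
    with urllib.request.urlopen(url) as response:
        total = int(response.headers.get("Content-Length") or 0)
        done = 0
        # Write to a temporary file first so an interrupted transfer
        # never leaves a partial file at the destination path.
        with tempfile.NamedTemporaryFile(delete=False) as tmp:
            while chunk := response.read(1 << 20):
                tmp.write(chunk)
                sha256.update(chunk)
                done += len(chunk)
                if total:  # crude progress display
                    print(f"\r{dest.name}: {100 * done // total}%", end="")
    print()
    shutil.move(tmp.name, dest)
    # Display the downloaded file hash so it can be pinned later.
    print(f"{dest.name} sha256: {sha256.hexdigest()}")
```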
Cédric Deltheil 792a0fc3d9 run lint rules using latest isort settings 2023-12-11 11:58:43 +01:00
Benjamin Trom ea44262a39 unnest Residual subchain by modifying its forward
Also replaced the remaining Sum-Identity layers with Residual.

The tolerance used to compare SAM's ViT models has been tweaked: for
some reason there is a small difference (in float32) in the neck layer
(first Conv2d).

Co-authored-by: Cédric Deltheil <cedric@deltheil.me>
2023-10-19 10:34:51 +02:00
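
A sketch of the unnesting idea, using `torch.nn.Sequential` in place of a Fluxion chain (illustrative, not the Refiners code):

```python
import torch
from torch import nn


class Residual(nn.Sequential):
    """A chain whose output is added to its input.

    Instead of nesting a Sum(Identity, subchain), the residual connection
    lives directly in forward, so the wrapped layers stay at the top level
    of the chain.
    """

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + super().forward(x)


block = Residual(nn.Linear(8, 8), nn.ReLU(), nn.Linear(8, 8))
out = block(torch.randn(2, 8))
print(out.shape)  # torch.Size([2, 8])
```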
limiteinductive 88efa117bf fix model comparison with custom layers 2023-09-05 12:34:38 +02:00
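
One plausible shape for such a comparison helper, shown here as a hypothetical `models_equal` function (not the actual fix in this commit):

```python
import torch
from torch import nn


def models_equal(a: nn.Module, b: nn.Module) -> bool:
    """Compare two models weight-by-weight.

    Plain `==` on tensors returns an element-wise result, so a comparison
    helper must reduce explicitly with `torch.equal`.
    """
    sd_a, sd_b = a.state_dict(), b.state_dict()
    if sd_a.keys() != sd_b.keys():
        return False
    return all(torch.equal(sd_a[k], sd_b[k]) for k in sd_a)


m1, m2 = nn.Linear(4, 4), nn.Linear(4, 4)
m2.load_state_dict(m1.state_dict())
print(models_equal(m1, m2))  # True
```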
Cédric Deltheil b933fabf31 unet: get rid of clip_embedding attribute for SD1
It is implicitly defined by the underlying cross-attention layer. This
also makes it consistent with SDXL.
2023-09-01 19:23:33 +02:00
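
A sketch of deriving the embedding size from the cross-attention weights instead of storing it; all class and attribute names here are hypothetical:

```python
from torch import nn


class CrossAttention(nn.Module):
    def __init__(self, embedding_dim: int, context_dim: int) -> None:
        super().__init__()
        self.key_projection = nn.Linear(context_dim, embedding_dim)


class UNet(nn.Module):
    def __init__(self, context_dim: int) -> None:
        super().__init__()
        # No stored `clip_embedding` attribute: the dimension lives in the
        # cross-attention weights themselves.
        self.attention = CrossAttention(embedding_dim=320, context_dim=context_dim)

    @property
    def context_embedding_dim(self) -> int:
        # Read the value back from the underlying layer when needed.
        return self.attention.key_projection.in_features


unet = UNet(context_dim=768)
print(unet.context_embedding_dim)  # 768
```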
Pierre Chapuis 0f476ea18b make high-level adapters Adapters
This generalizes the Adapter abstraction to higher-level
constructs such as high-level LoRA (targeting e.g. the
SD UNet), ControlNet and Reference-Only Control.

Some adapters now work by adapting child models with
"sub-adapters" that they inject / eject when needed.
2023-08-31 10:57:18 +02:00
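
A sketch of the inject/eject pattern with sub-adapters, using hypothetical names and `torch.nn.Module` rather than the Refiners API:

```python
from __future__ import annotations

import torch
from torch import nn


class Adapter(nn.Module):
    """Wraps a named child of a parent module and can splice itself in/out.

    A high-level adapter (e.g. one targeting a whole UNet) can also hold
    sub-adapters for child models and inject/eject them along with itself.
    """

    def __init__(self, parent: nn.Module, name: str) -> None:
        super().__init__()
        # Plain dict assignment keeps an unregistered reference to the
        # parent, so injection does not create a module cycle.
        self.__dict__["parent"] = parent
        self.child_name = name
        self.target: nn.Module = getattr(parent, name)
        self.sub_adapters: list[Adapter] = []

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Extra behavior (e.g. a LoRA branch) would be added around this.
        return self.target(x)

    def inject(self) -> None:
        for sub in self.sub_adapters:
            sub.inject()
        setattr(self.parent, self.child_name, self)  # splice ourselves in

    def eject(self) -> None:
        setattr(self.parent, self.child_name, self.target)  # restore original
        for sub in self.sub_adapters:
            sub.eject()


model = nn.Sequential(nn.Linear(4, 4), nn.ReLU())
adapter = Adapter(parent=model, name="0")
adapter.inject()
print(model(torch.randn(1, 4)).shape)  # forward now runs through the adapter
adapter.eject()
```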
Pierre Chapuis 18c84c7b72 shorter import paths 2023-08-29 16:57:40 +02:00
limiteinductive 7ca6bd0ccd implement the ConvertModule class and refactor conversion scripts 2023-08-28 14:39:14 +02:00
Renamed from scripts/convert-controlnet-weights.py
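
A hedged sketch of what a shared conversion helper along these lines could look like; the class shape and key-mapping approach are assumptions, not the actual `ConvertModule`:

```python
import torch
from torch import nn


class ConvertModule:
    """Maps a source model's state dict onto a target model.

    Conversion scripts can share this: given a key mapping, it renames
    entries from the source state dict and loads them into the target.
    """

    def __init__(self, key_mapping: dict[str, str]) -> None:
        self.key_mapping = key_mapping

    def convert(self, source: nn.Module, target: nn.Module) -> None:
        source_sd = source.state_dict()
        converted = {self.key_mapping[k]: v for k, v in source_sd.items()}
        target.load_state_dict(converted)


source = nn.Linear(4, 4)
target = nn.Linear(4, 4)
converter = ConvertModule({"weight": "weight", "bias": "bias"})
converter.convert(source, target)
print(torch.equal(source.weight, target.weight))  # True
```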