refiners/tests
Pierre Chapuis 471ef91d1c make __getattr__ on Module return object, not Any
PyTorch chose to make it Any because they expect their users' code
to be "highly dynamic": https://github.com/pytorch/pytorch/pull/104321

This is not the case for us: in Refiners, having untyped code
goes against one of our core principles.

Note that there is currently an open PR in PyTorch to make it
return `Module | Tensor`, but in practice that is not always
correct either: https://github.com/pytorch/pytorch/pull/115074

I also moved Residuals-related code from SD1 to latent_diffusion
because SDXL should not depend on SD1.
2024-02-06 11:32:18 +01:00
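As a rough illustration of what this change means for callers (a minimal sketch, not the actual Refiners implementation; the class and attribute names are hypothetical): annotating `__getattr__` to return `object` makes type checkers require explicit narrowing at the call site, instead of silently treating every unknown attribute as `Any`.

```python
# Minimal sketch, not the Refiners code: names are illustrative only.
import torch
from torch import nn


class Module(nn.Module):
    # PyTorch annotates nn.Module.__getattr__ as returning Any, so unknown
    # attribute accesses silently become untyped. Declaring `object` instead
    # keeps the type checker involved.
    def __getattr__(self, name: str) -> object:
        return super().__getattr__(name)


m = Module()
m.register_buffer("scale", torch.ones(1))

value = m.scale  # type checkers see `object`, not `Any`
assert isinstance(value, torch.Tensor)  # explicit narrowing is required
print(value * 2)
```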
adapters        make __getattr__ on Module return object, not Any      2024-02-06 11:32:18 +01:00
e2e             clip text, lda encode batch inputs                     2024-02-01 17:05:28 +01:00
fluxion         make __getattr__ on Module return object, not Any      2024-02-06 11:32:18 +01:00
foundationals   make __getattr__ on Module return object, not Any      2024-02-06 11:32:18 +01:00
training_utils  add support for pytorch 2.2 (2.1 is still supported)   2024-01-31 15:03:06 +01:00
__init__.py     initial commit                                         2023-08-04 15:28:41 +02:00
conftest.py     run lint rules using latest isort settings             2023-12-11 11:58:43 +01:00
utils.py        run lint rules using latest isort settings             2023-12-11 11:58:43 +01:00