limiteinductive
cef8a9936c
refactor register_model decorator
2024-02-12 16:21:04 +01:00
limiteinductive
d6546c9026
add @register_model and @register_callback decorators
...
Refactor ClockTrainer to include Callback
2024-02-12 10:24:19 +01:00
limiteinductive
f541badcb3
Allow optional train ModelConfig + forbid extra input for configs
2024-02-10 16:13:10 +01:00
Pierre Chapuis
402d3105b4
support multiple IP adapter inputs as tensor
2024-02-09 17:16:17 +01:00
Cédric Deltheil
5a7085bb3a
training_utils/config.py: inline type alias
...
Follow up of #227
2024-02-09 14:36:22 +01:00
Laurent
d590c0e2fa
add `typos` to dev-dependencies, also remove ruff from non dev-dependencies
2024-02-09 12:12:51 +01:00
Pierre Colle
25bfa78907
lr, betas, eps, weight_decay at model level
...
Co-authored-by: Cédric Deltheil <355031+deltheil@users.noreply.github.com>
2024-02-09 12:05:13 +01:00
Cédric Deltheil
9aefc9896c
test_trainer: use model_copy instead of copy
...
The `copy` method has been deprecated.
2024-02-08 20:07:34 +01:00
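The deprecation behind the commit above is Pydantic v2's API change: `BaseModel.copy()` is deprecated in favor of `model_copy()`. A minimal sketch, using a hypothetical `TrainerConfig` model for illustration:

```python
from pydantic import BaseModel


class TrainerConfig(BaseModel):  # hypothetical config, for illustration only
    lr: float = 1e-4


config = TrainerConfig()
# Pydantic v2 deprecates `BaseModel.copy()` in favor of `model_copy()`:
clone = config.model_copy(update={"lr": 1e-5})
assert clone.lr == 1e-5 and config.lr == 1e-4  # the original is untouched
```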
Colle
f4aa0271b8
less than 1 epoch training duration
2024-02-08 19:20:31 +01:00
limiteinductive
41508e0865
change param name of abstract get_item method
2024-02-08 18:52:52 +01:00
Laurent
6d599d53fd
beautify EXPECTED_TREE in test_chain.py
2024-02-08 15:09:47 +01:00
Cédric Deltheil
e36dda63fd
fix miscellaneous typos
2024-02-07 17:51:25 +01:00
Cédric Deltheil
d7aadf99de
add spelling.yml to spot spelling mistakes
...
See https://github.com/crate-ci/typos/blob/master/docs/github-action.md for details
2024-02-07 17:51:25 +01:00
Pierre Chapuis
396d166564
make pad method private
2024-02-07 17:47:14 +01:00
Pierre Chapuis
4d85918336
Update src/refiners/foundationals/latent_diffusion/lora.py
...
Co-authored-by: Laureηt <laurent@lagon.tech>
2024-02-07 17:47:14 +01:00
Pierre Chapuis
b1c200c63a
Update src/refiners/foundationals/latent_diffusion/lora.py
...
Co-authored-by: Laureηt <laurent@lagon.tech>
2024-02-07 17:47:14 +01:00
Pierre Chapuis
eb9abefe07
add a few comments in SDLoraManager
2024-02-07 17:47:14 +01:00
Benjamin Trom
bbe0759151
fix docstring
2024-02-07 16:13:01 +01:00
limiteinductive
2e526d35d1
Make Dataset part of the trainer
2024-02-07 16:13:01 +01:00
Laurent
9883f24f9a
(fluxion/layers) remove View layer
...
+ replace existing `View` layers by `Reshape`
2024-02-07 12:06:07 +01:00
limiteinductive
2ef4982e04
remove wandb from base config
2024-02-07 11:06:59 +01:00
Pierre Chapuis
11da76f7df
fix sdxl structural copy
2024-02-07 10:51:26 +01:00
Pierre Chapuis
ca9e89b22a
cosmetics
2024-02-07 10:51:26 +01:00
limiteinductive
ea05f3d327
make device and dtype work in Trainer class
2024-02-06 23:10:10 +01:00
Hugues Pouillot
1fa5266f56
fix github actions triggers
2024-02-06 16:48:18 +01:00
Cédric Deltheil
907a6becbc
bump library version to v0.3.1
2024-02-06 15:22:15 +01:00
Pierre Chapuis
98fce82853
fix 37425fb609
...
Things to understand:
- subscripted generic basic types (e.g. `list[int]`) are `types.GenericAlias`;
- subscripted generic classes are `typing._GenericAlias`;
- neither can be used with `isinstance()`;
- `get_origin` is the cleanest way to check for this.
2024-02-06 13:49:37 +01:00
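The points listed in that commit message can be checked directly; `Box` below is a hypothetical generic class used only for illustration:

```python
from typing import Generic, TypeVar, get_origin

T = TypeVar("T")


class Box(Generic[T]):  # hypothetical generic class, for illustration
    pass


# `list[int]` is a `types.GenericAlias`, `Box[int]` is a `typing._GenericAlias`;
# neither may be passed to `isinstance()`. `get_origin()` recovers the bare
# class in both cases:
assert get_origin(list[int]) is list
assert get_origin(Box[int]) is Box
assert get_origin(int) is None  # non-subscripted types have no origin

try:
    isinstance([], list[int])
except TypeError:
    pass  # "isinstance() argument 2 cannot be a parameterized generic"
```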
Cédric Deltheil
f9305aa416
pyproject.toml: add some PyPI classifiers
...
In particular, the Python versions (3.10, etc.) used to be included with
builds created with Poetry (they got removed after the switch to Rye:
see #141). This commit should fix the broken PyPI badge.
2024-02-06 11:40:01 +01:00
Pierre Chapuis
37425fb609
make LoRA generic
2024-02-06 11:32:18 +01:00
Pierre Chapuis
471ef91d1c
make __getattr__ on Module return object, not Any
...
PyTorch chose to make it `Any` because they expect its users' code
to be "highly dynamic": https://github.com/pytorch/pytorch/pull/104321
This is not the case for us: in Refiners, having untyped code
goes against one of our core principles.
Note that there is currently an open PR in PyTorch to
return `Module | Tensor`, but in practice this is not always
correct either: https://github.com/pytorch/pytorch/pull/115074
I also moved Residuals-related code from SD1 to latent_diffusion
because SDXL should not depend on SD1.
2024-02-06 11:32:18 +01:00
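The typing trade-off described above can be sketched with a toy class; `Node` below is not Refiners' actual `Module`, just an illustration of why returning `object` is stricter than `Any`:

```python
from typing import cast


class Node:
    """Hypothetical stand-in for a module container, for illustration only."""

    def __init__(self) -> None:
        self._children: dict[str, object] = {"encoder": "<submodule>"}

    def __getattr__(self, name: str) -> object:  # `object`, not `Any`
        # Only called when normal attribute lookup fails.
        try:
            return self._children[name]
        except KeyError:
            raise AttributeError(name) from None


node = Node()
# With `-> Any`, `node.encoder.upper()` would silently type-check even when
# wrong. With `-> object`, the type checker forces an explicit narrowing step:
encoder = cast(str, node.encoder)
assert encoder == "<submodule>"
```

Returning `object` pushes a `cast` (or an `isinstance` narrowing) to the call site, which is the explicitness the commit is after.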
Cédric Deltheil
ec401133f1
bump library version to v0.3.0
2024-02-05 10:27:22 +01:00
Pierre Chapuis
3de1508b65
increase tolerance on Euler test
2024-02-04 08:58:22 +01:00
Pierre Chapuis
83b478c0ff
fix test failure caused by Diffusers 0.26.0
2024-02-04 08:58:22 +01:00
Laurent
8d190e4256
(fluxion/layers/activations) replace ApproximateGeLU by GeLUApproximation
2024-02-02 19:41:18 +01:00
hugojarkoff
2bdb42e88d
Change image preprocessing resizing to use Pillow
2024-02-02 18:21:04 +01:00
hugojarkoff
75830e2179
Add T2I-Adapter subsection to SDXL Adaptation guide
2024-02-02 18:21:04 +01:00
Pierre Chapuis
fbb1fcb8ff
Chain#pop does not return tuples
2024-02-02 18:11:51 +01:00
Cédric Deltheil
a779f86941
README: add link to Discussions
2024-02-02 17:58:04 +01:00
Laurent
1dcb36e1e0
(doc/foundationals) add IPAdapter, related docstrings
2024-02-02 17:35:03 +01:00
Laurent
6b35f1cc84
(doc/foundationals) add SDLoraManager, related docstrings
2024-02-02 17:35:03 +01:00
Laurent
7406d8e01f
(mkdocs) fix cross-reference typo
2024-02-02 17:35:03 +01:00
Laurent
093527a7de
apply @deltheil suggestions
2024-02-02 17:35:03 +01:00
Laurent
f62e71da1c
(doc/foundationals) add SegmentAnything, related docstrings
2024-02-02 17:35:03 +01:00
Laurent
a926696141
(doc/foundationals) add CLIP, related docstrings
2024-02-02 17:35:03 +01:00
Laurent
3910845e29
(doc/foundationals) add DINOv2, related docstrings
2024-02-02 17:35:03 +01:00
Laurent
fc7b4dd62d
(doc/fluxion/ld) add DDPM, DDIM, DPM++ and Euler docstrings
2024-02-02 17:35:03 +01:00
Laurent
6d8016190c
(docs) modify which docstrings are displayed in docs/reference/latent_diffusion.md
2024-02-02 17:35:03 +01:00
Laurent
a1a00998ea
(doc/fluxion/ld) add StableDiffusion_1 docstrings
2024-02-02 17:35:03 +01:00
Laurent
f2bcb7f45e
(mkdocstrings) export SDXLAutoencoder in src/refiners/foundationals/latent_diffusion/stable_diffusion_xl/__init__.py
2024-02-02 17:35:03 +01:00
Laurent
2a7b86ac02
(doc/fluxion/ld) add LatentDiffusionAutoencoder docstrings
2024-02-02 17:35:03 +01:00