Commit graph

686 commits

Author SHA1 Message Date
Pierre Chapuis 337d2aea58 cosmetics 2023-08-23 17:49:59 +02:00
Pierre Chapuis 16618d73de remove useless uses of type: ignore 2023-08-23 17:49:59 +02:00
Pierre Chapuis 1065dfe10b add empty __init__.py files to make pytest happy
(otherwise it wants unique file basenames)
2023-08-23 17:49:59 +02:00
Pierre Chapuis a0c70ba7aa add a test for StopIteration in walk 2023-08-23 12:15:56 +02:00
Pierre Chapuis dec0d64432 make walk and layers not recurse by default
There is now a parameter to get the old (recursive) behavior.
2023-08-23 12:15:56 +02:00
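The change above can be sketched with a toy container: `walk` yields only direct children by default, and a `recurse` flag restores the old recursive traversal. This is an illustrative sketch, not the actual Refiners `Chain` API; all names here are hypothetical.

```python
# Illustrative only: a toy tree whose walk() is shallow by default,
# with recurse=True restoring the old full-subtree behavior.
class Module:
    def __init__(self, name: str) -> None:
        self.name = name

class Chain(Module):
    def __init__(self, name: str, *children: Module) -> None:
        super().__init__(name)
        self.children = list(children)

    def walk(self, recurse: bool = False):
        for child in self.children:
            yield child
            if recurse and isinstance(child, Chain):
                yield from child.walk(recurse=True)

inner = Chain("inner", Module("a"), Module("b"))
outer = Chain("outer", inner, Module("c"))
print([m.name for m in outer.walk()])              # direct children only
print([m.name for m in outer.walk(recurse=True)])  # full subtree
```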
Pierre Chapuis 2ad26a06b0 fix LoRAs on Self target 2023-08-23 12:13:01 +02:00
limiteinductive 3565a4127f implement DoubleTextEncoder for SDXL 2023-08-23 11:05:38 +02:00
Cédric Deltheil 71ddb55a8e infer device and dtype in LoraAdapter 2023-08-22 11:55:39 +02:00
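For context on the LoRA-related commits: LoRA adapts a frozen weight `W` with a low-rank update, `W' = W + (alpha / r) * B @ A`. A minimal pure-Python sketch under that standard formulation; the function names and the zero-initialized `B` convention shown here are illustrative, not the `LoraAdapter` API.

```python
def matmul(a: list[list[float]], b: list[list[float]]) -> list[list[float]]:
    """Naive matrix product, sufficient for tiny illustrative matrices."""
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)] for row in a]

def lora_weight(W, A, B, alpha: float = 1.0):
    """Effective weight W + (alpha / r) * B @ A, where r is the LoRA rank."""
    r = len(A)  # A has shape (r, in_features), B has shape (out_features, r)
    delta = matmul(B, A)
    scale = alpha / r
    return [[w + scale * d for w, d in zip(w_row, d_row)] for w_row, d_row in zip(W, delta)]

W = [[1.0, 2.0], [3.0, 4.0]]
A = [[1.0, 0.0]]         # rank r = 1
B_zero = [[0.0], [0.0]]  # B initialized to zero, so the adapter starts as a no-op
print(lora_weight(W, A, B_zero))
```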
Benjamin Trom 8c7298f8cc fix chain slicing with structural copy 2023-08-22 11:44:11 +02:00
limiteinductive e7c1db50e0 turn CLIPTokenizer into a fl.Module 2023-08-22 00:09:01 +02:00
Cédric Deltheil 1ad4e1a35a converter: add missing structural_attrs 2023-08-21 16:04:12 +02:00
Cédric Deltheil b91a457495 use Converter layer for sinusoidal embedding 2023-08-21 16:04:12 +02:00
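The embedding referenced above is the standard sinusoidal scheme (half sines, half cosines over a geometric range of frequencies); the commit wraps its computation with a Converter layer for dtype/device handling. A self-contained sketch of the math, with illustrative names and layout:

```python
import math

def sinusoidal_embedding(t: float, dim: int, max_period: float = 10_000.0) -> list[float]:
    """Standard sinusoidal embedding: [sin(t * f_i)...] + [cos(t * f_i)...]."""
    half = dim // 2
    freqs = [math.exp(-math.log(max_period) * i / half) for i in range(half)]
    return [math.sin(t * f) for f in freqs] + [math.cos(t * f) for f in freqs]

emb = sinusoidal_embedding(10.0, dim=8)
print(len(emb), emb[:2])
```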
limiteinductive 108fa8f26a add converter layer + tests 2023-08-21 12:09:58 +02:00
limiteinductive 4526d58cd5 update CTOR of CLIPTextEncoder with max_sequence_length 2023-08-21 11:21:12 +02:00
limiteinductive 6fd5894caf split PositionalTokenEncoder 2023-08-21 11:21:12 +02:00
limiteinductive 9d663534d1 cosmetic changes for text_encoder.py 2023-08-21 11:21:12 +02:00
limiteinductive b8e7179447 make clip g use quick gelu and pad_token_id 0 2023-08-17 17:31:15 +02:00
limiteinductive 6594502c11 parametrize tokenizer for text_encoder 2023-08-17 17:31:15 +02:00
limiteinductive 4575e3dd91 add start, end and pad tokens as parameter 2023-08-17 17:31:15 +02:00
limiteinductive 63fda2bfd8 add use_quick_gelu kwarg for CLIPTextEncoder 2023-08-17 17:31:15 +02:00
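Quick GELU, referenced in the commits above, is the sigmoid-based approximation used by the original CLIP implementation, `x * sigmoid(1.702 * x)`, as opposed to the exact GELU `x * Phi(x)`. A standalone comparison of the two (plain-Python sketch, not the library's layer):

```python
import math

def sigmoid(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

def gelu(x: float) -> float:
    # exact GELU: x * Phi(x), with Phi the standard normal CDF
    return x * 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def quick_gelu(x: float) -> float:
    # sigmoid approximation used by the original CLIP implementation
    return x * sigmoid(1.702 * x)

for x in (-2.0, 0.0, 1.0, 3.0):
    print(f"{x:+.1f}  gelu={gelu(x):+.4f}  quick={quick_gelu(x):+.4f}")
```

The two curves are close everywhere, but checkpoints trained with one variant should be run with the same variant, hence the kwarg.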
limiteinductive efe923a272 cosmetic changes 2023-08-17 17:31:15 +02:00
limiteinductive 17dc75421b make basic layers an enum and work with subtyping 2023-08-17 15:36:43 +02:00
limiteinductive 9da00e6fcf fix typing for informative drawings convert script 2023-08-17 14:44:45 +02:00
Pierre Chapuis 0fd46f9ec4 make type checking strict 2023-08-17 14:44:45 +02:00
Benjamin Trom 663d7c414e Update pyproject.toml
Co-authored-by: Cédric Deltheil <355031+deltheil@users.noreply.github.com>
2023-08-17 14:44:45 +02:00
Benjamin Trom 2ee094c18c Update scripts/convert-lora-weights.py
Co-authored-by: Cédric Deltheil <355031+deltheil@users.noreply.github.com>
2023-08-17 14:44:45 +02:00
limiteinductive c9fba44f39 fix typing for scripts 2023-08-17 14:44:45 +02:00
limiteinductive 89224c1e75 activate typing for all scripts 2023-08-17 14:44:45 +02:00
Pierre Chapuis 97b162d9a0 add InformativeDrawings
https://github.com/carolineec/informative-drawings

This is the preprocessor for the Lineart ControlNet.
2023-08-16 12:29:09 +02:00
Pierre Chapuis e10f761a84 GroupNorm and LayerNorm must be affine to be WeightedModules 2023-08-16 12:29:09 +02:00
Cédric Deltheil 32425016c8 add missing torchvision dependency for training
Via:

    poetry add --optional torchvision
    poetry lock
2023-08-08 15:53:35 +02:00
Pierre Chapuis bd49304fc8 add Sigmoid activation 2023-08-07 19:56:28 +02:00
Cédric Deltheil f49bb4f5fd pyproject.toml: bump pydantic to 2.0.3 2023-08-04 19:44:56 +02:00
Cédric Deltheil a307da1983 update README.md
Change image links
2023-08-04 19:18:07 +02:00
Pierre Chapuis 84cde77825 add documentation about Adapters 2023-08-04 18:49:11 +02:00
Cédric Deltheil 48f674c433 initial commit 2023-08-04 15:28:41 +02:00