A microframework on top of PyTorch with first-class citizen APIs for foundation model adaptation https://refine.rs/

Finegrain Refiners Library

The simplest way to train and run adapters on top of foundation models

Manifesto | Docs | Guides | Discussions | Discord


Latest News 🔥

Installation

The current recommended way to install Refiners is from source using Rye:

```shell
git clone "git@github.com:finegrain-ai/refiners.git"
cd refiners
rye sync --all-features
```

Documentation

Refiners comes with a MkDocs-based documentation website available at https://refine.rs. There you will find a quick start guide, a description of the key concepts, and in-depth foundation model adaptation guides.
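To give a feel for what "adapting a foundation model" means in practice, here is a minimal LoRA-style sketch in plain PyTorch. Note this is not Refiners' own API (see the docs above for that); the `LoRALinear` class and its parameters are hypothetical names used purely for illustration of the general technique: freeze the base weights and train a small low-rank update alongside them.

```python
# Conceptual sketch of a LoRA-style adapter in plain PyTorch.
# This is NOT Refiners' API -- just an illustration of the kind of
# foundation model adaptation the library streamlines.
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Wraps a frozen linear layer with a trainable low-rank update."""

    def __init__(self, base: nn.Linear, rank: int = 4, scale: float = 1.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False  # the foundation weights stay frozen
        # low-rank decomposition: down-project to `rank`, then up-project back
        self.down = nn.Linear(base.in_features, rank, bias=False)
        self.up = nn.Linear(rank, base.out_features, bias=False)
        nn.init.zeros_(self.up.weight)  # adapter starts as a no-op
        self.scale = scale

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.base(x) + self.scale * self.up(self.down(x))

layer = LoRALinear(nn.Linear(16, 16), rank=4)
x = torch.randn(2, 16)
```

Only the `down` and `up` projections receive gradients during training, so the adapter adds a handful of parameters on top of an otherwise untouched base model. Refiners generalizes this pattern with first-class adapter APIs on top of its Chain abstraction.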

Awesome Adaptation Papers

If you're interested in understanding the diversity of use cases for foundation model adaptation (potentially beyond the specific adapters supported by Refiners), we suggest you take a look at these outstanding papers:

Projects using Refiners

Credits

We took inspiration from these great projects:

Citation

```bibtex
@misc{the-finegrain-team-2023-refiners,
  author = {Benjamin Trom and Pierre Chapuis and Cédric Deltheil},
  title = {Refiners: The simplest way to train and run adapters on top of foundation models},
  year = {2023},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/finegrain-ai/refiners}}
}
```