
Wrap Flux's call with diffusion_model wrappers similar to the Unet model #7382


Open · wants to merge 4 commits into master

Conversation

kabachuha

Hi!

I'm currently trying to mod Flux, and I found that ComfyUI doesn't have a diffusion-model patcher wrapper for it like the vanilla UNet model does in comfy/ldm/modules/openaimodel.py. The job of this hook is to wrap the diffusion model's call with activatable hooks, of which there can be many.

This makes modding, such as wrapping Flux with Reference-only ControlNet hooks, very hard without big structural changes.
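To illustrate the requested mechanism, here is a minimal sketch of a forward call that consults transformer options for an optional wrapper before running the real model. The function names and the options-dict layout are illustrative assumptions, not ComfyUI's actual API:

```python
# Hypothetical sketch of a "model function wrapper" hook: the model's
# entry point checks transformer_options for a wrapper callable and,
# if present, delegates the whole call to it. Names are illustrative.

def inner_forward(x, timestep, context):
    # stand-in for the real Flux forward pass
    return x + timestep

def model_forward(x, timestep, context, transformer_options=None):
    transformer_options = transformer_options or {}
    wrapper = transformer_options.get("model_function_wrapper")
    if wrapper is not None:
        # The wrapper receives the unwrapped forward plus its arguments,
        # so it can pre/post-process the call or replace it entirely.
        return wrapper(inner_forward,
                       {"input": x, "timestep": timestep, "c": context})
    return inner_forward(x, timestep, context)
```

With this shape, an extension only needs to place a callable into the options dict; the model itself never has to know what the hook does.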

I think this important yet small addition of the hook will benefit the extension-building community a lot, while not breaking compatibility.

Additionally, I would like the program to pass transformer options into the Single Stream and Double Stream blocks, so that modded information (like reference images) can reach their patched versions without rewriting Flux's whole forward method. This would also be consistent with how BasicTransformerBlock works in comfy/ldm/modules/attention for vanilla models.
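The block-level part of the request can be sketched the same way: the stream block accepts the options dict and runs any registered per-block patches, similar in spirit to how BasicTransformerBlock exposes patch points. The class and patch-key names below are hypothetical, chosen only for illustration:

```python
# Hypothetical sketch: threading transformer_options down into a stream
# block so block-level patches (e.g. carrying reference-image state) can
# run inside the forward pass. Names are illustrative, not Flux's real ones.

class DoubleStreamBlock:
    def forward(self, img, txt, transformer_options=None):
        transformer_options = transformer_options or {}
        patches = transformer_options.get("patches", {})
        # Each registered patch may transform the image/text streams and
        # can read extra state out of transformer_options.
        for patch in patches.get("double_block", []):
            img, txt = patch(img, txt, transformer_options)
        return img, txt
```

Because the options dict is threaded through unchanged, a patched block and an unpatched one share the same call signature, which is what keeps the change backward compatible.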

@Slickytail
Contributor

This would be very helpful to anyone modding Flux.

2 participants