A problem about the code, thanks #30
Comments
I have the same question. The first for loop modifies the following blocks:
The second for loop modifies:
This is a subset of the blocks touched by the first for loop. According to the comment, the first block at the lowest resolution should not have extended attention registered, yet the first for loop registers extended attention for that block as well.
Same question here.
I think the relevant function is at Lines 203 to 214 in 8ae24e9.
The injection is activated according to Lines 124 to 130 in 8ae24e9
Lines 86 to 91 in 8ae24e9
By the way, I tried removing the first loop (L203-L206) and found the result was unchanged. However, removing the second loop (L208-L214) made the result worse.
It seems that you change all the BasicTransformerBlock modules in down_blocks, mid_block and up_blocks. Why then change the up_blocks of the UNet again?