Questions about mask usage in convolution #13

Open
antao97 opened this issue Aug 12, 2022 · 0 comments

Hi,

Thanks for your great work!

I have some questions about the mask usage in your convolution operation. I'm wondering what the purpose of assigning mask to conv_module.__mask__ is, since as far as I can tell the conv_module(x) call does not use the conv_module.__mask__ attribute during its forward pass (see the small standalone check after the snippet below).

def conv1x1(conv_module, x, mask, fast=False):
    w = conv_module.weight.data
    mask.flops_per_position += w.shape[0]*w.shape[1]
    conv_module.__mask__ = mask
    return conv_module(x)
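
For reference, here is a small standalone PyTorch check (not using your repo, with a made-up __mask__ tensor) showing that attaching an extra attribute to an nn.Conv2d does not change what its forward pass computes:

import torch
import torch.nn as nn

# Minimal standalone check: attaching an arbitrary attribute (here a tensor
# named __mask__) to a Conv2d module does not influence its forward pass.
conv = nn.Conv2d(8, 16, kernel_size=1, bias=False)
x = torch.randn(1, 8, 32, 32)

y_before = conv(x)
conv.__mask__ = torch.zeros(1, 1, 32, 32)  # analogous to the assignment in conv1x1
y_after = conv(x)

print(torch.equal(y_before, y_after))  # prints True: the attribute is ignored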

Therefore, I can't see how the masks are applied during the network's forward propagation, for example in the BasicBlock:

x = dynconv.conv3x3(self.conv1, x, None, mask_dilate)
x = dynconv.bn_relu(self.bn1, self.relu, x, mask_dilate)
x = dynconv.conv3x3(self.conv2, x, mask_dilate, mask)
x = dynconv.bn_relu(self.bn2, None, x, mask)
out = identity + dynconv.apply_mask(x, mask)

It seems that only the mask passed to dynconv.apply_mask(x, mask) actually affects the output.
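
If my reading is right, a simplified version of what the block computes would be the sketch below, where both convolutions run densely over all spatial positions and only the final multiplication by the hard mask changes the result (this assumes dynconv.apply_mask essentially multiplies x by the expanded hard mask; please correct me if that assumption is wrong):

import torch
import torch.nn as nn

# Hypothetical simplification of the BasicBlock forward as I currently read it.
# mask_dilate and mask are only attached to the conv modules for bookkeeping,
# so the convolutions themselves stay dense; only the last multiplication
# depends on the mask.
def basicblock_forward(x, conv1, bn1, conv2, bn2, relu, hard_mask):
    identity = x
    out = relu(bn1(conv1(x)))          # mask_dilate attached but not consumed
    out = bn2(conv2(out))              # mask attached but not consumed
    return identity + out * hard_mask  # the only mask-dependent step

# Toy usage with made-up shapes:
conv1 = nn.Conv2d(16, 16, 3, padding=1, bias=False)
conv2 = nn.Conv2d(16, 16, 3, padding=1, bias=False)
bn1, bn2 = nn.BatchNorm2d(16), nn.BatchNorm2d(16)
relu = nn.ReLU()
x = torch.randn(2, 16, 8, 8)
hard_mask = (torch.rand(2, 1, 8, 8) > 0.5).float()  # broadcast over channels
out = basicblock_forward(x, conv1, bn1, conv2, bn2, relu, hard_mask)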
