Merge branch 'master'
zeiss-dhaase committed Apr 8, 2020
2 parents caf7e1a + 4f63bd2 commit fed2bf1
Showing 1 changed file with 4 additions and 4 deletions.
8 changes: 4 additions & 4 deletions bsconv/pytorch/README.md
@@ -106,7 +106,7 @@ BSConv PyTorch Modules

We provide two PyTorch modules `bsconv.pytorch.BSConvU` (unconstrained BSConv) and `bsconv.pytorch.BSConvS` (subspace BSConv), which can be used as drop-in replacements for `torch.nn.Conv2d` layers.
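
As a minimal sketch of the drop-in usage (the channel counts and input size below are only placeholders, and the module is assumed to mirror the `torch.nn.Conv2d` argument defaults), a plain convolution can be swapped for a BSConv module without changing the surrounding code:

```python
import torch
import bsconv.pytorch

# a standard 3x3 convolution ...
conv = torch.nn.Conv2d(32, 64, kernel_size=3, padding=1)

# ... replaced one-to-one by an unconstrained BSConv-U module
conv_u = bsconv.pytorch.BSConvU(32, 64, kernel_size=3, padding=1)

x = torch.randn(1, 32, 56, 56)
print(conv(x).shape)    # torch.Size([1, 64, 56, 56])
print(conv_u(x).shape)  # same output shape as the plain Conv2d
```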

-### Example 1: Building a simple custom model with unconstrained BSConvU modules:
+### Example 1: Building a simple custom model with unconstrained BSConv-U modules:

```python
import torch
@@ -146,7 +146,7 @@ class SimpleNet(torch.nn.Module):
        return x
```

-### Example 2: Building a simple custom model with subspace BSConvS modules:
+### Example 2: Building a simple custom model with subspace BSConv-S modules:

To easily apply the orthonormal regularization loss to each module, the model must be derived not only from `torch.nn.Module` as usual, but also from the mixin class `bsconv.pytorch.BSConvS_ModelRegLossMixin`; a training-loop sketch that uses this regularization loss follows the code below.

@@ -159,8 +159,8 @@ class SimpleNet(torch.nn.Module, bsconv.pytorch.BSConvS_ModelRegLossMixin):
    def __init__(self, num_classes=1000):
        super().__init__()
        self.features = torch.nn.Sequential(
-            # using a BSConvU module as the first conv layer,
-            # since compressing a 3 channel input with BSConvS would be overkill
+            # using a BSConv-U module as the first conv layer,
+            # since compressing a 3 channel input with BSConv-S would be overkill
            bsconv.pytorch.BSConvU(3, 32, kernel_size=3, stride=2, padding=1),
            torch.nn.BatchNorm2d(num_features=32),
            torch.nn.ReLU(inplace=True),
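
Assuming the mixin exposes the accumulated orthonormal regularization loss through a method (taken here to be `reg_loss()` with a weighting argument `alpha`; the exact name and signature should be checked against the full README), a training step would add it to the task loss roughly like this:

```python
import torch
import bsconv.pytorch

model = SimpleNet(num_classes=1000)                 # the model sketched above
criterion = torch.nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)

inputs = torch.randn(8, 3, 224, 224)                # dummy batch
targets = torch.randint(0, 1000, (8,))

optimizer.zero_grad()
outputs = model(inputs)
loss = criterion(outputs, targets)
# add the orthonormal regularization loss collected by the mixin
# (method name and `alpha` weighting are assumptions, see the full README)
loss = loss + model.reg_loss(alpha=0.1)
loss.backward()
optimizer.step()
```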
