iantimmis/DropBlock-Unofficial-Implementation

DropBlock Unofficial Implementation

This project contains an unofficial set of implementations of "DropBlock: A regularization method for convolutional networks" from Google Brain. DropBlock is a variant of dropout that removes contiguous regions from feature maps instead of individual activations. These layers can be used to regularize convolutional networks across multiple machine learning frameworks.

Requirements

DropBlock itself only depends on NumPy. You will also need at least one of the supported deep learning frameworks if you want to run the examples or tests.

Install the packages you require, e.g.:

pip install numpy tensorflow torch jax

About

The paper proposes dropping contiguous spatial blocks of activations during training so that nearby units cannot simply co-adapt. In practice, a binary mask is sampled with a per-unit drop probability gamma, and each sampled point is expanded into a square block_size x block_size region of zeros that is multiplied into the feature map. This has been shown to improve generalization on several vision benchmarks.
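The sampling step described above can be sketched in plain NumPy. This is a hypothetical illustration, not this repository's API; the formula for gamma follows the paper, which rescales the drop rate to account for block expansion and restricts block centers so every block fits inside the feature map.

```python
import numpy as np

def dropblock_mask(h, w, block_size, keep_prob, rng=None):
    """Sample a DropBlock mask for one (H, W) feature map.

    Hypothetical helper for illustration only; assumes odd block_size.
    """
    rng = np.random.default_rng() if rng is None else rng
    # gamma from the paper: rescale (1 - keep_prob) so that, after each
    # sampled point grows into a block_size x block_size region, roughly
    # that fraction of units ends up dropped.
    gamma = ((1.0 - keep_prob) / block_size**2) \
        * (h * w) / ((h - block_size + 1) * (w - block_size + 1))
    # Only sample block centers where a full block fits inside the map.
    half = block_size // 2
    valid = np.zeros((h, w), dtype=bool)
    valid[half:h - half, half:w - half] = True
    centers = (rng.random((h, w)) < gamma) & valid
    # Expand each sampled center into a zeroed square block.
    mask = np.ones((h, w), dtype=np.float32)
    for i, j in zip(*np.nonzero(centers)):
        mask[i - half:i + half + 1, j - half:j + half + 1] = 0.0
    return mask

mask = dropblock_mask(16, 16, block_size=5, keep_prob=0.9)
```

Note that with block_size=1 this degenerates to ordinary dropout, since each "block" is a single activation.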

This repository is provided for educational purposes and is not an official release from the authors.


Quick start

Each framework has its own API under the dropblock package.

PyTorch

from dropblock.torch_dropblock import DropBlock2D
layer = DropBlock2D(block_size=5, keep_prob=0.9)

TensorFlow / Keras

from dropblock.tf_dropblock import DropBlock2D
layer = DropBlock2D(block_size=5, keep_prob=0.9)

JAX

from dropblock.jax_dropblock import dropblock2d
output = dropblock2d(x, block_size=5, keep_prob=0.9, training=True)
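Whichever backend you use, the mask is applied multiplicatively, and the paper rescales the surviving activations by count(mask) / count_ones(mask) so their expected magnitude matches evaluation time, when no units are dropped. A minimal NumPy sketch of that normalization (hypothetical helper, not this repo's API):

```python
import numpy as np

def apply_dropblock(x, mask):
    # Zero out blocked regions, then rescale the survivors by
    # total units / kept units so the expected activation
    # magnitude is unchanged (as in the paper's normalization).
    kept = mask.sum()
    return x * mask * (mask.size / max(kept, 1.0))

x = np.ones((8, 8), dtype=np.float32)
mask = np.ones((8, 8), dtype=np.float32)
mask[2:5, 2:5] = 0.0          # one dropped 3x3 block
y = apply_dropblock(x, mask)  # mean of y stays 1.0
```

At inference time the layer is a no-op: the input passes through unscaled, just like standard dropout.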

Testing

Run the unit tests (each test automatically skips if its corresponding framework is not installed):

pytest -q
