A novel generative approach for controllable part-based generation of objects from multiple categories, all using a single unified model.

MeronymNet

A Hierarchical Model for Unified and Controllable Multi-Category Object Generation

We introduce MeronymNet, a novel hierarchical approach for controllable, part-based generation of multi-category objects using a single unified model. We adopt a guided coarse-to-fine strategy involving semantically conditioned generation of bounding box layouts, pixel-level part layouts and ultimately, the object depictions themselves. We use Graph Convolutional Networks, Deep Recurrent Networks along with custom-designed Conditional Variational Autoencoders to enable flexible, diverse and category-aware generation of 2-D objects in a controlled manner. The performance scores for generated objects reflect MeronymNet's superior performance compared to multiple strong baselines and ablative variants.
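The coarse-to-fine idea above (category → bounding-box layout → pixel-level part layout → object depiction) can be sketched as three chained stages. This is a minimal, illustrative mock-up only: every function name, shape, and sampling step below is a hypothetical stand-in, not the repository's actual API, and the learned networks (GCN, RNN, Conditional VAE) are replaced here by random draws purely to show the data flow.

```python
# Hypothetical sketch of MeronymNet's three-stage pipeline; the real
# stages are learned networks, mocked here with random sampling.
import numpy as np

rng = np.random.default_rng(0)

def generate_box_layout(category_label, num_parts):
    """Stage 1 (mocked): one bounding box (x, y, w, h) per part,
    conditioned on the object category in the real model."""
    boxes = rng.uniform(0.0, 1.0, size=(num_parts, 4))
    boxes[:, 2:] *= 0.5  # keep widths/heights modest
    return boxes

def generate_part_masks(boxes, canvas_size=64):
    """Stage 2 (mocked): rasterise each box into a pixel-level part
    label map (the real model predicts free-form part layouts)."""
    layout = np.zeros((canvas_size, canvas_size), dtype=np.int32)
    for part_id, (x, y, w, h) in enumerate(boxes, start=1):
        x0, y0 = int(x * canvas_size), int(y * canvas_size)
        x1 = min(canvas_size, x0 + max(1, int(w * canvas_size)))
        y1 = min(canvas_size, y0 + max(1, int(h * canvas_size)))
        layout[y0:y1, x0:x1] = part_id
    return layout

def render_object(part_layout):
    """Stage 3 (mocked): map the part layout to an RGB depiction
    (here, a fixed colour per part id on a white background)."""
    palette = rng.uniform(0, 255, size=(part_layout.max() + 1, 3))
    palette[0] = 255.0  # white background
    return palette[part_layout].astype(np.uint8)

boxes = generate_box_layout("cow", num_parts=5)
layout = generate_part_masks(boxes)
image = render_object(layout)
```

Each stage consumes the previous stage's output, which is what lets the actual model condition finer-grained generation on coarser structure.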

Paper


Code

Prerequisites:

  • NVIDIA GPU + CUDA CuDNN
  • Python 3.6
  • TensorFlow 1.15
  • PyTorch 1.0
  • Install the dependencies by running:
pip install -r requirements.txt

After installing the dependencies above, run the bash scripts described below.

The code for training and inference of our models is in the Meronymnet directory; the repurposed baselines can be found in baselines, and the scripts for metrics and visualisation are in experiment_scripts.

To fetch and preprocess the data, run:

sh preprocess.sh

To train the model, run the training script:

sh train.sh

To obtain generations from the inference files, place the pretrained models in the directory and run the inference script:

sh inference.sh

Datasets

PascalParts
