PartialNet

This repository is the official implementation of "Partial Channel Network: Compute Fewer, Perform Better", which includes training, evaluation, and other related scripts.

Figure 1: Comparison of different operation types.

Figure 2: The overall architecture of our PartialNet.

Figure 3: Our PartialNet achieves a better trade-off between accuracy and throughput on ImageNet-1K.

📋 Designing a network module that maintains low parameters and FLOPs without sacrificing accuracy and throughput is challenging. To address this, we propose the Partial Channel Mechanism (PCM), which splits feature map channels into parts for different operations like convolution, attention, pooling, and identity mapping. Based on this, we introduce Partial Attention Convolution (PATConv), as depicted in Figure 2, which efficiently combines convolution with visual attention, reducing parameters and FLOPs while maintaining performance. PATConv gives rise to three new blocks: Partial Channel-Attention (PAT_ch), Partial Spatial-Attention (PAT_sp), and Partial Self-Attention (PAT_sf). Additionally, we propose Dynamic Partial Convolution (DPConv), which adaptively learns channel splits across layers for optimal trade-offs. Together, PATConv and DPConv form the PartialNet hybrid network family, as depicted in Figure 2, which outperforms SOTA models in both ImageNet-1K classification and COCO detection and segmentation, as depicted in Figure 3.
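To make the Partial Channel Mechanism concrete, here is a minimal PyTorch sketch of the idea: convolve only a fraction of the channels, pass the rest through an identity mapping, and reweight the recombined tensor with a simple SE-style channel attention. The class name, the split ratio, and the attention head are illustrative assumptions, not the paper's exact implementation.

```python
import torch
import torch.nn as nn

class PartialChannelAttentionConv(nn.Module):
    """Illustrative sketch of the Partial Channel Mechanism (PCM):
    a convolution branch on part of the channels, an identity branch
    on the rest, followed by channel attention on the rejoined tensor.
    Hypothetical names/ratio; not the official PartialNet code."""

    def __init__(self, channels, split_ratio=0.25, kernel_size=3):
        super().__init__()
        self.conv_ch = int(channels * split_ratio)  # channels that get convolved
        self.conv = nn.Conv2d(self.conv_ch, self.conv_ch, kernel_size,
                              padding=kernel_size // 2, bias=False)
        # simple squeeze-and-excitation style channel attention
        self.attn = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        # split channels into a convolution branch and an identity branch
        x1, x2 = torch.split(x, [self.conv_ch, x.size(1) - self.conv_ch], dim=1)
        x1 = self.conv(x1)                # convolve only the first part
        out = torch.cat([x1, x2], dim=1)  # identity branch rejoined untouched
        return out * self.attn(out)       # reweight all channels

x = torch.randn(2, 64, 32, 32)
y = PartialChannelAttentionConv(64)(x)
print(y.shape)  # torch.Size([2, 64, 32, 32])
```

Because only `split_ratio` of the channels pass through the k×k convolution, the branch's parameters and FLOPs shrink roughly quadratically with the ratio, while the attention term restores cross-channel interaction. DPConv, by contrast, would learn this split per layer rather than fixing it.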

Code structure

  • code
    • models   -->The core scripts of the network models.
    • cfg   -->Configuration files for the PartialNet variants.
    • data   -->Dataset processing and augmentation scripts.
    • detection   -->Scripts for detection and segmentation.
  • data
    • cifar10
    • imagenet
    • coco2017

Requirements

We have tested the code on the following environments and settings:

  • Python 3.10.13 / PyTorch (>=1.6.0) / torchvision (>=0.7.0)
  • Prepare ImageNet-1K data following the official PyTorch example.
  • Prepare COCO 2017 data following the official PyTorch example.

To install requirements:

pip install -r requirements.txt

📋 Set up the environment first; e.g., we use conda to manage our environment.

Training

To train the model(s) in the paper, run this command:

For classification:

CIFAR-10:
    python code/PartialNet/train_test.py --gpus [0,1,2,3] --cfg code/PartialNet/cfg/cifar10-PartialNet_t0.yaml ... etc.
ImageNet:
    python code/PartialNet/train_test.py --gpus [0,1,2,3] --cfg code/PartialNet/cfg/PartialNet_t0.yaml ... etc.

For detection and segmentation:

    python code/PartialNet/detection/train.py  --gpus [0,1,2,3] --cfg code/PartialNet/cfg/PartialNet_t0.yaml ... etc.

Evaluation

To evaluate a trained model on a dataset, add the "--test_phase" and "--checkpoint_path" flags and run:

python code/PartialNet/train_test.py --test_phase --checkpoint_path "path to your trained checkpoint" --gpus [0,1,2,3] --cfg code/PartialNet/cfg/cifar10-PartialNet_t0.yaml ... etc.

Pre-trained Models

You can download pretrained models here:

  • coming soon.

Results

Please refer to our paper.

Reference

For technical details and full experimental results, please see the PartialNet paper.

@misc{huang2025partialchannelnetworkcompute,
      title={Partial Channel Network: Compute Fewer, Perform Better}, 
      author={Haiduo Huang and Tian Xia and Wenzhe Zhao and Pengju Ren},
      year={2025},
      eprint={2502.01303},
      archivePrefix={arXiv},
      primaryClass={cs.CV},
      url={https://arxiv.org/abs/2502.01303}, 
}
