Double sparse pruning

Code for the ICLR 2025 paper "Two Sparse Matrices are Better than One: Sparsifying Neural Networks with Double Sparse Factorization".

This repository is based on the SparseGPT code (https://github.com/IST-DASLab/sparsegpt).
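To make the paper's objective concrete, here is a minimal sketch (not the paper's solver, which is considerably more careful): it approximates a dense weight matrix W by a product A @ B of two sparse factors, alternating between a least-squares solve for one factor and magnitude pruning of that factor. The function names, the inner dimension r, and the use of plain alternating least squares are all illustrative assumptions.

import torch

def sparsify(M, sparsity):
    # Keep the largest-magnitude entries of M; zero out a `sparsity` fraction.
    k = int(M.numel() * sparsity)
    thresh = M.abs().flatten().kthvalue(k).values
    return M * (M.abs() > thresh)

def double_sparse_factorize(W, r, sparsity=0.5, iters=20):
    # Illustrative alternating scheme, not the paper's algorithm.
    m, n = W.shape
    A = torch.randn(m, r) / r ** 0.5
    for _ in range(iters):
        # Fix A, solve min_B ||A @ B - W||_F, then prune B.
        B = sparsify(torch.linalg.lstsq(A, W).solution, sparsity)
        # Fix B, solve the transposed system for A, then prune A.
        A = sparsify(torch.linalg.lstsq(B.T, W.T).solution.T, sparsity)
    return A, B

W = torch.randn(64, 64)
A, B = double_sparse_factorize(W, r=64)
print(f"relative error: {(torch.linalg.norm(W - A @ B) / torch.linalg.norm(W)):.3f}")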

Dependencies

  • torch: tested on v2.2.1
  • transformers: tested on v4.35.2
  • datasets: tested on v2.16.1
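They can be installed with pip, pinned to the tested versions above (newer compatible releases may also work):

pip install torch==2.2.1 transformers==4.35.2 datasets==2.16.1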

Usage

We provide a LLaMA pruning script with the same command-line interface as the original SparseGPT scripts:

# Sparsify LLaMA with SparseGPT
python llama.py meta-llama/Llama-2-7b-hf c4 --sparsity 0.5
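If the pruned model is saved in Hugging Face format (whether and how llama.py saves a checkpoint depends on its flags; the path below is purely illustrative), the achieved sparsity can be checked with standard transformers/PyTorch code:

import torch
from transformers import AutoModelForCausalLM

# Illustrative path; point it at wherever the pruned checkpoint was saved.
model = AutoModelForCausalLM.from_pretrained("pruned-llama-2-7b", torch_dtype=torch.float16)

zeros = total = 0
for module in model.modules():
    if isinstance(module, torch.nn.Linear):
        zeros += (module.weight == 0).sum().item()
        total += module.weight.numel()
print(f"overall linear-layer sparsity: {zeros / total:.2%}")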

Other experiments

To replicate the other experiments (comparison with OBC and post-training pruning with finetuning), see the other_experiments directory.
