
Pruning (artificial neural network)

From Wikipedia, the free encyclopedia

In the context of artificial neural networks, pruning is the practice of removing parameters from an existing network, either individually or in groups (for example, entire neurons or channels).[1] The goal is to preserve the network's accuracy while increasing its efficiency, reducing the computational resources required to run it.
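
As an illustration of removing individual parameters, the sketch below (Python with NumPy) zeroes out a chosen fraction of the smallest-magnitude weights in a single weight matrix. The function name, the sparsity parameter, and the magnitude-threshold heuristic are illustrative assumptions for the example, not a prescribed method.

  import numpy as np

  def magnitude_prune(weights, sparsity=0.9):
      # Zero out the `sparsity` fraction of weights with the smallest magnitude.
      threshold = np.quantile(np.abs(weights), sparsity)
      mask = np.abs(weights) >= threshold
      return weights * mask

  W = np.random.default_rng(0).normal(size=(64, 64))
  W_pruned = magnitude_prune(W, sparsity=0.9)
  print(np.mean(W_pruned == 0))  # roughly 0.9 of the entries are now zero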

A basic algorithm for pruning is as follows (a minimal sketch of this procedure is given after the list):[2][3]

  1. Evaluate the importance of each neuron.
  2. Rank the neurons according to their importance (assuming there is a clearly defined measure for "importance").
  3. Remove the least important neuron.
  4. Check a termination condition (to be determined by the user) to see whether to continue pruning.
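
The following is a minimal sketch of this loop for a single hidden layer, in Python with NumPy. It uses the L2 norm of a neuron's incoming weights as the importance measure and a fixed fraction of neurons to keep as the termination condition; these choices, like the function and parameter names, are illustrative assumptions rather than part of any standard algorithm.

  import numpy as np

  def prune_hidden_neurons(W_in, W_out, keep_ratio=0.5):
      # W_in:  shape (hidden, inputs)  - weights into the hidden layer
      # W_out: shape (outputs, hidden) - weights out of the hidden layer
      # Step 1: score each hidden neuron by the L2 norm of its incoming weights.
      importance = np.linalg.norm(W_in, axis=1)
      # Step 2: rank neurons from most to least important.
      order = np.argsort(importance)[::-1]
      # Step 4 (termination): keep only a fixed fraction of the neurons.
      n_keep = max(1, int(keep_ratio * W_in.shape[0]))
      # Step 3: remove the least important neurons by dropping their rows/columns.
      keep = np.sort(order[:n_keep])
      return W_in[keep, :], W_out[:, keep]

  rng = np.random.default_rng(0)
  W_in = rng.normal(size=(8, 4))    # 8 hidden neurons, 4 inputs
  W_out = rng.normal(size=(3, 8))   # 3 outputs
  W_in_p, W_out_p = prune_hidden_neurons(W_in, W_out, keep_ratio=0.5)
  print(W_in_p.shape, W_out_p.shape)  # (4, 4) (3, 4)

In practice, the pruned network is typically fine-tuned after each round of removal to recover accuracy before the loop repeats.[2]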

Recently, a highly pruned three-layer tree architecture achieved a success rate similar to that of LeNet-5 on the CIFAR-10 dataset with lower computational complexity.[4]

References

  1. Blalock, Davis; Ortiz, Jose Javier Gonzalez; Frankle, Jonathan; Guttag, John (2020-03-06). "What is the State of Neural Network Pruning?". arXiv:2003.03033 [cs.LG].
  2. Molchanov, Pavlo; Tyree, Stephen; Karras, Tero; Aila, Timo; Kautz, Jan (2016). "Pruning Convolutional Neural Networks for Resource Efficient Inference". arXiv:1611.06440 [cs.LG].
  3. Pruning deep neural networks to make them fast and small.
  4. Meir, Yuval; Ben-Noam, Itamar; Tzach, Yarden; Hodassman, Shiri; Kanter, Ido (2023-01-30). "Learning on tree architectures outperforms a convolutional feedforward network". Scientific Reports. 13 (1): 962. doi:10.1038/s41598-023-27986-6. ISSN 2045-2322. PMC 9886946. PMID 36717568.