MetaPruning: Meta Learning for Automatic Neural Network Channel Pruning
[article]
2019
arXiv
pre-print
In this paper, we propose a novel meta learning approach for automatic channel pruning of very deep neural networks. ...
We first train a PruningNet, a kind of meta network, which is able to generate weight parameters for any pruned structure given the target network. ...
In this section, we introduce our meta learning approach for automatically pruning channels in deep neural networks, so that the pruned network can easily meet various constraints. ...
arXiv:1903.10258v3
fatcat:7fp5bmuduzcapgtvtjjp6tkt2a
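To make the PruningNet idea above concrete, here is a minimal sketch under toy assumptions of my own: a small fully connected hypernetwork maps the pruned channel configuration of a single conv layer to that layer's weights, which are cropped to the sampled widths at forward time. All names (`PruningNet`, `enc`) and sizes are illustrative, not the paper's.

```python
import torch
import torch.nn as nn

class PruningNet(nn.Module):
    """Toy meta-network: maps a pruned-channel encoding to conv weights."""
    def __init__(self, max_out, max_in, k=3):
        super().__init__()
        self.max_out, self.max_in, self.k = max_out, max_in, k
        # Two FC layers generate a full-size weight tensor; it is cropped
        # to the sampled channel configuration at forward time.
        self.fc = nn.Sequential(
            nn.Linear(2, 64), nn.ReLU(),
            nn.Linear(64, max_out * max_in * k * k),
        )

    def forward(self, x, c_out, c_in):
        # Encode the pruned structure as normalized (c_out, c_in).
        enc = torch.tensor([[c_out / self.max_out, c_in / self.max_in]])
        w = self.fc(enc).view(self.max_out, self.max_in, self.k, self.k)
        w = w[:c_out, :c_in]          # crop to the pruned structure
        return nn.functional.conv2d(x, w, padding=self.k // 2)

net = PruningNet(max_out=64, max_in=32)
y = net(torch.randn(1, 16, 8, 8), c_out=24, c_in=16)  # any pruned widths
print(y.shape)  # torch.Size([1, 24, 8, 8])
```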
Graph Pruning for Model Compression
[article]
2021
arXiv
pre-print
Subsequently, the best configuration of the Pruned Network is searched by reinforcement learning. ...
Any series-structured network can be viewed as a graph. To automatically aggregate neighboring features for each node, a graph aggregator based on graph convolutional networks (GCNs) is designed. ...
We combine the Graph PruningNet with meta-learning for training and reinforcement learning for searching. ...
arXiv:1911.09817v2
fatcat:p6ivhwdtqrdrlgocu5tymxqoci
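A plausible reading of the graph aggregator mentioned above, sketched as a plain GCN layer with symmetric adjacency normalization; the chain graph over a network's layers is an assumption here, not the paper's construction.

```python
import torch
import torch.nn as nn

class GCNAggregator(nn.Module):
    """One GCN layer: H' = ReLU(D^{-1/2} (A + I) D^{-1/2} H W)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim, bias=False)

    def forward(self, H, A):
        A_hat = A + torch.eye(A.size(0))           # add self-loops
        d = A_hat.sum(dim=1)
        D_inv_sqrt = torch.diag(d.pow(-0.5))
        A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt   # symmetric normalization
        return torch.relu(A_norm @ self.lin(H))

# A chain graph over 4 layers of a network, 8-d node features each.
A = torch.tensor([[0,1,0,0],[1,0,1,0],[0,1,0,1],[0,0,1,0]], dtype=torch.float)
H = torch.randn(4, 8)
print(GCNAggregator(8, 16)(H, A).shape)  # torch.Size([4, 16])
```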
DHP: Differentiable Meta Pruning via HyperNetworks
[article]
2020
arXiv
pre-print
Network pruning has been the driving force for the acceleration of neural networks and the alleviation of model storage/transmission burden. ...
To circumvent this problem, this paper introduces a differentiable pruning method via hypernetworks for automatic network pruning. ...
MetaPruning: Meta learning for automatic neural network channel pruning. In: Proc. ...
arXiv:2003.13683v3
fatcat:2haocmohkzeklcpbmpruq45v4m
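A minimal sketch of differentiable pruning through per-layer latent vectors, in the spirit of the DHP entry above: the conv weight is generated from learnable latents, and an L1 penalty on the latents drives channels toward zero so they can be removed. The modulation form used here is an assumption, not the paper's hypernetwork.

```python
import torch
import torch.nn as nn

class HyperConv(nn.Module):
    """Conv layer whose weight is modulated by per-channel latent vectors."""
    def __init__(self, c_in, c_out, k=3):
        super().__init__()
        self.z_out = nn.Parameter(torch.randn(c_out))   # output-channel latents
        self.z_in = nn.Parameter(torch.randn(c_in))     # input-channel latents
        self.base = nn.Parameter(torch.randn(c_out, c_in, k, k) * 0.1)
        self.k = k

    def forward(self, x):
        # Weight scaled by the outer product of the latents; an L1 penalty
        # on z_out / z_in (added to the loss) sparsifies whole channels.
        scale = self.z_out[:, None, None, None] * self.z_in[None, :, None, None]
        return nn.functional.conv2d(x, self.base * scale, padding=self.k // 2)

layer = HyperConv(8, 16)
loss = layer(torch.randn(2, 8, 8, 8)).pow(2).mean()
loss = loss + 1e-3 * (layer.z_out.abs().sum() + layer.z_in.abs().sum())
loss.backward()  # gradients flow to the latents, making pruning differentiable
```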
Evolving Transferable Pruning Functions
[article]
2021
arXiv
pre-print
Channel pruning has made major headway in the design of efficient deep learning models. ...
Conventional approaches adopt human-made pruning functions to score channels' importance for channel pruning, which requires domain knowledge and could be sub-optimal. ...
While these works use handcrafted scoring metrics, we learn transferable and generalizable pruning functions automatically. ...
arXiv:2110.10876v1
fatcat:buewobh6wffmzngs7iasn7wsty
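For contrast with the learned functions above, this is the kind of human-made scoring function such works replace: ranking output channels by the L1 norm of their filters. A sketch of the conventional baseline, not this paper's method.

```python
import torch
import torch.nn as nn

def l1_channel_scores(conv: nn.Conv2d) -> torch.Tensor:
    """Handcrafted importance: L1 norm of each output filter."""
    return conv.weight.detach().abs().sum(dim=(1, 2, 3))

conv = nn.Conv2d(16, 32, 3)
scores = l1_channel_scores(conv)
keep = scores.argsort(descending=True)[:24]  # keep the 24 highest-scoring channels
print(sorted(keep.tolist()))
```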
Joint Multi-Dimension Pruning via Numerical Gradient Update
[article]
2021
arXiv
pre-print
We present joint multi-dimension pruning (abbreviated as JointPruning), an effective method of pruning a network on three crucial aspects: spatial, depth and channel simultaneously. ...
size, depth) and construct a unique mapping from the pruning vector to the pruned network structures. ...
or a meta network [37] can automatically decide the best pruning ratio. ...
arXiv:2005.08931v2
fatcat:yvnze4n7kzh43hnpj7emq5ye6u
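A toy illustration of the two ingredients the JointPruning entry names: a pruning vector over the three dimensions, and a numerical (finite-difference) gradient used to update it when no analytic gradient exists. The `evaluate` function below is a smooth stand-in for actually training and scoring the pruned network.

```python
import numpy as np

def evaluate(v):
    """Stand-in for 'evaluate the pruned net defined by vector v':
    a made-up smooth score peaking at ratios (0.7, 0.5, 0.6)."""
    target = np.array([0.7, 0.5, 0.6])  # spatial, depth, channel keep-ratios
    return -np.sum((v - target) ** 2)

v = np.array([0.9, 0.9, 0.9])  # pruning vector: keep-ratio per dimension
eps, lr = 1e-3, 0.5
for _ in range(50):
    # Numerical gradient: central differences along each coordinate.
    g = np.array([
        (evaluate(v + eps * e) - evaluate(v - eps * e)) / (2 * eps)
        for e in np.eye(3)
    ])
    v = np.clip(v + lr * g, 0.1, 1.0)  # gradient ascent on the score
print(v.round(3))  # approaches [0.7, 0.5, 0.6]
```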
Revisiting Parameter Sharing for Automatic Neural Channel Number Search
2020
Neural Information Processing Systems
Recent advances in neural architecture search inspire many channel number search algorithms (CNS) for convolutional neural networks. ...
In this paper, we aim at providing a better understanding and exploitation of parameter sharing for CNS. ...
For an L-layer neural network, layerwise channel number decisions are sampled from π(θ), i.e. a = [a_1, ..., a_L] ∼ π(θ), where a_l ∈ A = {1, 2, ..., A}, and A represents the index set of channel number ...
dblp:conf/nips/WangBWSHKL020
fatcat:oodjqieqqzbqxln6obpxvkxyoy
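The sampling step in the snippet above transcribes almost directly to code: per-layer categorical distributions parameterized by θ, from which the layerwise channel-number indices a = [a_1, ..., a_L] are drawn (0-indexed here).

```python
import torch

L, A = 4, 8                       # L layers, A candidate channel counts per layer
theta = torch.zeros(L, A, requires_grad=True)   # policy logits for pi(theta)
pi = torch.distributions.Categorical(logits=theta)
a = pi.sample()                   # a = [a_1, ..., a_L], each a_l in {0, ..., A-1}
log_prob = pi.log_prob(a).sum()   # usable for a REINFORCE-style update of theta
print(a)
```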
Differentiable Network Pruning via Polarization of Probabilistic Channelwise Soft Masks
2022
Computational Intelligence and Neuroscience
Channel pruning has been demonstrated as a highly effective approach to compress large convolutional neural networks. ...
For instance, our method prunes 65.91% FLOPs of ResNet50 on the ImageNet dataset with only 0.7% model accuracy degradation. ...
[22] propose meta-attribute-based filter pruning (MFP), which adaptively selects the most appropriate pruning standard through an attribute (meta-attribute) of the current state of the neural network ...
doi:10.1155/2022/7775419
pmid:35571691
pmcid:PMC9098282
fatcat:r4kalfper5bdfhcxhqc63td5pe
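A minimal sketch of probabilistic channelwise soft masks with a polarization-style penalty, assuming a simple sigmoid parameterization and the penalty m(1 − m), which pushes each mask toward 0 or 1 so thresholding is near-lossless; the paper's actual regularizer may differ.

```python
import torch
import torch.nn as nn

class SoftMaskedConv(nn.Module):
    """Conv followed by a probabilistic channelwise soft mask."""
    def __init__(self, c_in, c_out):
        super().__init__()
        self.conv = nn.Conv2d(c_in, c_out, 3, padding=1)
        self.logits = nn.Parameter(torch.zeros(c_out))  # one logit per channel

    def forward(self, x):
        m = torch.sigmoid(self.logits)           # soft mask in (0, 1)
        return self.conv(x) * m[None, :, None, None]

    def polarization_penalty(self):
        # Polarize: drive each mask value toward either 0 or 1.
        m = torch.sigmoid(self.logits)
        return (m * (1 - m)).sum()

layer = SoftMaskedConv(8, 16)
out = layer(torch.randn(2, 8, 8, 8))
loss = out.pow(2).mean() + 1e-2 * layer.polarization_penalty()
loss.backward()
```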
Discrimination-aware Network Pruning for Deep Model Compression
[article]
2020
arXiv
pre-print
We study network pruning which aims to remove redundant channels/kernels and hence speed up the inference of deep networks. ...
To prevent DCP/DKP from selecting redundant channels/kernels, we propose a new adaptive stopping condition, which helps to automatically determine the number of selected channels/kernels and often results ...
Apart from these methods, the pruning ratio for each layer can also be automatically determined by reinforcement learning [25] , [71] or meta-learning [48] . ...
arXiv:2001.01050v1
fatcat:vdy4g473czcdphd2wyqegiboyu
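A schematic of an adaptive stopping condition of the kind described above: channels are selected greedily by score until the marginal gain becomes negligible relative to what has been accumulated. The scores and the tolerance rule below are stand-ins, not DCP/DKP's actual criterion.

```python
import torch

def greedy_select(scores: torch.Tensor, tol: float = 1e-2):
    """Pick channels in score order; stop when the marginal gain is small."""
    order = scores.argsort(descending=True)
    total, kept = 0.0, []
    for idx in order:
        gain = scores[idx].item()
        if kept and gain < tol * total:   # adaptive stopping condition
            break
        kept.append(idx.item())
        total += gain
    return kept

scores = torch.tensor([5.0, 3.0, 1.0, 0.02, 0.01])  # stand-in channel gains
print(greedy_select(scores))  # [0, 1, 2] -- the tail contributes too little
```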
Dynamic Slimmable Network
[article]
2021
arXiv
pre-print
Current dynamic networks and dynamic pruning methods have shown their promising capability in reducing theoretical computation complexity. ...
In the first stage, a novel training technique for weight-sharing networks named In-place Ensemble Bootstrapping is proposed to improve the supernet training efficacy. ...
Remarkably, DS-MBNet outperforms the SOTA pruning methods EagleEye [40] and Meta-Pruning [51] by 1.9% and 2.2%. ...
arXiv:2103.13258v1
fatcat:q3dqpaakf5bilikvsn7akzhone
Building Efficient CNNs Using Depthwise Convolutional Eigen-Filters (DeCEF)
[article]
2022
arXiv
pre-print
Deep Convolutional Neural Networks (CNNs) have been widely used in various domains due to their impressive capabilities. ...
To reduce the complexity of a network, compression techniques can be applied. These methods typically rely on the analysis of trained deep learning models. ...
[40] proposes a meta network that generates weight parameters for any pruned structure of a given target network, which can be used to search for good-performing pruned networks. [58] introduce ...
arXiv:1910.09359v3
fatcat:frtr2r52x5bfvpca44s337eiiy
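A sketch of the analysis step such compression techniques rely on, assuming a DeCEF-like use of SVD: flatten every 3×3 filter slice of a trained layer, keep a few leading eigen-filters as a shared spatial basis, and check the low-rank reconstruction error. The rank choice here is arbitrary.

```python
import torch
import torch.nn as nn

conv = nn.Conv2d(16, 32, 3)                    # stands in for a trained layer
W = conv.weight.detach()                       # (c_out, c_in, 3, 3)
slices = W.reshape(-1, 9)                      # every 3x3 slice as a 9-d vector
U, S, Vh = torch.linalg.svd(slices, full_matrices=False)
r = 4                                          # keep the 4 leading eigen-filters
eigen_filters = Vh[:r].reshape(r, 3, 3)        # shared spatial basis
coeffs = slices @ Vh[:r].T                     # per-slice mixing coefficients
approx = (coeffs @ Vh[:r]).reshape_as(W)       # low-rank reconstruction
err = (approx - W).norm() / W.norm()
print(f"relative reconstruction error at rank {r}: {err:.3f}")
```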
SNF: Filter Pruning via Searching the Proper Number of Filters
[article]
2021
arXiv
pre-print
neural networks' redundancy. ...
Convolutional Neural Networks (CNNs) carry a large amount of parameter redundancy; filter pruning aims to remove the redundant filters and makes it possible to deploy CNNs on terminal devices ...
MetaPruning [36] utilized meta-learning to train a pruning network that provides weights for all possible sub-networks, and then searched for the best pruned network structures. Pavlo et al. ...
arXiv:2112.07282v1
fatcat:4p3trr6i6nhobbjbdtyak3g3vi
ResRep: Lossless CNN Pruning via Decoupling Remembering and Forgetting
[article]
2021
arXiv
pre-print
We propose ResRep, a novel method for lossless channel pruning (a.k.a. filter pruning), which slims down a CNN by reducing the width (number of output channels) of convolutional layers. ...
maintain the performance and the latter learn to prune. ...
biological neural network, which resembles pruning. ...
arXiv:2007.03260v4
fatcat:6pyaguorqfgexmvttzzp6wxhre
Adjoined Networks: A Training Paradigm with Applications to Network Compression
[article]
2022
arXiv
pre-print
We further propose Differentiable Adjoined Networks (DAN), a training paradigm that augments AN by using neural architecture search to jointly learn both the width and the weights for each layer of the ...
In this paper, we introduce Adjoined Networks, or AN, a learning paradigm that trains both the original base network and the smaller compressed network together. ...
A recent study Meta Pruning [29] searches over the number of channels in each layer. It generates weights for all candidates and then selects the architecture with the highest validation accuracy. ...
arXiv:2006.05624v5
fatcat:chp5hevvbrdvlc65qpipa4xkgy
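A minimal sketch of training a base and a compressed network together, as the entry above describes, with the compressed network additionally distilling from the base. In Adjoined Networks proper the two share weights; here they are separate modules for simplicity, and the loss weighting is a placeholder.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

base = nn.Sequential(nn.Linear(32, 128), nn.ReLU(), nn.Linear(128, 10))
small = nn.Sequential(nn.Linear(32, 32), nn.ReLU(), nn.Linear(32, 10))
opt = torch.optim.SGD(list(base.parameters()) + list(small.parameters()), lr=0.1)

x = torch.randn(16, 32)
y = torch.randint(0, 10, (16,))

logits_b, logits_s = base(x), small(x)
loss = (
    F.cross_entropy(logits_b, y)                  # base network supervised loss
    + F.cross_entropy(logits_s, y)                # compressed network supervised loss
    + F.kl_div(                                   # distill base -> compressed
        F.log_softmax(logits_s, dim=1),
        F.softmax(logits_b.detach(), dim=1),
        reduction="batchmean",
    )
)
opt.zero_grad(); loss.backward(); opt.step()
```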
DS-Net++: Dynamic Weight Slicing for Efficient Inference in CNNs and Transformers
[article]
2021
arXiv
pre-print
Here, we explore a hardware-efficient dynamic inference regime, named dynamic weight slicing, which adaptively slices a part of the network parameters for inputs with diverse difficulty levels, while keeping ...
Based on this scheme, we present dynamic slimmable network (DS-Net) and dynamic slice-able network (DS-Net++) by input-dependently adjusting filter numbers of CNNs and multiple dimensions in both CNNs ...
Channel Gating Neural Network [26] and FBS [27] identify and skip the unimportant input channels at run-time. ...
arXiv:2109.10060v1
fatcat:s5nmhmobsjc7tkstsiwqj5qesu
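A toy version of dynamic weight slicing as described above: a tiny gate inspects the input and chooses how many leading filters of a convolution to use. For simplicity this makes one width decision per batch; true per-sample routing needs grouped execution.

```python
import torch
import torch.nn as nn

class SlicedConv(nn.Module):
    """Use only the first k output filters, with k chosen from the input."""
    def __init__(self, c_in, c_out, widths=(8, 16, 32)):
        super().__init__()
        self.conv = nn.Conv2d(c_in, max(widths), 3, padding=1)
        self.widths = widths
        self.gate = nn.Linear(c_in, len(widths))  # picks a width per input

    def forward(self, x):
        # Decide a width from global-average-pooled features (one decision
        # per batch here, for simplicity).
        choice = self.gate(x.mean(dim=(2, 3))).mean(0).argmax().item()
        k = self.widths[choice]
        w = self.conv.weight[:k]                  # slice the first k filters
        b = self.conv.bias[:k]
        return nn.functional.conv2d(x, w, b, padding=1)

layer = SlicedConv(3, 32)
print(layer(torch.randn(4, 3, 16, 16)).shape)  # channel count depends on the gate
```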
Provable Filter Pruning for Efficient Neural Networks
[article]
2020
arXiv
pre-print
We present a provable, sampling-based approach for generating compact Convolutional Neural Networks (CNNs) by identifying and removing redundant filters from an over-parameterized network. ...
In contrast to existing filter pruning approaches, our method is simultaneously data-informed, exhibits provable guarantees on the size and performance of the pruned network, and is widely applicable to ...
Metapruning: Meta learning for automatic neural network channel pruning. In Proceedings of the IEEE International Conference on Computer Vision, pp. 3296-3305, 2019a. ...
arXiv:1911.07412v2
fatcat:l5drcoblgvdxfcksho5g7inhue
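A schematic of the sampling-based idea in this entry: filters are sampled with probability proportional to a data-informed sensitivity, and the kept filters are reweighted so the layer stays unbiased in expectation. The sensitivity used here (mean absolute activation on a calibration batch) is a simple stand-in, not the paper's provable bound.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
conv = nn.Conv2d(8, 32, 3, padding=1)
x = torch.randn(64, 8, 16, 16)                    # a small calibration batch

# Data-informed sensitivity: mean absolute activation of each filter.
with torch.no_grad():
    s = conv(x).abs().mean(dim=(0, 2, 3))
p = s / s.sum()                                   # sampling distribution

m = 16                                            # draw m of 32 filters
idx = torch.multinomial(p, m, replacement=True)
keep = torch.unique(idx)
# Importance-sampling reweighting: scale each kept filter by
# (#draws) / (m * p_i) so the layer's output is unbiased in expectation.
counts = torch.bincount(idx, minlength=32).float()
scale = counts[keep] / (m * p[keep])
with torch.no_grad():
    w = conv.weight[keep] * scale[:, None, None, None]
print(f"kept {len(keep)} of 32 filters")
```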