43,361 Hits in 4.0 sec

Enhanced Gradient for Differentiable Architecture Search [article]

Haichao Zhang, Kuangrong Hao, Lei Gao, Xuesong Tang, Bing Wei
2021 arXiv   pre-print
At the stage of block-level search, a relaxation method based on the gradient is proposed, using an enhanced gradient to design high-performance, low-complexity blocks.  ...  In recent years, neural architecture search (NAS) methods have been proposed for the automatic generation of task-oriented network architectures in image classification.  ...  We propose a multi-objective NAS algorithm, enhanced gradient for differentiable architecture search (EG-DARTS).  ... 
arXiv:2103.12529v1 fatcat:ahe23t3vlrgizkaxhadqztltg4

De-IReps: Searching for improved Re-parameterizing Architecture based on Differentiable Evolution Strategy [article]

Xinyi Yu, Xiaowei Wang, Mingyang Zhang, Jintao Rong, Linlin Ou
2022 arXiv   pre-print
We visualize the features of the searched architecture and offer an explanation for its appearance.  ...  Meanwhile, we summarize the characteristics of the re-parameterization search space and propose a differentiable evolutionary strategy (DES) to explore it.  ...  Gradient-based neural architecture search benefits from the introduction of differentiable functions, which transform the discrete search space into a continuous one so that it can be optimized by means of gradients  ... 
arXiv:2204.06403v1 fatcat:bs4wqvlmofecpagwe6upcsddv4

TND-NAS: Towards Non-Differentiable Objectives in Differentiable Neural Architecture Search [article]

Bo Lyu, Shiping Wen, Zheng Yan, Kaibo Shi, Ke Li, Tingwen Huang
2022 arXiv   pre-print
Differentiable architecture search has gradually become the mainstream research topic in the field of Neural Architecture Search (NAS) for its high efficiency compared with early NAS methods (EA-based, RL-based  ...  Under the differentiable NAS framework, with the continuous relaxation of the search space, TND-NAS optimizes the architecture parameters (α) in discrete space, while resorting to the progressive  ...  architecture is viewed as the reward for policy-gradient optimization.  ... 
arXiv:2111.03892v2 fatcat:wgn45wznjvcntp6evov4so25gm

BNAS-v2: Memory-efficient and Performance-collapse-prevented Broad Neural Architecture Search [article]

Zixiang Ding, Yaran Chen, Nannan Li, Dongbin Zhao
2021 arXiv   pre-print
For this consequent issue, two solutions are given: 1) we propose the Confident Learning Rate (CLR), which considers the confidence of the gradient for architecture weight updates and increases with training time  ...  and 2) the proposed CLR effectively alleviates the performance collapse issue in both BNAS-D and the vanilla differentiable NAS framework.  ...  For architecture search, there is a loop as follows. 1) Architecture weight update: a validation mini-batch is fed into the broad scalable architecture to obtain the loss for computing the gradient  ... 
arXiv:2009.08886v4 fatcat:ogamx2zj4jbd5nvbw3sheausse
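The BNAS-v2 snippet above describes a Confident Learning Rate for architecture-weight updates that increases with training time; the paper's exact formula is not quoted in the snippet, so the following is only a hypothetical linear schedule sketching that idea (function name and linear ramp are assumptions, not the paper's method):

```python
def confident_lr(base_lr, epoch, total_epochs):
    """Hypothetical confidence-scaled learning rate for architecture
    weights: near zero early in training, when gradients for the
    architecture parameters are least reliable, and growing linearly
    toward base_lr as training progresses. BNAS-v2's actual CLR may
    use a different growth rule; this only illustrates a schedule
    that increases with training time."""
    confidence = min(1.0, epoch / total_epochs)
    return base_lr * confidence
```

A monotone schedule like this damps early, noisy architecture updates, which is the stated motivation for alleviating performance collapse.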

Differentiable Architecture Search Based on Coordinate Descent

Pyunghwan Ahn, Hyeong Gwon Hong, Junmo Kim
2021 IEEE Access  
ACKNOWLEDGMENT This work was conducted by the Center for Applied Research in Artificial Intelligence (CARAI), funded by the Defense Acquisition Program Administration (DAPA) and the Agency for Defense Development  ...  This technique enhances performance while showing the desired search behavior of differentiable search algorithms.  ...  To make differentiable architecture search more meaningful, the search time should be improved further.  ... 
doi:10.1109/access.2021.3068766 fatcat:dnzw3452vrgtdlcx4z7h4nrkmu

Gradient Descent Effects on Differential Neural Architecture Search: A Survey

Santanu Santra, Jun-Wei Hsieh, Chi-Fang Lin
2021 IEEE Access  
In recent trends, neural architecture search (NAS) is widely used to automatically construct an architecture for a specific task.  ...  Gradient descent, an effective way to search for the local minimum of a function, can minimize the training and validation loss of neural architectures and can also be applied in an appropriate order to decrease  ...  DAS [28] converts the discrete network architecture search space into a continuously differentiable one to which gradient optimization can be applied for architecture search.  ... 
doi:10.1109/access.2021.3090918 fatcat:v5yrxbpzjvdozauevowlxsjeya

Stacked BNAS: Rethinking Broad Convolutional Neural Network for Neural Architecture Search [article]

Zixiang Ding, Yaran Chen, Nannan Li, Dongbin Zhao
2021 arXiv   pre-print
Different from other deep scalable architecture based NAS approaches, Broad Neural Architecture Search (BNAS) proposes a broad one, dubbed Broad Convolutional Neural Network (BCNN), consisting of convolution and enhancement blocks, as its search space for a remarkable efficiency improvement.  ...  For architecture search, we repeat the implementation five times.  ... 
arXiv:2111.07722v1 fatcat:5gwifx3qbvg6fpgi7uv3e6hinu

Differentiable Architecture Search with Ensemble Gumbel-Softmax [article]

Jianlong Chang, Xinbang Zhang, Yiwen Guo, Gaofeng Meng, Shiming Xiang, Chunhong Pan
2019 arXiv   pre-print
For network architecture search (NAS), it is crucial but challenging to simultaneously guarantee both effectiveness and efficiency.  ...  mechanism of searching network architectures.  ...  Contrary to treating architecture search as a black-box optimization problem, gradient-based neural architecture search methods utilize the gradient obtained in the training process to optimize the neural architecture  ... 
arXiv:1905.01786v1 fatcat:cmu526yij5gujarydki7bwjyx4
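The entry above searches architectures by sampling operation choices with a Gumbel-Softmax mechanism; a minimal standalone sketch of the standard Gumbel-Softmax trick (the building block, not the paper's ensemble variant) could look like:

```python
import math
import random

def gumbel_softmax(logits, tau=1.0, rng=random):
    """Sample a relaxed one-hot vector over candidate operations:
    add Gumbel(0, 1) noise to each logit, then take a temperature-
    controlled softmax. As tau -> 0 the sample approaches a discrete
    one-hot choice while staying differentiable w.r.t. the logits."""
    # Gumbel(0, 1) noise via the inverse-CDF: -log(-log(U)), U ~ Uniform(0, 1)
    gumbels = [-math.log(-math.log(rng.random() + 1e-20)) for _ in logits]
    scores = [(l + g) / tau for l, g in zip(logits, gumbels)]
    # numerically stable softmax over the perturbed, scaled logits
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]
```

Each call yields a point on the probability simplex whose sharpness is controlled by `tau`, which is how a discrete operation choice is made amenable to gradient-based search.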

Mutually-aware Sub-Graphs Differentiable Architecture Search [article]

Haoxian Tan, Sheng Guo, Yujie Zhong, Matthew R. Scott, Weilin Huang
2021 arXiv   pre-print
Differentiable architecture search is prevalent in the field of NAS because of its simplicity and efficiency, where two paradigms, multi-path algorithms and single-path methods, are dominant.  ...  In this paper, we propose a conceptually simple yet efficient method to bridge these two paradigms, referred to as Mutually-aware Sub-Graphs Differentiable Architecture Search (MSG-DAS).  ...  Conclusion We have presented a new search framework (MSG-DAS) for differentiable architecture search in which each searched sub-graph is single-path and mutually exclusive.  ... 
arXiv:2107.04324v3 fatcat:xi47dor6ljdp3iygqhvgcvucwq

Enhanced MRI Reconstruction Network using Neural Architecture Search [article]

Qiaoying Huang, Dong Yang, Yikun Xian, Pengxiang Wu, Jingru Yi, Hui Qu, Dimitris Metaxas
2020 arXiv   pre-print
For each cell in the basic block, we use the differentiable neural architecture search (NAS) technique to automatically choose the optimal operation among eight variants of the dense block.  ...  The cascaded network architecture for MRI reconstruction has been widely used, while it suffers from the "vanishing gradient" problem when the network becomes deep.  ...  We introduce EMR-NAS (Enhanced MRI Reconstruction Network via neural architecture search) for automatically determining the operations in the blocks.  ... 
arXiv:2008.08248v1 fatcat:ctovelomobbonbekc7wnp73mtu

SNAS: Stochastic Neural Architecture Search [article]

Sirui Xie, Hehui Zheng, Chunxiao Liu, Liang Lin
2020 arXiv   pre-print
To leverage the gradient information in generic differentiable loss for architecture search, a novel search gradient is proposed.  ...  We propose Stochastic Neural Architecture Search (SNAS), an economical end-to-end solution to Neural Architecture Search (NAS) that trains neural operation parameters and architecture distribution parameters  ...  Black dotted lines denote compounded gradients, purple lines for parameter gradients in SNAS, red for search gradients.  ... 
arXiv:1812.09926v3 fatcat:o3zuaa7hxfcvjhktgciodwhghe

DARTS: Differentiable Architecture Search [article]

Hanxiao Liu, Karen Simonyan, Yiming Yang
2019 arXiv   pre-print
representation, allowing efficient search of the architecture using gradient descent.  ...  architectures for language modeling, while being orders of magnitude faster than state-of-the-art non-differentiable techniques.  ...  ACKNOWLEDGEMENTS The authors would like to thank Zihang Dai, Hieu Pham and Zico Kolter for useful discussions. REFERENCES Karim Ahmed and Lorenzo Torresani.  ... 
arXiv:1806.09055v2 fatcat:w5pl7viorrg3lkstu77hmpbzw4
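The DARTS entry above rests on relaxing the discrete choice among candidate operations into a continuous, softmax-weighted mixture that gradient descent can optimize. A minimal sketch of that relaxation, using toy scalar operations rather than neural layers (all names here are illustrative, not the paper's code):

```python
import math

def mixed_op_weights(alphas):
    """Softmax over architecture parameters alpha: DARTS turns the
    discrete pick-one-operation decision into a probability vector
    that is differentiable w.r.t. the alphas."""
    exps = [math.exp(a) for a in alphas]
    total = sum(exps)
    return [e / total for e in exps]

def mixed_op(x, ops, alphas):
    """Continuous relaxation of an edge: the output is the softmax-
    weighted sum of every candidate operation applied to x, so the
    loss gradient flows back into the architecture parameters."""
    w = mixed_op_weights(alphas)
    return sum(wi * op(x) for wi, op in zip(w, ops))

# toy candidate operations on a scalar input: identity, doubling, zero
ops = [lambda x: x, lambda x: 2 * x, lambda x: 0.0]
alphas = [0.0, 0.0, 0.0]  # equal logits -> uniform mixture
```

After search, the edge is discretized by keeping only the operation with the largest weight, which is the step that recovers a concrete architecture from the relaxed one.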

Joint Search of Data Augmentation Policies and Network Architectures [article]

Taiga Kashima, Yoshihiro Yamada, Shunta Saito
2021 arXiv   pre-print
The proposed method combines differentiable methods for augmentation policy search and network architecture search to jointly optimize them in an end-to-end manner.  ...  In this paper, we propose a joint optimization method for data augmentation policies and network architectures to bring more automation to the design of the training pipeline.  ...  The proposed method combines differentiable methods for policy search and architecture search to jointly optimize them in an end-to-end manner.  ... 
arXiv:2012.09407v2 fatcat:otozvh2rz5btfn5hd7cacgei4y

Exploiting Operation Importance for Differentiable Neural Architecture Search [article]

Xukai Xie, Yuan Zhou, Sun-Yuan Kung
2019 arXiv   pre-print
Recently, differentiable neural architecture search methods have significantly reduced the search cost by constructing a super network and relaxing the architecture representation by assigning architecture weights  ...  To alleviate this deficiency, we propose a simple yet effective solution to neural architecture search, termed exploiting operation importance for effective neural architecture search (EoiNAS), in which  ...  Contrary to treating architecture search as a black-box optimization problem, gradient-based neural architecture search methods [23, 36, 7] utilize the gradient obtained in the training process to optimize  ... 
arXiv:1911.10511v1 fatcat:yfcm5cttvzhrzmldlytoj7jole

Fine-Tuning DARTS for Image Classification [article]

Muhammad Suhaib Tanveer, Muhammad Umar Karim Khan, Chong-Min Kyung
2020 arXiv   pre-print
Neural Architecture Search (NAS) has gained attention due to its superior classification performance. Differentiable Architecture Search (DARTS) is a computationally light method.  ...  Gradient-based search using a Differentiable Architecture Sampler (GDAS) [10] develops a differentiable sampler over the directed acyclic graph (DAG) to avoid traversing all possible sub-graphs  ...  DARTS and its derivatives Differentiable Architecture Search (DARTS) [4] introduces a differentiable and continuous search space instead of a discrete search space and achieves remarkable efficiency,  ... 
arXiv:2006.09042v1 fatcat:5d32mcxmbfbb7go472246b4ap4
Showing results 1 — 15 out of 43,361 results