Generic Neural Architecture Search via Regression
[article]
2021
arXiv
pre-print
Extensive experiments across 13 CNN search spaces and one NLP space demonstrate the remarkable efficiency of GenNAS using regression, in terms of both evaluating the neural architectures (quantified by ...
Most existing neural architecture search (NAS) algorithms are dedicated to and evaluated by the downstream tasks, e.g., image classification in computer vision. ...
To answer the two fundamental questions for NAS, in this work, we propose a Generic Neural Architecture Search method, termed GenNAS. ...
arXiv:2108.01899v2
fatcat:im7t62hb4zhdhga6saihesmywm
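The regression-based ranking idea in the GenNAS snippet above can be illustrated with a toy sketch (not the paper's method; the function name and the random-feature stand-in for an "architecture" are invented for illustration): candidates are scored by how well they regress a synthetic target signal, and those scores are used to rank them without any classification training.

```python
import numpy as np

def regression_proxy_score(width: int, seed: int = 0) -> float:
    """Score a candidate 'architecture' (here just a random-feature width)
    by how well it regresses a fixed synthetic target -- a stand-in for
    regression-based NAS proxies (illustrative only)."""
    rng = np.random.default_rng(seed)
    X = rng.normal(size=(128, 16))            # synthetic inputs
    y = np.sin(X @ rng.normal(size=16))       # synthetic target signal
    W = rng.normal(size=(16, width))          # frozen random features = the "architecture"
    H = np.tanh(X @ W)
    # closed-form ridge regression on top of the frozen features
    beta = np.linalg.solve(H.T @ H + 1e-3 * np.eye(width), H.T @ y)
    return float(np.mean((H @ beta - y) ** 2))  # lower = better candidate

# a wider random-feature "architecture" should fit the signal better
scores = {w: regression_proxy_score(w) for w in (4, 64)}
```

The point of such proxies is that the ranking, not the absolute loss, is what the search consumes.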
Neural Architecture Optimization with Graph VAE
[article]
2020
arXiv
pre-print
Due to their high computational efficiency on a continuous space, gradient optimization methods have shown great potential in the neural architecture search (NAS) domain. ...
Extensive experiments demonstrate our framework not only generates appropriate continuous representations but also discovers powerful neural architectures. ...
NAS discovers promising architectures via optimization on the search space, outperforming handcrafted networks [9, 10, 12]. There are two key issues: 1) Representation. ...
arXiv:2006.10310v1
fatcat:knuhg6vd6fei5bxtrri2mcx2tq
Conditional Neural Architecture Search
[article]
2020
arXiv
pre-print
This is the first work to bring conditioning and adversarial techniques into the Neural Architecture Search domain. We verify the method with regression problems and classification on CIFAR-10. ...
We propose a conditional neural architecture search method using GAN, which produces feasible ML models for different platforms. We present a new workflow to generate constraint-optimized DNN models. ...
We also present an Inverse Neural Architecture Generation (INAG) workflow, which demonstrates a practical design flow when applying conditional Neural Architecture Search method to the resource constraint ...
arXiv:2006.03969v1
fatcat:5yqd4rwhybhopomzxcmkdzku6q
Evolutionary Neural Architecture Search and Applications [Guest Editorial]
2021
IEEE Computational Intelligence Magazine
In the fourth paper, "Forecasting Wind Speed Time Series Via Dendritic Neural Regression" by J. ...
Performance predictors are a kind of regression model that can directly predict the performance of a neural network without any training and is ...
doi:10.1109/mci.2021.3084391
fatcat:wqwat7lygzao7fkzd2gyem7icu
AutoCoMet: Smart Neural Architecture Search via Co-Regulated Shaping Reinforcement
[article]
2022
arXiv
pre-print
Though Neural Architecture Search (NAS/AutoML) has made this easier by shifting the paradigm from extensive manual effort to automated architecture learning from data, it has major limitations, leading ...
Our novel co-regulated shaping reinforcement controller together with the high fidelity hardware meta-behavior predictor produces a smart, fast NAS framework that adapts to context via a generalized formalism ...
in a principled manner as well as extremely rapid search and (4) Finally, we demonstrate empirically in several important scenarios how our framework is effective in generating suitable neural architectures ...
arXiv:2203.15408v1
fatcat:hbbu7msc5jhoxon6ihuwus53e4
AutoSNAP: Automatically Learning Neural Architectures for Instrument Pose Estimation
[article]
2020
arXiv
pre-print
architecture using an efficient search scheme. ...
Using AutoSNAP, we discover an improved architecture (SNAPNet) which outperforms both the hand-engineered i3PosNet and the state-of-the-art architecture search method DARTS. ...
Framing this prediction task as our environment, we search for a neural architecture (green network in Fig. 1) that minimizes the Mean-Squared Error of the point regression task (regMSE). i3PosNet, on ...
arXiv:2006.14858v1
fatcat:hlvmfq6ptvfabl4hdxnmtvpgcy
Neural Architecture Optimization
[article]
2019
arXiv
pre-print
Automatic neural architecture design has shown its potential in discovering powerful neural network architectures. ...
of previous architecture search methods with a significant reduction of computational resources. ...
Furthermore, on PTB dataset we achieve 56.0 perplexity, also surpassing best performance found via previous methods on neural architecture search. ...
arXiv:1808.07233v5
fatcat:auv5pkd4r5ayxnecmcbqnxc5y4
Hyperparameter optimization with REINFORCE and Transformers
[article]
2020
arXiv
pre-print
Reinforcement Learning has yielded promising results for Neural Architecture Search (NAS). ...
As a generic HPO algorithm, it outperformed Random Search in discovering more accurate multi-layer perceptron model architectures across 2 regression tasks. ...
Neural Architecture Search (NAS) is a special type of HPO problem where the focus is on algorithm-driven design of neural network architecture components or cells [8]. ...
arXiv:2006.00939v4
fatcat:6idg55t6ebbz7oi2aj3fgq36d4
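The Random Search baseline that the snippet above compares against is simple to state; a minimal sketch (the function name and the toy objective are illustrative, not from the paper): sample configurations uniformly from the search space and keep the best one seen.

```python
import random

def random_search(objective, space, n_trials=20, seed=0):
    """Minimal random-search HPO baseline: sample configs uniformly
    from a discrete space, keep the lowest-loss one (illustrative)."""
    rng = random.Random(seed)
    best_cfg, best_val = None, float("inf")
    for _ in range(n_trials):
        cfg = {k: rng.choice(v) for k, v in space.items()}
        val = objective(cfg)
        if val < best_val:
            best_cfg, best_val = cfg, val
    return best_cfg, best_val

# toy objective: pretend validation loss prefers 2 layers of width 64
space = {"layers": [1, 2, 3], "width": [16, 64, 256]}
toy_loss = lambda c: abs(c["layers"] - 2) + abs(c["width"] - 64) / 64
best, loss = random_search(toy_loss, space)
```

Learned controllers (REINFORCE, Transformers) aim to beat this baseline by reusing information across trials instead of sampling independently.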
Stealing Neural Networks via Timing Side Channels
[article]
2019
arXiv
pre-print
Although, constructing an equivalent architecture is a complex search problem, it is shown how Reinforcement Learning with knowledge distillation can effectively reduce the search space to infer a target ...
Here, an adversary can extract the Neural Network parameters, infer the regularization hyperparameter, identify if a data point was part of the training data, and generate effective transferable adversarial ...
Regression. ...
arXiv:1812.11720v4
fatcat:hts4m64pabh37fp2jalgkoysu4
A Framework for Selecting Deep Learning Hyper-parameters
[chapter]
2015
Lecture Notes in Computer Science
Our work provides a framework for building deep learning architectures via a stepwise approach, together with an evaluation methodology to quickly identify poorly performing architectural configurations ...
Recent research has found that deep learning architectures show significant improvements over traditional shallow algorithms when mining high dimensional datasets. ...
As with the other machines in our architecture and for artificial neural networks (ANN) in general, each node in a hidden layer L(i) is connected to every node in layer L(i−1) and every node in L(i+ ...
doi:10.1007/978-3-319-20424-6_12
fatcat:kvyqioanjbhjfhohoimj532ety
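The full connectivity described in the snippet above (every node in layer L(i) wired to every node in L(i−1)) fixes the parameter count of a dense network; a small helper (hypothetical, not from the paper) makes that concrete:

```python
def dense_param_count(layer_sizes):
    """Weights + biases of a fully connected net in which every node in
    layer L(i) connects to every node in L(i-1): each consecutive pair
    of layers contributes n_in * n_out weights plus n_out biases."""
    return sum(n_in * n_out + n_out
               for n_in, n_out in zip(layer_sizes, layer_sizes[1:]))

# e.g. a 784-dim input, two hidden layers of 128 nodes, 10 outputs
n = dense_param_count([784, 128, 128, 10])
```

Counts like this are one reason hyper-parameter frameworks prune wide configurations early: parameters grow with the product of adjacent layer widths.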
AutoDEUQ: Automated Deep Ensemble with Uncertainty Quantification
[article]
2022
arXiv
pre-print
We propose AutoDEUQ, an automated approach for generating an ensemble of deep neural networks. Our approach leverages joint neural architecture and hyperparameter search to generate ensembles. ...
However, building ensembles of neural networks is a challenging task because, in addition to choosing the right neural architecture or hyperparameters for each member of the ensemble, there is an added ...
theoretical insights into the quality of epistemic uncertainty under the various data generation assumptions.
arXiv:2110.13511v3
fatcat:wfol5b7pdba7xjf5q4ojsecc6q
A Generic Graph-based Neural Architecture Encoding Scheme for Predictor-based NAS
[article]
2020
arXiv
pre-print
This work proposes a novel Graph-based neural ArchiTecture Encoding Scheme, a.k.a. GATES, to improve the predictor-based neural architecture search. ...
GATES is a more reasonable modeling of the neural architectures, and can encode architectures from both the "operation on node" and "operation on edge" cell search spaces consistently. ...
.: Multi-objective neural architecture search via predictive network performance optimization. arXiv preprint arXiv:1911.09336 (2019) 23. ...
arXiv:2004.01899v3
fatcat:aofyv24iozhfvgej5esppusylq
MAPLE: Microprocessor A Priori for Latency Estimation
[article]
2022
arXiv
pre-print
As such, neural architecture search (NAS) algorithms take these two constraints into account when generating a new architecture. ...
Through this quantitative strategy as the hardware descriptor, MAPLE can generalize to new hardware via a few-shot adaptation strategy where with as few as 3 samples it exhibits a 6% improvement over state-of-the-art ...
Regression Model Architecture The proposed method employs a compact neural network-based non-linear regression model for latency inference. ...
arXiv:2111.15106v2
fatcat:w2wrz6oygnfflagyxwir5vgxo4
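A "compact neural network-based non-linear regression model" of the kind the MAPLE snippet mentions can be sketched as a one-hidden-layer net fitted by gradient descent (purely illustrative: the descriptors, synthetic latency target, and layer sizes below are invented, not MAPLE's actual model):

```python
import numpy as np

rng = np.random.default_rng(0)

# hypothetical descriptors [depth, channels, clock] -> latency (synthetic target)
X = rng.uniform(0.0, 1.0, size=(256, 3))
latency = 2.0 * X[:, 0] * X[:, 1] / (0.5 + X[:, 2])

# compact one-hidden-layer regression net
W1 = rng.normal(scale=0.5, size=(3, 16)); b1 = np.zeros(16)
w2 = rng.normal(scale=0.5, size=16);      b2 = 0.0

losses, lr = [], 0.05
for _ in range(500):
    h = np.tanh(X @ W1 + b1)              # hidden activations
    err = (h @ w2 + b2) - latency         # prediction residual
    losses.append(float(np.mean(err ** 2)))
    # gradient step (MSE derivative; constant factors folded into lr)
    dh = np.outer(err, w2) * (1 - h ** 2)
    w2 -= lr * (h.T @ err) / len(X); b2 -= lr * err.mean()
    W1 -= lr * (X.T @ dh) / len(X);  b1 -= lr * dh.mean(axis=0)
```

The training loss should fall over the run; the real model would be fitted on measured latencies plus the hardware descriptor rather than a synthetic formula.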
2021 Index IEEE Computational Intelligence Magazine [2021 MCI Year-End Index]
2021
IEEE Computational Intelligence Magazine
., +, MCI Aug. 2021 33-49 Predictive models Forecasting Wind Speed Time Series Via Dendritic Neural Regression. ...
Chen, C., +, MCI May 2021 78-98 Prediction algorithms Forecasting Wind Speed Time Series Via Dendritic Neural Regression. ...
doi:10.1109/mci.2021.3121525
fatcat:w7wra3t4lfgcnfqzxn2aexrluq
Intelligent Automated Monitoring System Using Video Surveillance Based Recognition
2022
International Journal of Health Sciences
Here, CNN models are to be used for real-time object detection via the CCTV camera. ...
Both the blocks have 3 layers, as shown in Table I: the Neural Architecture Search Network (NASNet) searches for an architectural building block over a small dataset and transfers it to a vast dataset [ ...
Instead, these cells are searched via the reinforcement learning search algorithm, wherein the number of initial convolution filters and the number of motif repetitions (N) are free parameters and are used ...
doi:10.53730/ijhs.v6ns6.10098
fatcat:5dybiq3g6fbjhjkfurdjhthtiy
Showing results 1 — 15 out of 42,141 results