AutoHAS: Efficient Hyperparameter and Architecture Search
[article]
2021
arXiv
pre-print
In this work, we propose a unified pipeline, AutoHAS, to efficiently search for both architectures and hyperparameters. ...
In experiments, we show AutoHAS is efficient and generalizable to different search spaces, baselines and datasets. ...
Acknowledgements We want to thank Gabriel Bender, Hanxiao Liu, Hieu Pham, Ruoming Pang, Barret Zoph and Yanqi Zhou for their help and feedback. ...
arXiv:2006.03656v3
fatcat:y3socjtfh5bvrpukno5kmasmwy
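The AutoHAS abstract centres on searching a single mixed space that contains both architecture choices and training hyperparameters. As a point of reference, the sketch below shows the simplest possible search over such a joint space (plain random search with a placeholder evaluation); the concrete space, the `train_and_evaluate` stand-in, and the sampling scheme are illustrative assumptions, not the paper's controller-based method.

```python
import random

# Hypothetical joint search space mixing architecture choices and training
# hyperparameters; the names and values are illustrative only.
SEARCH_SPACE = {
    "num_layers": [2, 4, 6],              # architecture choice
    "kernel_size": [3, 5, 7],             # architecture choice
    "learning_rate": [1e-3, 1e-2, 1e-1],  # hyperparameter choice
    "weight_decay": [0.0, 1e-4, 1e-3],    # hyperparameter choice
}

def sample_candidate():
    """Sample one joint (architecture, hyperparameter) configuration."""
    return {name: random.choice(opts) for name, opts in SEARCH_SPACE.items()}

def train_and_evaluate(candidate):
    """Stand-in for training under `candidate` and returning validation
    accuracy; a real system would update shared supernet weights here."""
    random.seed(str(sorted(candidate.items())))
    return random.random()

def joint_search(budget=50):
    """Simplest joint search over the mixed space: random search."""
    best, best_score = None, float("-inf")
    for _ in range(budget):
        candidate = sample_candidate()
        score = train_and_evaluate(candidate)
        if score > best_score:
            best, best_score = candidate, score
    return best, best_score

print(joint_search())
```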
DHA: End-to-End Joint Optimization of Data Augmentation Policy, Hyper-parameter and Architecture
[article]
2021
arXiv
pre-print
Automated machine learning (AutoML) usually involves several crucial components, such as Data Augmentation (DA) policy, Hyper-Parameter Optimization (HPO), and Neural Architecture Search (NAS). ...
In view of these, we propose DHA, which achieves joint optimization of Data augmentation policy, Hyper-parameter and Architecture. ...
Conventional neural architecture search methods perform a search over a fixed set of architecture candidates and then apply or search for a separate set of hyper-parameters when retraining the best architecture ...
arXiv:2109.05765v1
fatcat:chsuylzfrfeg5h5txj7xualooi
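The DHA snippet contrasts joint optimization with the conventional decoupled flow: search architectures first under one fixed training setup, then tune hyper-parameters only when retraining the winner. A minimal sketch of that two-stage baseline, with a hypothetical `evaluate` placeholder standing in for a full train/validate run:

```python
import random

def evaluate(arch, hps):
    """Hypothetical stand-in for training `arch` with hyperparameters `hps`
    and returning validation accuracy."""
    random.seed(f"{arch}-{sorted(hps.items())}")
    return random.random()

def sequential_pipeline(arch_candidates, hp_grid, default_hps):
    """Two-stage flow: (1) rank architectures under a single fixed
    hyperparameter setting, (2) tune hyperparameters only for the winner."""
    best_arch = max(arch_candidates, key=lambda a: evaluate(a, default_hps))
    best_hps = max(hp_grid, key=lambda h: evaluate(best_arch, h))
    return best_arch, best_hps

archs = ["resnet-like", "mobilenet-like", "transformer-like"]
hp_grid = [{"lr": lr, "wd": wd} for lr in (1e-3, 1e-2) for wd in (0.0, 1e-4)]
print(sequential_pipeline(archs, hp_grid, {"lr": 1e-3, "wd": 1e-4}))
```

The joint approaches above instead evaluate architecture and hyperparameter choices together, so the ranking of architectures is not tied to one arbitrary default training setup.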
AutoML: A Survey of the State-of-the-Art
[article]
2020
arXiv
pre-print
First, we introduce AutoML methods according to the pipeline, covering data preparation, feature engineering, hyperparameter optimization, and neural architecture search (NAS). ...
NAS, and joint hyperparameter and architecture optimization. ...
Joint Hyper-parameter and Architecture Optimization: Most NAS methods fix the same setting of training-related hyper-parameters during the whole search stage. ...
arXiv:1908.00709v5
fatcat:zwlhvujqnzgxja42t2yk75bsx4
Meta-Learning of NAS for Few-shot Learning in Medical Image Applications
[article]
2022
arXiv
pre-print
Even though it has been shown that network architecture plays a critical role in learning feature representations from data and in the final performance, searching for the best network architecture ...
Automated machine learning (AutoML) and its advanced techniques, i.e. Neural Architecture Search (NAS), have been promoted to address those limitations. ...
OIA-1946391; partially funded by Gia Lam Urban Development and Investment Company Limited, Vingroup and supported by Vingroup Innovation Foundation (VINIF) under project code VINIF.2019.DA19. ...
arXiv:2203.08951v1
fatcat:rafuwlli4reabfrygeed4e6o6y
Mixed Variable Bayesian Optimization with Frequency Modulated Kernels
[article]
2021
arXiv
pre-print
On joint optimization of neural architectures and SGD hyperparameters, BO-FM outperforms competitors including Regularized Evolution (RE) and BOHB. ...
Therefore, we specify and prove conditions for FM kernels to be positive definite and to exhibit the similarity measure behavior. ...
Xuanyi Dong, Mingxing Tan, Adams Wei Yu, Daiyi Peng, Bogdan Gabrys, and Quoc V. Le. AutoHAS: Efficient hyperparameter and architecture search. arXiv preprint arXiv:2006.03656, 2020. ...
arXiv:2102.12792v1
fatcat:ptotp3bbpfcsdfceizi7donxva
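This entry concerns Bayesian optimisation over mixed spaces (categorical architecture choices plus continuous SGD hyperparameters), where the surrogate's kernel must remain positive definite. The sketch below is not the paper's frequency modulated kernel; it is a generic mixed-variable kernel (a categorical overlap kernel multiplied by an RBF) that only illustrates the setting, along with why positive definiteness is preserved: a product of positive semi-definite kernels is positive semi-definite by the Schur product theorem.

```python
import numpy as np

def rbf_kernel(x, y, lengthscale=1.0):
    """Standard RBF kernel on the continuous hyperparameters (e.g. SGD settings)."""
    d = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return float(np.exp(-0.5 * np.dot(d, d) / lengthscale ** 2))

def overlap_kernel(c, d):
    """Simple categorical kernel on architecture choices: fraction of matching slots."""
    return sum(ci == di for ci, di in zip(c, d)) / len(c)

def mixed_kernel(point_a, point_b):
    """Product of a categorical and a continuous kernel; both factors are PSD,
    so the product is PSD, which a GP surrogate for mixed-variable BO requires."""
    (cat_a, cont_a), (cat_b, cont_b) = point_a, point_b
    return overlap_kernel(cat_a, cat_b) * rbf_kernel(cont_a, cont_b)

# Example: two candidates, each an (architecture choices, (lr, momentum)) pair.
a = (("conv3x3", "skip", "conv5x5"), (0.10, 0.9))
b = (("conv3x3", "none", "conv5x5"), (0.05, 0.9))
print(round(mixed_kernel(a, b), 4))
```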
AutonoML: Towards an Integrated Framework for Autonomous Machine Learning
[article]
2022
arXiv
pre-print
In doing so, we survey developments in the following research areas: hyperparameter optimisation, multi-component models, neural architecture search, automated feature engineering, meta-learning, multi-level ...
Ultimately, we conclude that the notion of architectural integration deserves more discussion, without which the field of automated ML risks stifling both its technical advantages and general uptake. ...
Differentiable ARchiTecture Search (DARTS) is an archetype of this strategy [242], which eschews discretisation and aims to relax network representation into a continuous space (see the sketch after this entry). ...
arXiv:2012.12600v2
fatcat:6rj4ubhcjncvddztjs7tql3itq
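The DARTS description in the last entry (continuous relaxation instead of discrete selection) can be made concrete in a few lines, assuming PyTorch. The sketch below shows a single mixed edge whose output is a softmax-weighted sum of candidate operations, so the architecture weights `alpha` can be trained by gradient descent; the three candidate operations are illustrative, not the full DARTS operation set.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedOp(nn.Module):
    """Continuous relaxation in the spirit of DARTS: rather than picking one
    operation per edge, output a softmax-weighted sum of all candidates, which
    makes the architecture parameters `alpha` differentiable."""

    def __init__(self, channels):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Conv2d(channels, channels, 3, padding=1),  # 3x3 conv
            nn.Conv2d(channels, channels, 5, padding=2),  # 5x5 conv
            nn.Identity(),                                # skip connection
        ])
        # One architecture weight per candidate operation.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x):
        weights = F.softmax(self.alpha, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.ops))

# After search, the edge is typically discretised by keeping argmax(alpha).
x = torch.randn(1, 8, 16, 16)
print(MixedOp(8)(x).shape)  # torch.Size([1, 8, 16, 16])
```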