76,266 Hits in 9.7 sec

An effective algorithm for hyperparameter optimization of neural networks [article]

Gonzalo Diaz, Achille Fokoue, Giacomo Nannicini, Horst Samulowitz
2017 arXiv   pre-print
A major challenge in designing neural network (NN) systems is to determine the best structure and parameters for the network given the data for the machine learning problem at hand.  ...  This paper addresses the problem of choosing appropriate parameters for the NN by formulating it as a box-constrained mathematical optimization problem, and applying a derivative-free optimization tool  ...  In the fourth section, we discuss how derivative-free optimization can be applied to hyperparameter optimization, more specifically in the context of neural networks, and provide an overview of our approach  ... 
arXiv:1705.08520v1 fatcat:zmmfiiicozau5h2ocbgxqbn27u
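
The box-constrained formulation above can be illustrated with a minimal derivative-free random search (a sketch only: the paper applies a dedicated derivative-free optimization tool, and the hyperparameter names, bounds, and toy error surface below are assumptions, not taken from the paper):

```python
import random

# Hypothetical box constraints for three hyperparameters.
BOUNDS = {
    "lr": (1e-4, 1e-1),
    "hidden": (16, 256),   # continuous relaxation; round before building the NN
    "dropout": (0.0, 0.5),
}

def sample_config(rng):
    """Draw one configuration uniformly from the box."""
    return {name: rng.uniform(lo, hi) for name, (lo, hi) in BOUNDS.items()}

def validation_error(cfg):
    """Stand-in for training the NN and measuring validation error."""
    # A smooth toy surface with a known minimum; purely illustrative.
    return (cfg["lr"] - 0.01) ** 2 + (cfg["hidden"] / 256 - 0.5) ** 2 + cfg["dropout"] ** 2

def random_search(budget=200, seed=0):
    """Derivative-free search: evaluate `budget` points, keep the best."""
    rng = random.Random(seed)
    best_cfg, best_err = None, float("inf")
    for _ in range(budget):
        cfg = sample_config(rng)
        err = validation_error(cfg)
        if err < best_err:
            best_cfg, best_err = cfg, err
    return best_cfg, best_err

best_cfg, best_err = random_search()
```

No gradients of the objective are ever used, which is what makes the box-constrained formulation amenable to derivative-free tools.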

Hyperparameter Optimization in Binary Communication Networks for Neuromorphic Deployment [article]

Maryam Parsa, Catherine D. Schuman, Prasanna Date, Derek C. Rose, Bill Kay, J. Parker Mitchell, Steven R. Young, Ryan Dellana, William Severa, Thomas E. Potok, Kaushik Roy
2020 arXiv   pre-print
In this work, we introduce a Bayesian approach for optimizing the hyperparameters of an algorithm for training binary communication networks that can be deployed to neuromorphic hardware.  ...  We show that by optimizing the hyperparameters on this algorithm for each dataset, we can achieve improvements in accuracy over the previous state-of-the-art for this algorithm on each dataset (by up to  ...  The main contributions of this work are: • A demonstration of the effect of hyperparameters on a training algorithm (Whetstone) that trains neural networks with binary communication. • A Bayesian optimization  ... 
arXiv:2005.04171v1 fatcat:rovwd5e24jhyvabh7pf2hobrzq

Embryo Evaluation Based on ResNet with AdaptiveGA-optimized Hyperparameters

Wenju Zhou, Xiaofei Han, Yuan Xu, Rongfei Chen, Zhenbo Zhang
2022 Journal of Internet Technology  
In addition, an adaptive genetic algorithm is adopted to search for optimal hyperparameters.  ...  In this paper, a residual neural network optimized by the adaptive genetic algorithm is proposed to evaluate embryos.  ...  The process of genetic algorithm optimization Encoding is the primary problem for the genetic algorithm to optimize neural network hyperparameters [27] .  ... 
doi:10.53106/160792642022052303011 fatcat:bv23zhunergtvavhx2apwlqaly
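
The last snippet singles out encoding as the primary problem when a genetic algorithm optimizes network hyperparameters. A minimal sketch of one common scheme, index-based encoding over discrete value lists (the hyperparameter names and ranges are illustrative assumptions, not the paper's):

```python
# Each gene is an index into a list of allowed values, so crossover and
# mutation always produce valid configurations.
SEARCH_SPACE = {
    "learning_rate": [1e-4, 1e-3, 1e-2, 1e-1],
    "batch_size": [16, 32, 64, 128],
    "num_res_blocks": [2, 3, 4, 6],
}
GENE_ORDER = list(SEARCH_SPACE)

def encode(config):
    """Configuration dict -> chromosome (list of indices)."""
    return [SEARCH_SPACE[g].index(config[g]) for g in GENE_ORDER]

def decode(chromosome):
    """Chromosome -> configuration dict."""
    return {g: SEARCH_SPACE[g][i] for g, i in zip(GENE_ORDER, chromosome)}

cfg = {"learning_rate": 1e-3, "batch_size": 64, "num_res_blocks": 4}
chrom = encode(cfg)  # -> [1, 2, 2]
```

Because genes are indices rather than raw values, genetic operators never need to know the semantics of each hyperparameter.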

Deep Genetic Network [article]

Siddhartha Dhar Choudhury, Shashank Pandey, Kunal Mehrotra
2019 arXiv   pre-print
Deep Genetic Net uses genetic algorithms along with deep neural networks to address the hyperparameter optimization problem; this approach uses ideas like mating and mutation, which are key to genetic algorithms  ...  Using genetic algorithms for this problem proved to work exceptionally well when given enough time to train the network.  ...  The choice of scheme will be evident from the experiments.  ... 
arXiv:1811.01845v2 fatcat:gr5cdickuvf6bo4ml7dhcm4hcm
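
The mating and mutation ideas mentioned in the abstract can be sketched as follows (a generic single-point crossover and per-gene resampling over index-encoded chromosomes; the operators are assumed, not taken from the paper):

```python
import random

def crossover(parent_a, parent_b, rng):
    """Single-point 'mating': child takes a prefix from one parent, suffix from the other."""
    point = rng.randrange(1, len(parent_a))
    return parent_a[:point] + parent_b[point:]

def mutate(chromosome, n_values, rate, rng):
    """With probability `rate`, resample a gene within its allowed index range."""
    return [
        rng.randrange(n_values[i]) if rng.random() < rate else gene
        for i, gene in enumerate(chromosome)
    ]

rng = random.Random(42)
child = crossover([0, 0, 0, 0], [3, 3, 3, 3], rng)
mutant = mutate(child, n_values=[4, 4, 4, 4], rate=0.1, rng=rng)
```

Crossover recombines configurations that already scored well, while mutation injects fresh values so the search does not stall on one region of the space.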

Pairwise Neural Networks (PairNets) with Low Memory for Fast On-Device Applications [article]

Luna M. Zhang
2020 arXiv   pre-print
Since a large number of hyperparameters of a deep neural network, such as a convolutional neural network, occupy much memory, a memory-inefficient deep learning model is not ideal for real-time Internet  ...  A traditional artificial neural network (ANN) is normally trained slowly by a gradient descent algorithm, such as the backpropagation algorithm, since a large number of hyperparameters of the ANN need  ...  Acknowledgments The author would like to thank the reviewers very much for their valuable comments that help improve the quality of this paper.  ... 
arXiv:2002.04458v1 fatcat:pmparjnmlrcbnidqsxltmu3jni

Handwritten Digit Recognition: Hyperparameters-Based Analysis

Saleh Albahli, Fatimah Alhassan, Waleed Albattah, Rehan Ullah Khan
2020 Applied Sciences  
Such an evaluation will help in selecting the optimized values of hyperparameters for similar tasks.  ...  In this paper, neural network-based architectures are tested based on altering the values of hyperparameters for handwritten-based digit recognition.  ...  Acknowledgments: We would like to thank the Deanship of Scientific Research, Qassim University for funding the publication of this project.  ... 
doi:10.3390/app10175988 fatcat:sfw44dpzj5bmjd2isjbkehv2qm

Automating Analogue AI Chip Design with Genetic Search

Olga Krestinskaya, Khaled N. Salama, Alex P. James
2020 Advanced Intelligent Systems  
The classical software-based approaches for hyperparameter optimization of deep neural networks are resource-consuming and complicated.  ...  We propose to bring automation into circuit design by using the genetic algorithm for the selection of hyperparameters against hardware-specific objectives.  ...  Acknowledgements Research reported in this publication was supported by the AI Initiative, King Abdullah University of Science and Technology (KAUST).  ... 
doi:10.1002/aisy.202000075 fatcat:3hnp4lwc65ahffnr76cxbzh2le

Convolutional neural network hyperparameter optimization applied to land cover classification

Vladyslav Yaloveha, Andrii Podorozhniak, Heorhii Kuchuk
2022 Radioelectronic and Computer Systems  
The previous study proposed a convolutional deep learning neural network for solving land cover classification problems in the EuroSAT dataset.  ...  Therefore, the hyperparameter optimization problem becomes a key issue in a deep learning algorithm.  ...  For given input data X, hyperparameters n_1, n_2, ..., n_m, and algorithm A, the hyperparameter optimization problem aims to find an optimal configuration of the m hyperparameters, which maximizes the performance of  ... 
doi:10.32620/reks.2022.1.09 fatcat:tw4l4qqo6zdiffywsclaapdzda

Recombination of Artificial Neural Networks [article]

Aaron Vose, Jacob Balma, Alex Heye, Alessandro Rigazzi, Charles Siegel, Diana Moise, Benjamin Robbins, Rangan Sukumar
2019 arXiv   pre-print
We propose a genetic algorithm (GA) for hyperparameter optimization of artificial neural networks which includes chromosomal crossover as well as a decoupling of parameters (i.e., weights and biases) from  ...  Our methods improve final accuracy as well as time to fixed accuracy on a wide range of deep neural network architectures including convolutional neural networks, recurrent neural networks, dense neural  ...  The optimization of an RNN with PBT highlights the need for efficient management of checkpoint data, as RNN model sizes are typically larger than those of convolutional neural networks.  ... 
arXiv:1901.03900v1 fatcat:kcxcxrd5lraafeyymfu3pnw5om

PairNets: Novel Fast Shallow Artificial Neural Networks on Partitioned Subspaces [article]

Luna M. Zhang
2020 arXiv   pre-print
Traditionally, an artificial neural network (ANN) is trained slowly by a gradient descent algorithm such as the backpropagation algorithm since a large number of hyperparameters of the ANN need to be fine-tuned  ...  To highly speed up training, we created a novel shallow 4-layer ANN called "Pairwise Neural Network" ("PairNet") with high-speed hyperparameter optimization.  ...  algorithms [19] to try to find optimal hyperparameters of an ANN.  ... 
arXiv:2001.08886v1 fatcat:sxripvwndncvfa2tvbswonxkru

Development of a Multilayer Perceptron Neural Network for Optimal Predictive Modeling in Urban Microcellular Radio Environments

Joseph Isabona, Agbotiname Lucky Imoize, Stephen Ojo, Olukayode Karunwi, Yongsung Kim, Cheng-Chi Lee, Chun-Ta Li
2022 Applied Sciences  
In detail, the developed MLP model's prediction accuracy using different learning and training algorithms, with the best tuned values of the hyperparameters, has been applied for extensive path loss  ...  This work develops a distinctive multi-layer perceptron (MLP) neural network-based path loss model with a well-structured implementation network architecture, empowered with the grid search-based hyperparameter  ...  Conflicts of Interest: The authors declare no conflict of interest.  ... 
doi:10.3390/app12115713 fatcat:b434kacgkbcn3pluuzw7lbcv64
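
The grid search-based tuning mentioned in the abstract exhaustively scores every combination of candidate values. A self-contained sketch (the grid values and the stand-in RMSE function are illustrative assumptions, not the paper's setup):

```python
import itertools

# Illustrative candidate values; the paper's actual grid is not shown here.
GRID = {
    "hidden_neurons": [10, 20, 40],
    "learning_rate": [0.001, 0.01, 0.1],
}

def path_loss_rmse(cfg):
    """Stand-in for training the MLP and computing prediction RMSE."""
    return abs(cfg["hidden_neurons"] - 20) * 0.01 + abs(cfg["learning_rate"] - 0.01)

def grid_search(grid, score):
    """Exhaustively evaluate the Cartesian product of all value lists."""
    names = list(grid)
    best_cfg, best_score = None, float("inf")
    for values in itertools.product(*(grid[n] for n in names)):
        cfg = dict(zip(names, values))
        s = score(cfg)
        if s < best_score:
            best_cfg, best_score = cfg, s
    return best_cfg, best_score

best_cfg, best_rmse = grid_search(GRID, path_loss_rmse)
```

The cost grows multiplicatively with each added hyperparameter, which is why grid search is usually reserved for small, low-dimensional spaces like this one.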

Sample-Efficient Automated Deep Reinforcement Learning [article]

Jörg K.H. Franke, Gregor Köhler, André Biedenkapp, Frank Hutter
2021 arXiv   pre-print
needed for meta-optimization by up to an order of magnitude compared to population-based training.  ...  In this framework, we optimize the hyperparameters and also the neural architecture while simultaneously training the agent.  ...  ACKNOWLEDGMENT The authors acknowledge funding by the Robert Bosch GmbH, as well as support by the state of Baden-Württemberg through bwHPC and the German Research Foundation (DFG) through grant no INST  ... 
arXiv:2009.01555v3 fatcat:ohn7ugowtrgx7erfbo36ibyhay
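
For context on the population-based training (PBT) baseline the abstract compares against, one exploit/explore step can be sketched as follows (heavily simplified: real PBT also copies network weights between members, which this toy version omits):

```python
import random

def pbt_step(population, scores, rng, perturb=0.2):
    """One exploit/explore step of population-based training (simplified).

    Bottom-half members copy a random top-half member's hyperparameters
    (exploit) and then perturb them multiplicatively (explore)."""
    order = sorted(range(len(population)), key=lambda i: scores[i], reverse=True)
    half = len(population) // 2
    top, bottom = order[:half], order[half:]
    new_pop = list(population)
    for i in bottom:
        src = population[rng.choice(top)]
        factor = 1 + perturb if rng.random() < 0.5 else 1 - perturb
        new_pop[i] = {k: v * factor for k, v in src.items()}
    return new_pop

rng = random.Random(0)
pop = [{"lr": 10 ** rng.uniform(-4, -1)} for _ in range(8)]
scores = [-abs(m["lr"] - 0.01) for m in pop]  # toy score: closer to lr=0.01 is better
new_pop = pbt_step(pop, scores, rng)
```

Because hyperparameters are adjusted while the agents keep training, PBT interleaves meta-optimization with training rather than restarting runs from scratch.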

BO-ALLCNN: Bayesian-Based Optimized CNN for Acute Lymphoblastic Leukemia Detection in Microscopic Blood Smear Images

Ghada Atteia, Amel A. Alhussan, Nagwan Abdel Samee
2022 Sensors  
The Bayesian optimization technique adopts an informed iterative procedure to search the hyperparameter space for the optimal set of network hyperparameters that minimizes an objective error function.  ...  In this study, a new Bayesian-based optimized convolutional neural network (CNN) is introduced for the detection of ALL in microscopic smear images.  ...  Conflicts of Interest: The authors declare no conflict of interest.  ... 
doi:10.3390/s22155520 pmid:35898023 pmcid:PMC9329984 fatcat:zml2hf73ofebjcxgklfgqnxaru
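
The informed iterative procedure described above can be sketched with a minimal Gaussian-process surrogate and an expected-improvement acquisition over a single hyperparameter (numpy only; the RBF kernel, noise level, and toy error curve are assumptions, not the paper's setup):

```python
import math
import numpy as np

def rbf_kernel(a, b, length=0.2):
    """Squared-exponential kernel between two 1-D point sets."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(x_train, y_train, x_query, noise=1e-6):
    """GP posterior mean and standard deviation at the query points."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf_kernel(x_train, x_query)
    solve = np.linalg.solve(K, np.column_stack([y_train, Ks]))
    mu = Ks.T @ solve[:, 0]
    var = 1.0 - np.einsum("ij,ij->j", Ks, solve[:, 1:])
    return mu, np.sqrt(np.clip(var, 1e-12, None))

def expected_improvement(mu, sigma, best):
    """EI for minimization: E[max(best - f(x), 0)] under the GP posterior."""
    z = (best - mu) / sigma
    cdf = 0.5 * (1.0 + np.vectorize(math.erf)(z / math.sqrt(2)))
    pdf = np.exp(-0.5 * z ** 2) / math.sqrt(2 * math.pi)
    return (best - mu) * cdf + sigma * pdf

def objective(x):
    """Stand-in for the CNN's validation error as a function of one hyperparameter."""
    return (x - 0.3) ** 2

# Informed iterative loop: fit the surrogate, then evaluate where EI peaks.
grid = np.linspace(0.0, 1.0, 101)
xs, ys = [0.0, 1.0], [objective(0.0), objective(1.0)]
for _ in range(10):
    mu, sigma = gp_posterior(np.array(xs), np.array(ys), grid)
    x_next = float(grid[np.argmax(expected_improvement(mu, sigma, min(ys)))])
    xs.append(x_next)
    ys.append(objective(x_next))
best_x = xs[int(np.argmin(ys))]
```

Each iteration refits the surrogate to all observations and spends the next expensive evaluation where expected improvement is highest, trading off the posterior mean against its uncertainty.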

Hyperparameter optimization using custom genetic algorithm for classification of benign and malicious traffic on internet of things–23 dataset

Karthikayini Thavasimani, Nuggehalli Kasturirangan Srinath
2022 International Journal of Power Electronics and Drive Systems (IJPEDS)  
In our paper, we propose a custom genetic algorithm to auto-tune the hyperparameters of the deep learning sequential model to classify benign and malicious traffic from the internet of things-23 dataset captured  ...  Hyperparameter optimization is one of the main challenges in deep learning despite its successful exploration in many areas such as image classification, speech recognition, natural language processing  ...  Repeat the process for twenty generations to obtain the optimal child. The final output is the optimal hyperparameters of the neural network model.  ... 
doi:10.11591/ijece.v12i4.pp4031-4041 fatcat:sv3czda66fh4nad3nnrz5talw4
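
The twenty-generation loop described in the last snippet can be sketched end to end (a generic elitist GA with a toy fitness function standing in for model accuracy; nothing here reproduces the paper's actual operators or dataset):

```python
import random

def evolve(fitness, pop_size=10, genome_len=5, n_choices=4, generations=20, seed=1):
    """Toy generational GA: elitist selection, single-point crossover,
    per-gene mutation, run for a fixed number of generations."""
    rng = random.Random(seed)
    pop = [[rng.randrange(n_choices) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]           # keep the fitter half
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            point = rng.randrange(1, genome_len)
            child = a[:point] + b[point:]        # crossover
            child = [rng.randrange(n_choices) if rng.random() < 0.1 else g
                     for g in child]             # mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

# Toy fitness: hypothetical accuracy peaks when every gene takes its maximum value.
best = evolve(lambda c: sum(c))
```

Keeping the fitter half unmodified (elitism) guarantees the best configuration found so far survives each of the twenty generations.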

Multi-level CNN for lung nodule classification with Gaussian Process assisted hyperparameter optimization [article]

Miao Zhang, Huiqi Li, Juan Lyu, Sai Ho Ling, Steven Su
2019 arXiv   pre-print
Bayesian optimization has been recently introduced for automatically searching for optimal hyperparameter configurations of DNNs.  ...  Our algorithm searches the surrogate for the optimal setting via a hyperparameter-importance-based evolutionary strategy, and the experiments demonstrate our algorithm outperforms manual tuning and well-established  ...  hyperparameter optimization of deep neural networks.  ... 
arXiv:1901.00276v1 fatcat:xrfnj5mqsjaipjfw3deywxofue
Showing results 1 — 15 out of 76,266 results