Hyperparameter Transfer Across Developer Adjustments
[article]
2020
arXiv
pre-print
In this work, we remedy this situation and propose a new research framework: hyperparameter transfer across adjustments (HT-AA). ...
This question poses a challenging problem, as developer adjustments can change which hyperparameter settings perform well, or even the hyperparameter search space itself. ...
Hyperparameter Transfer Across Adjustments: After presenting a broad introduction to the topic, we now provide a detailed description of hyperparameter transfer across developer adjustments (HT-AA). ...
arXiv:2010.13117v1
fatcat:7kelwh3uwvceffakz3c23oqqwa
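To make the HT-AA setting above concrete, here is a minimal, hypothetical sketch of one simple transfer strategy: warm-starting the post-adjustment hyperparameter search with the previous best configuration, projected onto the adjusted search space. The dictionary-based search-space format and all names are assumptions for illustration, not the strategies evaluated in the paper.

```python
# Minimal sketch of one simple HT-AA strategy: warm-start the new
# hyperparameter search with the best configuration found before a
# developer adjustment. Names and search-space format are hypothetical.

def warm_start_config(old_best: dict, new_space: dict) -> dict:
    """Project the previous best configuration onto an adjusted search space.

    Hyperparameters that no longer exist are dropped; newly introduced
    ones fall back to the default of the adjusted space; values that fall
    outside a changed range are clipped to the nearest bound.
    """
    config = {}
    for name, spec in new_space.items():          # spec: {"default": ..., "low": ..., "high": ...}
        if name in old_best:
            value = old_best[name]
            if "low" in spec:                     # numeric hyperparameter: clip into new range
                value = min(max(value, spec["low"]), spec["high"])
            config[name] = value
        else:                                     # hyperparameter added by the adjustment
            config[name] = spec["default"]
    return config

# Example: the adjustment removed "dropout", added "weight_decay",
# and narrowed the learning-rate range.
old_best = {"lr": 0.3, "dropout": 0.1, "batch_size": 128}
new_space = {
    "lr": {"default": 0.01, "low": 1e-4, "high": 0.1},
    "batch_size": {"default": 64, "low": 16, "high": 256},
    "weight_decay": {"default": 1e-4, "low": 0.0, "high": 1e-2},
}
print(warm_start_config(old_best, new_space))
# {'lr': 0.1, 'batch_size': 128, 'weight_decay': 0.0001}
```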
Efficacy of the Image Augmentation Method using CNN Transfer Learning in Identification of Timber Defect
2022
International Journal of Advanced Computer Science and Applications
According to the results, the ResNet50 algorithm, which is based on the transfer learning methodology, outperforms other CNN algorithms (ShuffleNet, AlexNet, MobileNetV2, NASNetMobile, and GoogLeNet ...
an augmentation methodology not only addresses the issue of a limited dataset but also enhances CNN classification output by 5.78%, supported by a t-test that demonstrates a significant difference across ...
TABLE II: Classification performance of CNN algorithms across timber species with multiple hyperparameter settings. ...
doi:10.14569/ijacsa.2022.0130514
fatcat:ayz55suk75dwzdgjmmmbxp3soa
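The record above combines ImageNet transfer learning (ResNet50) with image augmentation for a limited timber-defect dataset. A hedged PyTorch/torchvision sketch of that general recipe follows; the specific augmentations, input size, and class count are illustrative assumptions rather than the paper's exact setup.

```python
# Hedged sketch (PyTorch/torchvision): ImageNet-pretrained ResNet50 with
# simple image augmentation for a small defect dataset. The augmentation
# choices and class count are illustrative, not those used in the paper.
import torch.nn as nn
from torchvision import transforms
from torchvision.models import resnet50, ResNet50_Weights

NUM_DEFECT_CLASSES = 4  # hypothetical number of timber defect classes

train_transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.RandomHorizontalFlip(),       # augmentations that enlarge a
    transforms.RandomRotation(15),           # limited training dataset
    transforms.ColorJitter(brightness=0.2),
    transforms.ToTensor(),
])

model = resnet50(weights=ResNet50_Weights.DEFAULT)              # transfer-learned backbone
model.fc = nn.Linear(model.fc.in_features, NUM_DEFECT_CLASSES)  # new classification head
```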
AgroAId: A Mobile App System for Visual Classification of Plant Species and Diseases Using Deep Learning and TensorFlow Lite
2022
Informatics
learning approach, and hyperparameter optimizations. ...
In this paper, we develop a mobile plant care support system ("AgroAId"), which incorporates computer vision technology to classify a plant's [species–disease] combination from an input plant leaf image ...
and adjusted base network retraining portions (developed using the transfer learning approaches outlined in [20]) to improve on the accuracies of the referenced models. ...
doi:10.3390/informatics9030055
fatcat:ciiu3vwopjbapkympneci4bpdy
Transfer Learning versus Multiagent Learning regarding Distributed Decision-Making in Highway Traffic
2018
International Joint Conference on Artificial Intelligence
In the first step traffic agents are trained by means of a deep reinforcement learning approach, being deployed inside an elitist evolutionary algorithm for hyperparameter search. ...
The resulting architectures and training parameters are then utilized in order to either train a single autonomous traffic agent and transfer the learned weights onto a multi-agent scenario or else to ...
Stefan Elser from Research and Development, as well as the whole Data Science Team at ZF Friedrichshafen AG, for supporting this research. ...
dblp:conf/ijcai/SchuteraG0R18
fatcat:kpbploe6nvaqvnwf6qzlbhgtva
Transfer Learning versus Multi-agent Learning regarding Distributed Decision-Making in Highway Traffic
[article]
2018
arXiv
pre-print
In the first step traffic agents are trained by means of a deep reinforcement learning approach, being deployed inside an elitist evolutionary algorithm for hyperparameter search. ...
The resulting architectures and training parameters are then utilized in order to either train a single autonomous traffic agent and transfer the learned weights onto a multi-agent scenario or else to ...
Stefan Elser from Research and Development, as well as the whole Data Science Team at ZF Friedrichshafen AG, for supporting this research. ...
arXiv:1810.08515v1
fatcat:5geimpxlafddrhifkfadrdom7e
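Both records above wrap deep-RL agent training in an elitist evolutionary algorithm for hyperparameter search. The sketch below shows that outer loop only, assuming a placeholder train_and_evaluate function that trains an agent with a given configuration and returns a fitness score; it is not the paper's implementation.

```python
# Minimal sketch of an elitist evolutionary hyperparameter search:
# evaluate a population of configurations by (briefly) training an RL
# agent, carry the elite over unchanged, and refill the population with
# mutated copies of the elite. `train_and_evaluate` is a hypothetical
# callback standing in for the deep-RL training loop.
import random

def mutate(config, scale=0.2):
    # Multiplicative perturbation of every numeric hyperparameter.
    return {k: v * random.uniform(1 - scale, 1 + scale) for k, v in config.items()}

def elitist_search(init_population, train_and_evaluate, generations=10, n_elite=2):
    population = list(init_population)
    for _ in range(generations):
        scored = sorted(population, key=train_and_evaluate, reverse=True)
        elite = scored[:n_elite]                    # elitism: keep the best configs as-is
        offspring = [mutate(random.choice(elite))   # mutated elites explore nearby settings
                     for _ in range(len(population) - n_elite)]
        population = elite + offspring
    return max(population, key=train_and_evaluate)
```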
Hyperparameter Optimization with Neural Network Pruning
[article]
2022
arXiv
pre-print
As service development using deep learning models has become increasingly competitive, there is strong demand among developers for rapid hyperparameter optimization algorithms. ...
Since the deep learning model is highly dependent on hyperparameters, hyperparameter optimization is essential in developing deep learning model-based applications, even if it takes a long time. ...
This trend is consistent across the three datasets. ...
arXiv:2205.08695v1
fatcat:zfdkblp2mnc7fpnauxxkpxn5ya
Transfer learning reveals sequence determinants of regulatory element accessibility
[article]
2022
bioRxiv
pre-print
Here, we develop ChromTransfer, a transfer learning method that uses a pre-trained, cell-type agnostic model of open chromatin regions as a basis for fine-tuning on regulatory sequences. ...
Hyperparameters were adjusted to yield the best performance on the validation set. ...
As a step towards establishing transfer learning for modeling the regulatory code, we here develop ChromTransfer, a transfer learning scheme for single-task modeling of the DNA sequence determinants of ...
doi:10.1101/2022.08.05.502903
fatcat:b5454nvkgre2pn7loa3lrm26qi
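ChromTransfer fine-tunes a pre-trained, cell-type-agnostic model on regulatory sequences. As a generic illustration of that fine-tuning pattern (not the paper's architecture), the PyTorch sketch below freezes an assumed pre-trained backbone and attaches a fresh task head; the attribute names backbone, head, and head_in_features are hypothetical.

```python
# Generic fine-tuning sketch in PyTorch, assuming a pre-trained sequence
# model `pretrained` with a `.backbone` module and a replaceable `.head`.
# These attribute names are hypothetical placeholders.
import torch.nn as nn

def build_finetune_model(pretrained, n_outputs=1, freeze_backbone=True):
    if freeze_backbone:
        for p in pretrained.backbone.parameters():
            p.requires_grad = False                 # keep pre-trained features fixed
    pretrained.head = nn.Sequential(                # new task-specific output head
        nn.Linear(pretrained.head_in_features, 64),
        nn.ReLU(),
        nn.Linear(64, n_outputs),
    )
    return pretrained
```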
Towards empirical force fields that match experimental observables
[article]
2020
arXiv
pre-print
to provide transferable information. ...
Validation of the selected optimized model against new data that, importantly, has not been used to adjust either parameters or hyperparameters, is best practice in this case as well. ...
arXiv:2004.01630v4
fatcat:erviaidwmrbpfbl2uisrzt565e
Predicting Reaction Conditions from Limited Data through Active Transfer Learning
2022
Chemical Science
Transfer and active learning have the potential to accelerate the development of new chemical reactions, using prior data and new experiments to inform models that adapt to the target area... ...
Moreover, with subsequent collection of target reaction data, different hyperparameter choices may be favorable for the source model. ...
These transfer ROC-AUC scores were compared with scores of models that were prepared appropriately. ...
doi:10.1039/d1sc06932b
pmid:35756521
pmcid:PMC9172577
fatcat:yp2qs24i4ngnlj5op4uzs65tk4
Transfer Learning Across Patient Variations with Hidden Parameter Markov Decision Processes
[article]
2016
arXiv
pre-print
We update the HiP-MDP framework and extend it to more robustly develop personalized medicine strategies for HIV treatment. ...
Hidden Parameter Markov Decision Processes (HiP-MDPs) tackle this transfer-learning problem by embedding these tasks into a low-dimensional space. ...
In Sec. 3 we formalize the adjustments to the HiP-MDP framework and in Sec. 5 we present the performance of the adjusted HiP-MDP on developing personalized treatment strategies within HIV simulators. ...
arXiv:1612.00475v1
fatcat:o3tl6244gzbytod3g6boewcbs4
Width Transfer: On the (In)variance of Width Optimization
[article]
2021
arXiv
pre-print
We show that width transfer works well across various width optimization algorithms and networks. ...
In this work, we propose width transfer, a technique that harnesses the assumption that optimized widths (or channel counts) are regular across sizes and depths. ...
Given width optimization is a fast-developing field, we study the transferability in the solution space to have a more general result. ...
arXiv:2104.13255v1
fatcat:fjboanmix5gmzll57m4ms6yo7u
Towards Automatic Actor-Critic Solutions to Continuous Control
[article]
2021
arXiv
pre-print
Empirically, we show that our agent outperforms well-tuned hyperparameter settings in popular benchmarks from the DeepMind Control Suite. ...
However, these algorithms rely on a number of design tricks and hyperparameters, making their application to new domains difficult and computationally expensive. ...
The highest-fitness members are randomly paired with the lowest-fitness members to transfer and then perturb their hyperparameter values. Network parameters and optimizer states are also transferred. ...
arXiv:2106.08918v2
fatcat:2hy6rrfmoffx3be5xsdgr3krjq
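The last snippet above describes a population-based step in which low-fitness members copy, and then perturb, the hyperparameters of high-fitness members, along with their network weights and optimizer states. A minimal sketch of such an exploit-and-perturb step follows; the member attributes (fitness, hyperparams, network_state, optimizer_state) and the perturbation factors are assumptions for illustration.

```python
# Sketch of the transfer-and-perturb step: each low-fitness member copies
# the hyperparameters, network weights, and optimizer state of a randomly
# paired high-fitness member, then perturbs the copied hyperparameters.
# Member attributes here are hypothetical placeholders.
import copy
import random

def exploit_and_perturb(population, perturb_factors=(0.8, 1.2)):
    ranked = sorted(population, key=lambda m: m.fitness, reverse=True)
    half = len(ranked) // 2
    top, bottom = ranked[:half], ranked[half:]
    random.shuffle(top)
    for loser, winner in zip(bottom, top):          # pair low- with high-fitness members
        loser.hyperparams = {
            k: v * random.choice(perturb_factors)   # perturb the transferred hyperparameters
            for k, v in winner.hyperparams.items()
        }
        loser.network_state = copy.deepcopy(winner.network_state)     # transfer weights
        loser.optimizer_state = copy.deepcopy(winner.optimizer_state) # and optimizer state
    return population
```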
A flexible transfer learning framework for Bayesian optimization with convergence guarantee
2019
Expert systems with applications
We provide a mechanism to compute the noise level from the data to automatically adjust for different relatedness between the source and target tasks. ...
We propose a novel transfer learning method for Bayesian optimization where we leverage the knowledge from an already completed source optimization task for the optimization of a target task. ...
Addressing this, we develop a new framework for transfer learning. ...
doi:10.1016/j.eswa.2018.08.023
fatcat:jtoe44k2irfidltug5om5eawfe
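One hedged way to read the snippets above is as a single Gaussian process fit jointly on source and target observations, with an extra noise term on the source points that grows when the tasks are less related. The scikit-learn sketch below implements only that general idea; how the paper estimates the noise level from data is not shown here, and source_noise is a hypothetical stand-in.

```python
# Hedged sketch: one GP over source + target observations, with larger
# per-observation noise on the source points so weakly related source
# data is down-weighted. `source_noise` is a hypothetical stand-in for
# the data-driven noise estimate described in the paper.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def fit_transfer_gp(X_src, y_src, X_tgt, y_tgt, source_noise=0.5, target_noise=1e-4):
    X = np.vstack([X_src, X_tgt])
    y = np.concatenate([y_src, y_tgt])
    # Per-observation noise: large for source points, small for target points.
    alpha = np.concatenate([
        np.full(len(X_src), source_noise),
        np.full(len(X_tgt), target_noise),
    ])
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), alpha=alpha, normalize_y=True)
    gp.fit(X, y)
    return gp
```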
Adversarial Learning for Zero-Shot Stance Detection on Social Media
[article]
2021
arXiv
pre-print
In addition, we extend zero-shot stance detection to new topics, highlighting future directions for zero-shot transfer. ...
In this work, we propose a new model for zero-shot stance detection on Twitter that uses adversarial learning to generalize across topics. ...
Hyperparameters: We tune the hyperparameters for our adversarial model using uniform sampling on the development set with 20 search trials. ...
arXiv:2105.06603v1
fatcat:qzn7hhuzp5efpintjpm743adya
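The hyperparameter snippet above (uniform sampling with 20 search trials, selected on the development set) corresponds to plain random search. A minimal sketch is below; the particular search ranges and the train_and_score callback are hypothetical, not taken from the paper.

```python
# Sketch of hyperparameter tuning by uniform random sampling with 20
# trials, selecting by development-set score. Search ranges and the
# `train_and_score` callback are hypothetical.
import random

def random_search(train_and_score, n_trials=20, seed=0):
    rng = random.Random(seed)
    best_config, best_score = None, float("-inf")
    for _ in range(n_trials):
        config = {
            "lr": 10 ** rng.uniform(-5, -3),        # log-uniform learning rate
            "dropout": rng.uniform(0.0, 0.5),
            "adv_weight": rng.uniform(0.1, 1.0),    # weight of the adversarial loss term
        }
        score = train_and_score(config)             # e.g. F1 on the development set
        if score > best_score:
            best_config, best_score = config, score
    return best_config, best_score
```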
Automated Detection of Greenhouse Structures Using Cascade Mask R-CNN
2022
Applied Sciences
Our proposed model is regional-based because it was optimized for the Republic of Korea via transfer learning and hyperparameter tuning, which improved the efficiency of the automated detection of greenhouse ...
Similarly, the F1-score of the proposed Cascade Mask R-CNN model was 62.07, which outperformed those of the baseline Mask R-CNN and the Mask R-CNN with hyperparameter tuning and transfer learning considered ...
Model | mAP | F1-Score
baseline Mask R-CNN | 70.77 | 52.33
Mask R-CNN (hyperparameter tuning and transfer learning) | 81.70 | 59.13
Cascade Mask R-CNN (hyperparameter tuning and transfer learning) | 83.60 | 62.07
...
doi:10.3390/app12115553
fatcat:yllauvmdqbealf6fodcrjvrppu
Showing results 1 — 15 out of 12,153 results