18,577 Hits in 2.8 sec

Incremental Training of Deep Convolutional Neural Networks [article]

Roxana Istrate, Adelmo Cristiano Innocenza Malossi, Costas Bekas, Dimitrios Nikolopoulos
2018 arXiv   pre-print
We propose an incremental training method that partitions the original network into sub-networks, which are then gradually incorporated in the running network during the training process.  ...  We demonstrate that our incremental approach reaches the reference network baseline accuracy.  ...  In this paper we present our contribution towards gradually training deep state-of-the-art convolutional neural networks (CNNs) with no loss in accuracy.  ... 
arXiv:1803.10232v1 fatcat:3jvg3er6x5hdpdde7rv6a5rhxm
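
The partition-and-grow idea in this abstract can be illustrated with a short PyTorch sketch: sub-networks are held in a list and enabled one at a time during training. The block sizes and the growth schedule below are illustrative assumptions, not the authors' configuration.

```python
import torch
import torch.nn as nn

class GrowingCNN(nn.Module):
    """Toy network whose convolutional sub-blocks are enabled one at a time."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.blocks = nn.ModuleList([
            nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU()),
            nn.Sequential(nn.Conv2d(16, 16, 3, padding=1), nn.ReLU()),
            nn.Sequential(nn.Conv2d(16, 16, 3, padding=1), nn.ReLU()),
        ])
        # One 1x1 head per depth, so a partially grown network still emits logits.
        self.heads = nn.ModuleList([nn.Conv2d(16, num_classes, 1) for _ in self.blocks])
        self.active = 1  # start training with only the first sub-network

    def grow(self):
        if self.active < len(self.blocks):
            self.active += 1  # incorporate the next sub-network into the running net

    def forward(self, x):
        for block in self.blocks[:self.active]:
            x = block(x)
        return self.heads[self.active - 1](x).mean(dim=(2, 3))  # global average pool

model = GrowingCNN()
opt = torch.optim.SGD(model.parameters(), lr=0.01)
for epoch in range(6):
    if epoch in (2, 4):                    # assumed growth schedule
        model.grow()
    x = torch.randn(8, 3, 32, 32)          # stand-in batch
    y = torch.randint(0, 10, (8,))
    loss = nn.functional.cross_entropy(model(x), y)
    opt.zero_grad(); loss.backward(); opt.step()
```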

Learning Automata based Incremental Learning Method for Deep Neural Networks

Haonan Guo, Shilin Wang, Jianxun Fan, Shenghong Li
2019 IEEE Access  
In this paper, we propose an effective incremental training method based on learning automata for deep neural networks.  ...  The experiments on MNIST and CIFAR-100 demonstrated that our method can be implemented for deep neural models over a long sequence of incremental training stages and can achieve superior performance to  ...  RESULTS FOR CONVOLUTIONAL NEURAL NETWORK We evaluate the efficiency of our proposed method for a convolutional neural network on the CIFAR-100 dataset.  ... 
doi:10.1109/access.2019.2907645 fatcat:4vmpbf7xzvf5zdkr4vxddsjkgu

Deep Incremental Boosting [article]

Alan Mosca, George D Magoulas
2017 arXiv   pre-print
We draw inspiration from Transfer of Learning approaches to reduce the start-up time to training each incremental Ensemble member.  ...  This paper introduces Deep Incremental Boosting, a new technique derived from AdaBoost, specifically adapted to work with Deep Learning methods, that reduces the required training time and improves generalisation  ...  Concluding Remarks In this paper we have introduced a new algorithm, called Deep Incremental Boosting, which combines the power of AdaBoost, Deep Neural Networks and Transfer of Learning principles, in  ... 
arXiv:1708.03704v1 fatcat:pyr5jzchcfedhpliiqjp62ax4m
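
As a rough illustration of Deep Incremental Boosting's two ingredients, AdaBoost-style sample reweighting plus warm-starting each member from the previous one's weights, here is a hedged PyTorch sketch. The update rule shown is standard binary AdaBoost and the warm-start epoch counts are assumptions; the paper's exact procedure may differ.

```python
import copy
import torch
import torch.nn as nn

def train_member(net, x, y, w, epochs):
    """Train one ensemble member on importance-weighted data."""
    opt = torch.optim.SGD(net.parameters(), lr=0.05)
    for _ in range(epochs):
        losses = nn.functional.cross_entropy(net(x), y, reduction="none")
        loss = (w * losses).sum()
        opt.zero_grad(); loss.backward(); opt.step()
    return net

x = torch.randn(64, 20)
y = torch.randint(0, 2, (64,))
w = torch.full((64,), 1.0 / 64)                # AdaBoost sample weights
net = nn.Sequential(nn.Linear(20, 32), nn.ReLU(), nn.Linear(32, 2))
members, alphas = [], []
for t in range(3):
    # Transfer of learning: warm-start each member from the previous one's
    # weights, so later members need far fewer epochs than the first.
    net = train_member(copy.deepcopy(net), x, y, w, epochs=20 if t == 0 else 5)
    with torch.no_grad():
        miss = (net(x).argmax(1) != y).float()
    err = (w * miss).sum().clamp(1e-8, 1 - 1e-8)
    alpha = 0.5 * torch.log((1 - err) / err)   # member vote weight
    w = w * torch.exp(alpha * (2 * miss - 1))  # up-weight the mistakes
    w = w / w.sum()                            # re-normalize
    members.append(net); alphas.append(alpha)
```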

Learning Compact Convolutional Neural Networks with Nested Dropout [article]

Chelsea Finn, Lisa Anne Hendricks, Trevor Darrell
2015 arXiv   pre-print
We explore the impact of nested dropout on the convolutional layers in a CNN trained by backpropagation, investigating whether nested dropout can provide a simple and systematic way to determine the optimal  ...  However, it has only been applied to training fully-connected autoencoders in an unsupervised setting.  ...  Additionally, we hope that, in the future, ordering parameters may provide insights into optimization of deep convolutional neural networks and how the network architecture impacts performance.  ... 
arXiv:1412.7155v4 fatcat:k43olm4x3ra5fj4t4qf6cwoega
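
Nested dropout draws a truncation index and zeroes every unit above it, which forces an importance ordering on the representation. A minimal sketch applied to a convolution's output channels follows; the geometric truncation distribution and the modulo-based cutoff are simplifying assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn as nn

class NestedChannelDropout(nn.Module):
    """Zero out all channels above a randomly drawn index b, so earlier
    channels must carry the most information (importance ordering)."""
    def __init__(self, p=0.1):
        super().__init__()
        self.p = p  # parameter of the geometric truncation distribution

    def forward(self, x):
        if not self.training:
            return x  # keep every channel at test time
        c = x.size(1)
        # Sample the keep-index b; modulo is a crude truncation to [0, c).
        b = int(torch.distributions.Geometric(self.p).sample().item()) % c
        mask = torch.zeros(c, device=x.device)
        mask[: b + 1] = 1.0
        return x * mask.view(1, c, 1, 1)

layer = nn.Sequential(nn.Conv2d(3, 32, 3, padding=1), NestedChannelDropout())
out = layer(torch.randn(4, 3, 28, 28))   # channels b+1..31 are zeroed
```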

Handwriting Recognition using Deep Learning based Convolutional Neural Network

2019 International Journal of Recent Technology and Engineering  
This paper focuses on exploring deep-learning-based convolutional neural networks (CNNs) for the recognition of handwritten script.  ...  Handwriting is a learned skill that has been an excellent means of communication and documentation for thousands of years.  ...  Here, a convolutional neural network is trained and tested on the IAM Handwriting Dataset.  ... 
doi:10.35940/ijrte.d7811.118419 fatcat:hm7mrih6yrgmhb6tdfmmj5qpzi
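
For orientation, a generic handwriting-classification CNN of the kind such papers describe might look as follows in PyTorch; the layer sizes, input resolution, and class count are placeholders, not this paper's architecture.

```python
import torch
import torch.nn as nn

# Illustrative character-classification CNN: two conv/pool stages followed
# by a fully connected classifier, as is typical for handwriting recognition.
model = nn.Sequential(
    nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(64 * 8 * 8, 128), nn.ReLU(),
    nn.Linear(128, 26),                      # e.g. 26 character classes
)
logits = model(torch.randn(1, 1, 32, 32))    # 32x32 grayscale input
```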

A Convolutional Neural Network with Incremental Learning

Haruka Tomimori, Kui-Ting Chen, Takaaki Baba
2017 Journal of Signal Processing  
Nowadays, a convolutional neural network (CNN) is considered a deep learning method for image and voice recognition.  ...  Training such a huge neural network on computers requires a tremendous amount of time.  ...  Acknowledgment This work was supported by the Japan Society for the Promotion of Science (JSPS) through a Grant-in-Aid for Scientific Research (No. 26280017) and a Grant-in-Aid for Young Scientists (No. 15K21435  ... 
doi:10.2299/jsp.21.155 fatcat:76ej3xdnlfcfxamfltd77lzjay

Rethinking the Number of Channels for the Convolutional Neural Network [article]

Hui Zhu, Zhulin An, Chuanguang Yang, Xiaolong Hu, Kaiqiang Xu, Yongjun Xu
2019 arXiv   pre-print
In particular, our method is suitable for rapidly exploring the number of channels of almost any convolutional neural network.  ...  The latest algorithms for automatic neural architecture search perform remarkably, but few of them can effectively design the number of channels for convolutional neural networks while consuming less computational  ...  for convolutional neural networks  ... 
arXiv:1909.01861v1 fatcat:aws36vcelray7ptlz75hl6xdg4

Dual Memory Architectures for Fast Deep Learning of Stream Data via an Online-Incremental-Transfer Strategy [article]

Sang-Woo Lee, Min-Oh Heo, Jiwon Kim, Jeonghee Kim, Byoung-Tak Zhang
2015 arXiv   pre-print
Unfortunately, deep neural network learning through classical online and incremental methods does not work well in both theory and practice.  ...  The online learning of deep neural networks is an interesting problem of machine learning because, for example, major IT companies want to manage the information of the massive data uploaded to the web  ...  We run various sizes of deep convolutional neural networks for each dataset using the demo code in MatConvNet, which is a MATLAB toolbox of convolutional neural networks (Vedaldi & Lenc, 2014).  ... 
arXiv:1506.04477v1 fatcat:ryij5slnsjhi7acauu3bhnfkry

Bayesian Incremental Learning for Deep Neural Networks [article]

Max Kochurov, Timur Garipov, Dmitry Podoprikhin, Dmitry Molchanov, Arsenii Ashukha, Dmitry Vetrov
2018 arXiv   pre-print
Particularly in the case of deep neural networks, it may be too expensive to train the model from scratch each time, so one would rather use a previously learned model and the new data to improve performance  ...  However, deep neural networks are prone to getting stuck in a suboptimal solution when trained on only new data as compared to the full dataset.  ...  Louizos & Welling (2017) successfully employed it to train Bayesian deep neural networks.  ... 
arXiv:1802.07329v3 fatcat:4uwh6a77jfclpoumihyt2m5aiu
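
For context, the recursion underlying Bayesian incremental learning is that the posterior obtained on the first batch of data serves as the prior when the new batch arrives:

```latex
% Sequential Bayesian update: the old posterior becomes the new prior.
p(\theta \mid D_1) \propto p(D_1 \mid \theta)\, p(\theta), \qquad
p(\theta \mid D_1, D_2) \propto p(D_2 \mid \theta)\, p(\theta \mid D_1).
```

For deep networks the exact posterior over the weights is intractable, so work in this line (including the Louizos & Welling approach the snippet cites) replaces p(θ | D_1) with a variational approximation q(θ) learned on the old data.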

Editorial: Special Issue on Compact Deep Neural Networks With Industrial Applications

Lixin Fan, Diana Marculescu, Werner Bailer, Yurong Chen
2020 IEEE Journal on Selected Topics in Signal Processing  
Their recent success is based on the feasibility of processing much larger and more complex neural networks (deep neural networks, DNNs) than in the past, and on the availability of large-scale training data sets  ...  In "Structured Pruning for Efficient Convolutional Neural Networks via Incremental Regularization", Wang et al. propose a novel regularization-based pruning method, named IncReg, to incrementally assign  ... 
doi:10.1109/jstsp.2020.3006323 fatcat:d75ni7ocajb4pemovq2l3ton4i
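
The IncReg idea mentioned in the editorial, incrementally ramping up a structured (group) penalty on weak channels, can be sketched as follows. The channel-importance proxy and the ramp schedule here are assumptions for illustration, not the method's exact assignment rule.

```python
import torch
import torch.nn as nn

conv = nn.Conv2d(16, 32, 3, padding=1)
# One regularization factor per output channel, grown over training so that
# unimportant channels are pushed toward zero gradually rather than at once.
lam = torch.zeros(conv.out_channels)

def incremental_group_penalty(conv, lam, step=1e-4, keep=24):
    # Channel importance proxy: L2 norm of each output-channel's filter.
    norms = conv.weight.detach().flatten(1).norm(dim=1)
    weak = norms.argsort()[: conv.out_channels - keep]
    lam[weak] += step                      # incrementally raise the penalty
    group_l2 = conv.weight.flatten(1).norm(dim=1)
    return (lam * group_l2).sum()          # add this to the task loss

x = torch.randn(2, 16, 8, 8)
task_loss = conv(x).pow(2).mean()          # stand-in task loss
loss = task_loss + incremental_group_penalty(conv, lam)
loss.backward()
```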

A Study of Incremental Learning Model Using Deep Neural Network

2021 International Journal of Advanced Trends in Computer Science and Engineering  
Deep learning has arrived with a great number of advances in the research of machine learning and its models.  ...  is computationally costly; therefore, to avoid re-training the model, we add the new samples on top of the features previously learnt by the pre-trained model, an approach called Incremental Learning.  ...  Therefore, a deep convolutional neural network (CNN) model is trained incrementally as new classes are added to the existing data [2].  ... 
doi:10.30534/ijatcse/2021/281022021 fatcat:k3i3q5j77bcjrdnlzmhjoon4yy
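
A minimal sketch of the strategy this abstract describes: reuse the previously learnt features of a pre-trained model and train only a new classifier head on the added classes. The backbone and class counts below are placeholders.

```python
import torch
import torch.nn as nn

backbone = nn.Sequential(                  # stands in for a pre-trained CNN
    nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
)
for p in backbone.parameters():
    p.requires_grad = False                # keep the learnt features fixed

old_head = nn.Linear(32, 10)               # classifier for the old classes
new_head = nn.Linear(32, 5)                # only this is trained on new classes

opt = torch.optim.Adam(new_head.parameters(), lr=1e-3)
x = torch.randn(8, 3, 32, 32)              # batch of new-class samples
y = torch.randint(0, 5, (8,))
with torch.no_grad():
    feats = backbone(x)                    # no re-training of the backbone
loss = nn.functional.cross_entropy(new_head(feats), y)
opt.zero_grad(); loss.backward(); opt.step()

with torch.no_grad():                      # inference over all 15 classes
    all_logits = torch.cat([old_head(feats), new_head(feats)], dim=1)
```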

LesionSeg: Semantic segmentation of skin lesions using Deep Convolutional Neural Network [article]

Dhanesh Ramachandram, Terrance DeVries
2017 arXiv   pre-print
Our approach is based on a Fully Convolutional Network architecture which is trained end-to-end, from scratch, on a limited dataset.  ...  receptive field without increasing the number of parameters, (ii) the use of network-in-network 1×1 convolution layers to add capacity to the network and (iii) state-of-art super-resolution upsampling  ...  A convolutional neural network can be adapted to perform semantic segmentation by converting the top layer of a classification network into a convolutional layer.  ... 
arXiv:1703.03372v3 fatcat:bo342dmijjfxlbfi74tx5zvo2i
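
Two of the listed ingredients, dilated convolutions that enlarge the receptive field without adding parameters and the conversion of the top classification layer into a convolution, can be sketched as follows; layer sizes are illustrative, not the paper's.

```python
import torch
import torch.nn as nn

# Dilation enlarges the receptive field with the same 3x3 kernels,
# i.e. without increasing the number of parameters.
features = nn.Sequential(
    nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
    nn.Conv2d(64, 64, 3, padding=2, dilation=2), nn.ReLU(),
)
# Fully convolutional head: a 1x1 convolution replaces the dense top
# layer, so the network emits a per-pixel class map for any input size.
head = nn.Conv2d(64, 2, kernel_size=1)     # 2 classes: lesion / background

x = torch.randn(1, 3, 192, 256)
seg_logits = head(features(x))             # shape (1, 2, 192, 256)
```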

Class-Incremental Learning Based on Feature Extraction of CNN with Optimized Softmax and One-Class Classifiers

Xin Ye, Qiuyu Zhu
2019 IEEE Access  
With the development of deep convolutional neural networks in recent years, network structures have become more and more complicated and varied, achieving very good results in pattern recognition  ...  This paper proposes an incremental learning algorithm based on a convolutional neural network and support vector data description.  ...  These phantom samples are used to train new deep networks together with incremental samples, achieving better class-incremental training results.  ... 
doi:10.1109/access.2019.2904614 fatcat:egnckzbxuvgz5cz5ntp2sl2phu
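
A hedged sketch of the feature-extraction-plus-one-class-classifier recipe: scikit-learn's OneClassSVM stands in for support vector data description (SVDD), and random vectors stand in for CNN features. Adding a class then means fitting one more one-class model rather than re-training the network.

```python
import numpy as np
from sklearn.svm import OneClassSVM

# Assume a CNN feature extractor; random features stand in here.
rng = np.random.default_rng(0)
features_per_class = {0: rng.normal(0, 1, (50, 64)),
                      1: rng.normal(3, 1, (50, 64))}

# One one-class model per class; a new class means fitting one more model.
models = {c: OneClassSVM(gamma="scale", nu=0.1).fit(f)
          for c, f in features_per_class.items()}

def classify(feat):
    # The class whose boundary gives the highest decision score wins.
    scores = {c: m.decision_function(feat.reshape(1, -1))[0]
              for c, m in models.items()}
    return max(scores, key=scores.get)

print(classify(rng.normal(3, 1, 64)))      # likely class 1
```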

DeepObfuscation: Securing the Structure of Convolutional Neural Networks via Knowledge Distillation [article]

Hui Xu, Yuxin Su, Zirui Zhao, Yangfan Zhou, Michael R. Lyu, Irwin King
2018 arXiv   pre-print
In particular, we focus on obfuscating convolutional neural networks (CNNs), a widely employed type of deep learning architecture for image recognition.  ...  Although these networks are very deep, with tens or hundreds of layers, we can simulate them with a shallow network of only five or seven convolutional layers.  ...  Our study focuses on a prevalent type of deep learning network, convolutional neural networks (CNNs).  ... 
arXiv:1806.10313v1 fatcat:juzxi4y27rhvvmdesie2zfnbwq
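
A generic distillation sketch of the simulation idea: a shallow student is trained to reproduce the outputs of a deep teacher feature extractor on unlabeled inputs. The MSE objective and the toy architectures are assumptions; the paper's exact losses and layer counts may differ.

```python
import torch
import torch.nn as nn

teacher = nn.Sequential(                   # stands in for a deep extractor
    *[nn.Sequential(nn.Conv2d(16, 16, 3, padding=1), nn.ReLU())
      for _ in range(20)]
)
student = nn.Sequential(                   # shallow simulation network
    nn.Conv2d(16, 16, 5, padding=2), nn.ReLU(),
    nn.Conv2d(16, 16, 5, padding=2), nn.ReLU(),
)

opt = torch.optim.Adam(student.parameters(), lr=1e-3)
x = torch.randn(4, 16, 32, 32)             # unlabeled stimulus batch
with torch.no_grad():
    target = teacher(x)                    # teacher outputs act as labels
loss = nn.functional.mse_loss(student(x), target)
opt.zero_grad(); loss.backward(); opt.step()
```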

Incremental Learning of Multi-tasking Networks for Aesthetic Radar Map Prediction

Xin Jin, Xinghui Zhou, Xiaodong Li, Xiaokun Zhang, Hongbo Sun, Xiqiao Li, Ruijun Liu
2019 IEEE Access  
Recent work shows that deep convolutional neural networks can be used to extract image features and evaluate the total score of an image, but such evaluation methods lack sufficient detailed  ...  In this paper, we propose a multi-task convolutional neural network with more incremental features. We show the results in the form of a hexagon map, called an aesthetic radar map.  ...  quality assessment of images is not a hot topic in the field of computer vision, but it is in photography. Our multi-task regression model is a deep convolutional neural network with sigmoid outputs.  ... 
doi:10.1109/access.2019.2958119 fatcat:vrqmh577ivhinimnebcflctqf4
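
A minimal sketch of a multi-task CNN whose sigmoid head emits one bounded score per aesthetic attribute, as a hexagonal radar map with six axes would require; the backbone and the attribute count are placeholders, not the paper's model.

```python
import torch
import torch.nn as nn

class MultiTaskAesthetic(nn.Module):
    """Shared backbone with one sigmoid-scaled regression output per attribute."""
    def __init__(self, num_attributes=6):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(32, num_attributes)

    def forward(self, x):
        # Sigmoid bounds every attribute score to (0, 1) for the radar map.
        return torch.sigmoid(self.head(self.backbone(x)))

scores = MultiTaskAesthetic()(torch.randn(1, 3, 224, 224))
print(scores.shape)                        # torch.Size([1, 6])
```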
Showing results 1 — 15 out of 18,577 results