
Shallow Neural Network can Perfectly Classify an Object following Separable Probability Distribution [article]

Youngjae Min, Hye Won Chung
2019 arXiv   pre-print
This paper constructs shallow sigmoid-type neural networks that achieve 100% classification accuracy on datasets satisfying a linear separability condition.  ...  Moreover, the constructed neural network guarantees perfect classification for any dataset sampled from a separable probability distribution.  ...  For the broader class of datasets, we design a 4-layer neural network that can perfectly classify any dataset satisfying the generalized separability condition. Definition 3.1: Let X ⊂ ℝ^d and Y = [1 : c].  ... 
arXiv:1904.09109v1 fatcat:oltdaw3b3bd6tixmwowvh5sx4a
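The first result's guarantee rests on linear separability. As a minimal illustration (the classic perceptron, not the paper's sigmoid-network construction), any linearly separable sample admits a classifier with 100% training accuracy; the toy data below are invented for this sketch:

```python
import numpy as np

def train_perceptron(X, y, max_epochs=1000):
    """Classic perceptron; y in {-1, +1}. The perceptron convergence theorem
    guarantees zero training error when the data are linearly separable."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(max_epochs):
        mistakes = 0
        for xi, yi in zip(X, y):
            if yi * (xi @ w + b) <= 0:   # misclassified or on the boundary
                w += yi * xi
                b += yi
                mistakes += 1
        if mistakes == 0:                # perfect separation reached
            break
    return w, b

# Toy separable data: label +1 iff x1 + x2 > 1, with an enforced margin
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 2.0, size=(400, 2))
X = X[np.abs(X.sum(axis=1) - 1.0) > 0.2][:200]   # keep a margin of 0.2
y = np.where(X.sum(axis=1) > 1.0, 1, -1)
w, b = train_perceptron(X, y)
accuracy = float(np.mean(np.sign(X @ w + b) == y))
```

On separable data `accuracy` comes out exactly 1.0; the paper's contribution is extending this kind of guarantee from finite samples to whole separable distributions.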

How benign is benign overfitting? [article]

Amartya Sanyal, Puneet K Dokania, Varun Kanade, Philip H.S. Torr
2020 arXiv   pre-print
We investigate two causes of adversarial vulnerability in deep neural networks: bad data and (poorly) trained models.  ...  Standard training procedures bias neural networks towards learning "simple" classification boundaries, which may be less robust than more complex ones.  ...  We train three different neural networks with ReLU activations: a shallow network (Shallow NN) with 2 layers and 100 neurons in each layer, a shallow network with 2 layers and 1000 neurons in each layer  ... 
arXiv:2007.04028v1 fatcat:jib6hqwpa5cn5ngj7xbejjsq6q

Efficient Antihydrogen Detection in Antimatter Physics by Deep Learning [article]

Peter Sadowski, Balint Radics, Ananya, Yasunori Yamazaki, Pierre Baldi
2017 arXiv   pre-print
the fundamental CPT symmetry and antigravity effects require the efficient detection of antihydrogen annihilation events, which is performed using highly granular tracking detectors installed around an  ...  In this work, we use a simple neural network model with a single input R, one hidden layer of 100 hyperbolic tangent units, and a logistic output unit that returns a probability of an event being an an  ...  entropy error so that the output can be interpreted as a probability.  ... 
arXiv:1706.01826v1 fatcat:2r5fone4zjg7rpqxajs2czdubu
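The architecture quoted in the snippet (a single input R, one hidden layer of 100 tanh units, and a logistic output trained on the cross-entropy) can be sketched in plain NumPy. The signal/background distributions below are invented for illustration and are not the experiment's data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for the single input R: "signal" events sit at larger R
# than "background" (illustrative only).
x = np.concatenate([rng.normal(2.5, 0.5, 500),      # signal
                    rng.normal(0.5, 0.5, 500)])[:, None]
y = np.concatenate([np.ones(500), np.zeros(500)])
x = (x - x.mean()) / x.std()                        # standardize the input

# One hidden layer of 100 tanh units and a logistic output, trained by
# full-batch gradient descent on the cross-entropy so that the output
# can be read as a probability.
W1 = rng.normal(0.0, 1.0, (1, 100)); b1 = np.zeros(100)
W2 = rng.normal(0.0, 0.1, (100, 1)); b2 = np.zeros(1)

def forward(x):
    h = np.tanh(x @ W1 + b1)
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))        # logistic output
    return h, p.ravel()

lr = 0.1
for _ in range(1000):
    h, p = forward(x)
    g = (p - y)[:, None] / len(y)       # dL/dlogit for the cross-entropy
    gh = (g @ W2.T) * (1.0 - h**2)      # backprop through the tanh layer
    W2 -= lr * (h.T @ g); b2 -= lr * g.sum(0)
    W1 -= lr * (x.T @ gh); b1 -= lr * gh.sum(0)

_, p = forward(x)
accuracy = float(np.mean((p > 0.5) == y))
```

Because the output unit is logistic and the loss is the cross-entropy, the trained output is directly interpretable as a per-event probability, which is the point the snippet makes.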

Dynamic Representations Toward Efficient Inference on Deep Neural Networks by Decision Gates [article]

Mohammad Saeed Shafiee, Mohammad Javad Shafiee, Alexander Wong
2019 arXiv   pre-print
The proposed d-gate modules can be integrated with any deep neural network and reduce its average computational cost while maintaining modeling accuracy.  ...  While deep neural networks extract rich features from the input data, the current trade-off between depth and computational cost makes it difficult to adopt deep neural networks for many industrial applications  ...  hyperplane in scenarios where the data is not perfectly separable.  ... 
arXiv:1811.01476v4 fatcat:i2uxejdo2ndkhnbeqhuhbogjbe
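The decision-gate idea can be illustrated schematically: a cheap classifier attached to an intermediate layer lets confident inputs exit early and skip the deeper (more expensive) layers. All names, weights, and the threshold `tau` below are illustrative, not the paper's implementation:

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

class EarlyExitNet:
    """Toy network with one d-gate-style early exit (random weights)."""
    def __init__(self, rng, d=8, h=16, c=3):
        self.W1 = rng.normal(0, 0.3, (d, h))
        self.Wg = rng.normal(0, 0.3, (h, c))   # cheap gate classifier
        self.W2 = rng.normal(0, 0.3, (h, h))   # "expensive" deeper layer
        self.W3 = rng.normal(0, 0.3, (h, c))   # final classifier

    def predict(self, x, tau=0.8):
        h1 = relu(x @ self.W1)
        logits = h1 @ self.Wg
        p = np.exp(logits - logits.max())
        p /= p.sum()                           # softmax confidence at the gate
        if p.max() >= tau:                     # confident: exit early
            return int(p.argmax()), "early"
        h2 = relu(h1 @ self.W2)                # otherwise run the full depth
        return int((h2 @ self.W3).argmax()), "full"

rng = np.random.default_rng(0)
net = EarlyExitNet(rng)
x_demo = rng.normal(size=8)
pred, path = net.predict(x_demo, tau=0.9)
```

Lowering `tau` routes more inputs through the early exit, trading accuracy for average compute, which is the trade-off the d-gate modules manage.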

Gas chimney detection based on improving the performance of combined multilayer perceptron and support vector classifier

H. Hashemi, D. M. J. Tax, R. P. W. Duin, A. Javaherian, P. de Groot
2008 Nonlinear Processes in Geophysics  
In this paper, we propose a method for finding an optimal classifier with the help of a statistical feature-ranking technique and by combining different classifiers.  ...  Seismic object detection is a relatively new field in which 3-D bodies are visualized and spatial relationships between objects of different origins are studied in order to extract geologic information  ...  The minimum combining rule is a good choice because it preserves the soft (probabilistic) outputs of a neural network in an appropriate manner.  ... 
doi:10.5194/npg-15-863-2008 fatcat:5gkepnnyrvc2jovwlnqcvleyvq
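The minimum combining rule mentioned in the snippet takes, for each class, the smallest posterior any base classifier assigns, then predicts the class with the largest of these minima: a conservative fusion that keeps the classifiers' soft outputs. A small sketch (the MLP/SVC posteriors below are invented for illustration):

```python
import numpy as np

def min_rule(posteriors):
    """posteriors: (n_classifiers, n_classes) rows of class probabilities.
    Returns the fused distribution and the predicted class index."""
    P = np.asarray(posteriors, dtype=float)
    fused = P.min(axis=0)            # per-class minimum across classifiers
    fused = fused / fused.sum()      # renormalize to a distribution
    return fused, int(fused.argmax())

# e.g. an MLP and an SVC posterior for classes (chimney, non-chimney):
mlp = [0.7, 0.3]
svc = [0.6, 0.4]
fused, label = min_rule([mlp, svc])
```

A class wins under the min rule only if *every* base classifier supports it, which is why the rule behaves conservatively in classifier combination.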

Shallow Transits—Deep Learning. II. Identify Individual Exoplanetary Transits in Red Noise using Deep Learning

Elad Dvash, Yam Peleg, Shay Zucker, Raja Giryes
2022 Astronomical Journal  
neural networks.  ...  In a previous paper, we introduced a deep learning neural network that should be able to detect the existence of very shallow periodic planetary transits in the presence of red noise.  ...  Blue: after training the classifier separately from the segmentation network; red: after training the classifier together with the segmentation network.  ... 
doi:10.3847/1538-3881/ac5ea2 fatcat:wz4bpchqr5at5khinxiqqlptzq

Improvement of the Deep Forest Classifier by a Set of Neural Networks

Lev V. Utkin, Kirill D. Zhuk
2020 Informatica (Ljubljana, Tiskana izd.)  
The networks are trained in accordance with a loss function that measures the classification error. Every neural network can be viewed as a non-linear function of the class probabilities.  ...  The main idea underlying NeuRF is to combine the class probability distributions produced by decision trees by means of a set of neural networks with shared parameters.  ...  shallow neural networks with shared weights.  ... 
doi:10.31449/inf.v44i1.2740 fatcat:wkamdqbys5cltilic5csn4pkcy

Deep Learning and Its Application to LHC Physics

Dan Guest, Kyle Cranmer, Daniel Whiteson
2018 Annual Review of Nuclear and Particle Science  
The connections between machine learning and high energy physics data analysis are explored, followed by an introduction to the core concepts of neural networks, examples of the key results demonstrating  ...  Machine learning has played an important role in the analysis of high-energy physics data for decades.  ...  However, an effective shallow network may require an enormous number of nodes in the hidden layer, and in practice, shallow neural networks often failed to discover useful functions from high-dimensional  ... 
doi:10.1146/annurev-nucl-101917-021019 fatcat:4ll2ex624jcutgimi5w7wya2bq

Frame‐by‐frame annotation of video recordings using deep neural networks

Alexander M. Conway, Ian N. Durbach, Alistair McInnes, Robert N. Harris
2021 Ecosphere  
We demonstrate an approach that combines a standard CNN summarizing each video frame with a recurrent neural network (RNN) that models the temporal component of video.  ...  Trinh et al. (2016) combined neural network architectures to detect birds flying into wind turbines from sequences of input frames, and Beery et al. (2020) combined an object detection model with two  ... 
doi:10.1002/ecs2.3384 fatcat:vsjiuk53hndmxlp54wrwfdmymq
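The CNN-plus-RNN pattern described in the snippet can be sketched with a stand-in feature extractor feeding a simple Elman-style recurrence across frames. Weights are random and purely illustrative; `cnn_features` is a placeholder for a real CNN backbone:

```python
import numpy as np

rng = np.random.default_rng(0)
feat_dim, hid = 32, 16

def cnn_features(frame):
    """Stand-in for a CNN backbone: any map from frame to feature vector."""
    return np.tanh(frame.reshape(-1)[:feat_dim])

Wx = rng.normal(0, 0.2, (feat_dim, hid))   # input-to-hidden
Wh = rng.normal(0, 0.2, (hid, hid))        # hidden-to-hidden (temporal)
Wo = rng.normal(0, 0.2, (hid, 2))          # 2 labels, e.g. animal present/absent

def annotate(video):
    """video: (n_frames, H, W); returns one label per frame."""
    h = np.zeros(hid)
    labels = []
    for frame in video:
        h = np.tanh(cnn_features(frame) @ Wx + h @ Wh)  # carry temporal context
        labels.append(int((h @ Wo).argmax()))
    return labels

video = rng.normal(size=(5, 8, 8))         # five tiny fake frames
labels = annotate(video)
```

The recurrence is what distinguishes this from frame-independent classification: each frame's label can depend on what the network saw in earlier frames.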

Meta Learning for Few-Shot One-class Classification [article]

Gabriel Dahia, Maurício Pamplona Segundo
2020 arXiv   pre-print
We show how the Support Vector Data Description method can be used with our method, and also propose a simpler variant based on Prototypical Networks that obtains comparable performance, indicating that  ...  We propose a method that can perform one-class classification given only a small number of examples from the target class and none from the others.  ...  Perfectly learning f_θ in the meta-training stage would map any input distribution into a space that can be correctly classified by SVDD, and would therefore not depend on the given data X nor on what  ... 
arXiv:2009.05353v2 fatcat:u2fea5hwxjggjagzauikbv3lfq
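At its core, the Prototypical-Networks-style variant mentioned in the snippet scores a query by its distance to the mean embedding (prototype) of the few target-class examples. A sketch with a placeholder identity embedding standing in for the learned f_θ:

```python
import numpy as np

def one_class_score(support, query, embed=lambda z: z):
    """Higher score = more target-like. `embed` is a stand-in for a learned
    embedding f_theta; the identity is used here purely for illustration."""
    proto = np.mean([embed(s) for s in support], axis=0)   # class prototype
    return -float(np.linalg.norm(embed(query) - proto))    # negative distance

# Three support examples from the target class, then two queries:
support = [np.array([1.0, 1.0]), np.array([1.2, 0.8]), np.array([0.9, 1.1])]
in_score = one_class_score(support, np.array([1.0, 1.0]))    # near the class
out_score = one_class_score(support, np.array([5.0, -3.0]))  # far from it
```

Thresholding this score gives a one-class decision; the meta-learning in the paper is about training f_θ so that such distances are meaningful for unseen classes.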

A comparison of classification techniques to support land cover and land use analysis in tropical coastal zones

Brian W. Szuster, Qi Chen, Michael Borger
2011 Applied Geography  
The overall and individual classification results of this approach were compared to those of the maximum likelihood classifier and the artificial neural network techniques.  ...  The medium-resolution ASTER image also proved highly suited to classifying coastal landscapes with this mix of land cover types.  ...  Artificial neural networks (ANN) are a more recent non-parametric classification technique (Lu & Weng, 2007), which does not depend upon an assumption of normally distributed data (Dixon & Candade, 2008)  ... 
doi:10.1016/j.apgeog.2010.11.007 fatcat:fh6heeyabbg6pm3cn5wt23jasi

Do Deep Nets Really Need to be Deep? [article]

Lei Jimmy Ba, Rich Caruana
2014 arXiv   pre-print
Our success in training shallow neural nets to mimic deeper models suggests that there probably exist better algorithms for training shallow feed-forward nets than those currently available.  ...  Moreover, in some cases the shallow neural nets can learn these deep functions using a total number of parameters similar to the original deep model.  ...  We follow the same approach as with TIMIT: An ensemble of deep CNN models is used to label CIFAR-10 images for model compression.  ... 
arXiv:1312.6184v7 fatcat:3kxn34qs5zc4vmrhiun3trib7i

Image fusion using symmetric skip autoencoder via an Adversarial Regulariser [article]

Snigdha Bhagat, S. D. Joshi, Brejesh Lall
2020 arXiv   pre-print
In order to efficiently optimize the parameters of the network, we propose an adversarial regulariser network that performs supervised learning on the fused image and the original visual image.  ...  The residual module serves as the primary building block for the encoder, decoder and adversarial network; as an add-on, the symmetric skip connections embed the spatial characteristics  ...  This is where neural networks come in handy, since they learn a function that can model the data distribution.  ... 
arXiv:2005.00447v2 fatcat:3yynx7fjn5aktkuyxamyss7wpy

Evaluating model calibration in classification [article]

Juozas Vaicenavicius, David Widmann, Carl Andersson, Fredrik Lindsten, Jacob Roll, Thomas B. Schön
2019 arXiv   pre-print
In safety-critical applications, it is pivotal for a model to possess an adequate sense of uncertainty, which for probabilistic classifiers translates into outputting probability distributions that are  ...  Probabilistic classifiers output a probability distribution on target classes rather than just a class prediction.  ...  Niculescu-Mizil and Caruana (2005) used classical reliability diagrams (Murphy and Winkler 1977; Murphy and Winkler 1987) to investigate calibration of shallow neural networks in a binary classification  ... 
arXiv:1902.06977v1 fatcat:4vxwtxolwvcynpqoci5v6pzsiu
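The classical reliability diagram referenced in the snippet bins predicted probabilities and compares the mean confidence to the empirical accuracy in each bin; the bin-weighted gap is the expected calibration error (ECE). A small sketch on a synthetic, perfectly calibrated binary predictor:

```python
import numpy as np

def reliability(probs, labels, n_bins=10):
    """Binned statistics behind a reliability diagram, plus the ECE."""
    probs = np.asarray(probs, dtype=float)
    labels = np.asarray(labels, dtype=float)
    bins = np.clip((probs * n_bins).astype(int), 0, n_bins - 1)
    conf, acc, ece = [], [], 0.0
    for b in range(n_bins):
        mask = bins == b
        if mask.any():
            c, a = probs[mask].mean(), labels[mask].mean()
            conf.append(c); acc.append(a)
            ece += mask.mean() * abs(c - a)   # bin weight times gap
    return conf, acc, ece

# Perfectly calibrated toy predictor: P(y=1) equals the stated probability
rng = np.random.default_rng(0)
p = rng.uniform(size=10000)
y = (rng.uniform(size=10000) < p).astype(int)
conf, acc, ece = reliability(p, y)
```

For a calibrated model `conf` and `acc` track each other bin by bin and the ECE is small; a miscalibrated model shows a systematic gap. The paper's concern is that such binned estimates are themselves statistical quantities that need careful evaluation.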

Statistical Learning in Medical Research with Decision Threshold and Accuracy Evaluation

Sumaiya Z. Sande, Loraine Seng, Jialiang Li, Ralph D'Agostino
2021 Journal of Data Science  
Deep learning neural networks, such as the multilayer perceptron (MLP) and the convolutional neural network (CNN), have been incorporated in medical diagnosis and prognosis for better health care practice.  ...  When the data is complicated and unstructured, shallow learning methods may not be suitable or feasible.  ...  Deep Learning (DL) An Artificial Neural Network (ANN; Schalkoff, 1997) is a computational model that is inspired by the way neural networks in the human brain process information.  ... 
doi:10.6339/21-jds1022 fatcat:kfdcxax6tvaozgkxf74c6adiha
Showing results 1–15 of 3,021 results