16,224 Hits in 5.8 sec

On Functional Test Generation for Deep Neural Network IPs [article]

Bo Luo, Yu Li, Lingxiao Wei, Qiang Xu
2019 arXiv   pre-print
Machine learning systems based on deep neural networks (DNNs) produce state-of-the-art results in many applications.  ...  Needless to say, it is essential for DNN IP vendors to provide test cases for functional validation without leaking their parameters to IP users.  ...  Artificial intelligence (AI) systems based on deep neural networks (DNNs) have achieved great success in many areas such as computer vision, speech recognition, and natural language processing  ... 
arXiv:1911.11550v1 fatcat:okdm5ch4s5a7pnsgyz3w5e37mq

Bootstrapping Deep Neural Networks from Approximate Image Processing Pipelines [article]

Kilho Son and Jesse Hostetler and Sek Chai
2019 arXiv   pre-print
We intend to replace parts or all of a target pipeline with deep neural networks to achieve benefits such as increased accuracy or reduced computational requirements.  ...  We show experimentally that despite the noise introduced by automated labeling and only using a very small initially labeled data set, the trained deep neural networks can achieve similar or even better  ...  Figure 4 shows the performance of the pipeline and the deep neural networks on test data as a function of computational requirement. The pipeline achieves 86% accuracy on test data.  ... 
arXiv:1811.12108v2 fatcat:hbxww2u2szc7xeipfyzrv5zp3y

Deep-Learning-Based Wi-Fi Indoor Positioning System Using Continuous CSI of Trajectories

Zhongfeng Zhang, Minjae Lee, Seungwon Choi
2021 Sensors  
To fully exploit the trajectory CSI's spatial and temporal information, the proposed IPS employs a deep learning network of a one-dimensional convolutional neural network-long short-term memory (1DCNN-LSTM  ...  We verified that the proposed IPS based on the trajectory CSI far outperforms the state-of-the-art IPS based on the CSI collected from stationary locations through extensive experimental tests and computer  ...  Acknowledgments: The authors would like to thank the reviewers of this paper for their constructive comments. Conflicts of Interest: The authors declare no conflict of interest.  ... 
doi:10.3390/s21175776 pmid:34502668 pmcid:PMC8434353 fatcat:nfbrr2wviff2bns5lsiqmws3wm

SafetyNets: Verifiable Execution of Deep Neural Networks on an Untrusted Cloud [article]

Zahra Ghodsi, Tianyu Gu, Siddharth Garg
2017 arXiv   pre-print
Our empirical results on three- and four-layer deep neural networks demonstrate that the run-time costs of SafetyNets are low for both the client and the server.  ...  Specifically, SafetyNets develops and implements a specialized interactive proof (IP) protocol for verifiable execution of a class of deep neural networks, i.e., those that can be represented as arithmetic  ...  deep neural networks that we develop in this paper.  ... 
arXiv:1706.10268v1 fatcat:3livqjsxqrdwphadwyq4rqo2ue

DeepSigns: A Generic Watermarking Framework for IP Protection of Deep Learning Models [article]

Bita Darvish Rouhani and Huili Chen and Farinaz Koushanfar
2018 arXiv   pre-print
Proof-of-concept evaluations on the MNIST and CIFAR10 datasets, as well as a wide variety of neural network architectures including Wide Residual Networks, Convolutional Neural Networks, and Multi-Layer Perceptrons  ...  DeepSigns, for the first time, introduces a generic watermarking methodology that can be used for protecting DL owners' IP rights in both white-box and black-box settings, where the adversary may or may  ...  We focus, in particular, on classification tasks using deep neural networks.  ... 
arXiv:1804.00750v2 fatcat:2n4gb6gt2zenlbhoaaa2tvtesq

A Novel Hybrid Deep Learning Model for Sugar Price Forecasting Based on Time Series Decomposition

Jinlai Zhang, Yanmei Meng, Jin Wei, Jie Chen, Johnny Qin, Francesco Lolli
2021 Mathematical Problems in Engineering  
Tree of Parzen Estimators (TPEs) for sugar price forecasting.  ...  Sugar price forecasting has attracted extensive attention from policymakers due to its significant impact on people's daily lives and markets.  ...  function is a function that runs on the neurons of the neural network and is responsible for mapping the input of the neuron to the output [36] .  ... 
doi:10.1155/2021/6507688 fatcat:z2z6bfsdmvcfffhxcewwzinehm

Information flows of diverse autoencoders [article]

Sungyeop Lee, Junghyo Jo
2021 arXiv   pre-print
These types of autoencoders exhibited perfect generalization ability for test data without requiring the compression phase.  ...  However, we found that the compression phase is not universally observed in different species of autoencoders, including variational autoencoders, that have special constraints on network weights or manifold  ...  We used the same deep network structure as the deep AE and VAE for the LAE. The deep LAE can also learn the training data of MNIST and generalize to reproduce the test data as well (figure 4(d)).  ... 
arXiv:2102.07402v2 fatcat:bhfae3zfm5fwbgpdflz7nucl74

Deep Low-Density Separation for Semi-supervised Classification [chapter]

Michael C. Burkhart, Kyle Shan
2020 Lecture Notes in Computer Science  
We describe it in detail and discuss why low-density separation may be better suited for ssl on neural network-based embeddings than graph-based algorithms.  ...  For complex and high-dimensional data, neural networks can learn feature embeddings to which traditional ssl methods can then be applied in what we call hybrid methods.  ...  [20] described using a deep neural network to embed features for Gaussian process regression, though they use a probabilistic framework for ssl and consider a completely different objective function  ... 
doi:10.1007/978-3-030-50420-5_22 fatcat:xdvl6f2qgbfindvqqldt6aycse

Biological batch normalisation: How intrinsic plasticity improves learning in deep neural networks

Nolan Peter Shaw, Tyler Jackson, Jeff Orchard
2020 PLoS ONE  
We demonstrate that the IP rule improves learning in deep networks, and provides networks with considerable robustness to increases in synaptic learning rates.  ...  An analysis demonstrates that the IP rule results in neuronal information potential similar to that of Infomax, when tested on a fixed input distribution.  ...  We then test the impact of our intrinsic rule, which we dub "IP", on learning in deep neural networks.  ... 
doi:10.1371/journal.pone.0238454 pmid:32966302 pmcid:PMC7511202 fatcat:ausvf2mvobh3hiwf2f5veymbca

Variational Implicit Processes [article]

Chao Ma, Yingzhen Li, José Miguel Hernández-Lobato
2019 arXiv   pre-print
IPs are therefore highly flexible implicit priors over functions, with examples including data simulators, Bayesian neural networks, and non-linear transformations of stochastic processes.  ...  Experiments show that VIPs return better uncertainty estimates and lower errors than existing inference methods for challenging models such as Bayesian neural networks and Gaussian processes.  ...  Acknowledgements We thank Jiri Hron, Rich Turner and Cheng Zhang for discussions. Chao Ma thanks a Microsoft Research donation for supporting his research.  ... 
arXiv:1806.02390v2 fatcat:t3yn25i3frff3plo7mjbgfnd4u

GADAM: Genetic-Evolutionary ADAM for Deep Neural Network Optimization [article]

Jiawei Zhang, Fisher B. Gouza
2019 arXiv   pre-print
GADAM learns deep neural network models from a population of unit models, generation by generation: it trains the unit models with Adam and evolves them into new generations with a genetic algorithm.  ...  Deep neural network learning can be formulated as a non-convex optimization problem. Existing optimization algorithms, e.g., Adam, can learn the models fast but may easily get stuck in local optima.  ...  neural networks with non-convex objective functions.  ... 
arXiv:1805.07500v2 fatcat:5cgcjt47dzffxp33i5gw2czzae

Bioluminescence Tomography Based on One-Dimensional Convolutional Neural Networks

Jingjing Yu, Chenyang Dai, Xuelei He, Hongbo Guo, Siyu Sun, Ying Liu
2021 Frontiers in Oncology  
In order to improve the positioning accuracy and reconstruction efficiency, this paper presents a deep-learning optical reconstruction method based on one-dimensional convolutional neural networks (1DCNN).  ...  In this study, a deep-learning method based on one-dimensional convolutional neural networks (1DCNN) is proposed for BLT.  ... 
doi:10.3389/fonc.2021.760689 pmid:34733793 pmcid:PMC8558399 fatcat:q4vohjsw5jezbox43bzjxjrbbq

A Discrete-event-based Simulator for Deep Learning at Edge [article]

Xiaoyan Liu, Zhiwei Xu, Yana Qin, Jie Tian
2021 arXiv   pre-print
Specifically, it enables simulations as an environment for deep learning. Our framework is generic and can be used for various deep learning problems before the deep learning model is deployed.  ...  Novel smart environments, such as smart homes, smart cities, and intelligent transportation, are driving increasing interest in deploying deep neural networks (DNN) at edge devices.  ...  The Testing module covers the testing process and contains tools for testing in other learning tasks. The learning module is used for deep learning on the edge network.  ... 
arXiv:2112.00952v1 fatcat:2hv6bthezvhvpj2omaamhlqudy

Digital Passport: A Novel Technological Strategy for Intellectual Property Protection of Convolutional Neural Networks [article]

Lixin Fan and KamWoh Ng and Chee Seng Chan
2019 arXiv   pre-print
In order to prevent deep neural networks from being infringed by unauthorized parties, we propose a generic solution which embeds a designated digital passport into a network and, subsequently, either paralyzes the network functionalities for unauthorized usage or maintains its functionalities in the presence of a verified passport.  ...  We believe this paper puts forward a new research direction for the study of deep neural network IP protection, which is urgently needed.  ... 
arXiv:1905.04368v1 fatcat:fp24tjqsg5btthbjdtnwgpydva

An equation-of-state-meter of QCD transition from deep learning [article]

Long-Gang Pang, Kai Zhou, Nan Su, Hannah Petersen, Horst Stöcker and Xin-Nian Wang
2017 arXiv   pre-print
Supervised learning with a deep convolutional neural network is used to identify the QCD equation of state (EoS) employed in relativistic hydrodynamic simulations of heavy-ion collisions from the simulated  ...  High-level correlations of $\rho(p_T,\Phi)$ learned by the neural network act as an effective "EoS-meter" in detecting the nature of the QCD transition.  ...  For testing on IEBE-VISHNU + MC-Glauber (testing GROUP 1) and CLVisc + IP-Glasma (testing GROUP 2), the prediction loss function $l(\theta)$ is the difference between the true value y (from the input  ... 
arXiv:1612.04262v3 fatcat:lwcdvhabtfgq3mxta73ghr767e
Showing results 1 — 15 out of 16,224 results