
Boolean Modeling of Neural Systems with Point-Process Inputs and Outputs. Part I: Theory and Simulations

Vasilis Z. Marmarelis, Theodoros P. Zanos, Theodore W. Berger
2009 Annals of Biomedical Engineering  
This paper presents a new modeling approach for neural systems with point-process (spike) inputs and outputs that utilizes Boolean operators (i.e. modulo 2 multiplication and addition that correspond to  ...  represent nonlinear interactions among the various lagged values of each input point-process or among lagged values of various inputs (if multiple inputs exist) as they reflect on the output.  ...  This paper introduces a new method of modeling neural systems with point-process inputs and outputs that utilizes Boolean operations (logical AND and OR operations corresponding to modulo-2 multiplication  ... 
doi:10.1007/s10439-009-9736-8 pmid:19517238 pmcid:PMC2917726 fatcat:td66bverevbrlkcjgjh6bwpxse
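As a quick orientation to the arithmetic this abstract refers to, the sketch below applies modulo-2 multiplication and addition to toy binary spike indicators; on 0/1 values, modulo-2 multiplication coincides with logical AND and modulo-2 addition with exclusive-OR. The spike trains and variable names are invented for illustration and are not the paper's Boolean-Volterra estimator.

    # Toy binary spike indicators (hypothetical lagged values of two point-process inputs).
    x = [1, 0, 1, 1, 0, 1]
    y = [1, 1, 0, 1, 0, 0]

    # Modulo-2 multiplication of 0/1 values coincides with logical AND.
    mod2_product = [(a * b) % 2 for a, b in zip(x, y)]
    # Modulo-2 addition of 0/1 values coincides with exclusive-OR (XOR).
    mod2_sum = [(a + b) % 2 for a, b in zip(x, y)]

    print(mod2_product)  # [1, 0, 0, 1, 0, 0]
    print(mod2_sum)      # [0, 1, 1, 0, 0, 1]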

Boolean Modeling of Neural Systems with Point-Process Inputs and Outputs. Part II: Application to the Rat Hippocampus

Theodoros P. Zanos, Robert E. Hampson, Samuel E. Deadwyler, Theodore W. Berger, Vasilis Z. Marmarelis
2009 Annals of Biomedical Engineering  
This paper presents a pilot application of the Boolean-Volterra modeling methodology presented in the companion paper (Part I) that is suitable for the analysis of systems with point-process inputs and  ...  Summary of modeling performance for the 13 one-input, one-output cases analyzed (table columns: Cases/BE, MFR input (sp/s), MFR output (sp/s), r-value, data length (s), TP (%)/FP (%)).  ...  This paper presents the results of a pilot application of a new method of modeling neural systems with point-process inputs and outputs that utilizes Boolean operations (logical AND and OR operations).  ... 
doi:10.1007/s10439-009-9716-z pmid:19499341 pmcid:PMC2917724 fatcat:myypr5idy5gwhenuutwttonyde

Quantum Cybernetics and Complex Quantum Systems Science: A Quantum Connectionist Exploration

Carlos Pedro Gonçalves
2015 NeuroQuantology  
In particular, quantum feedforward neural networks are worked upon so that major points such as the problem of computation of Boolean functions and neural network computational complexity, multiple layers  ...  Quantum cybernetics and its connections to complex quantum systems science is addressed from the perspective of quantum artificial neural networks as complex quantum computing systems.  ...  Thus, the interaction between the input and output neurons, expressed by the operator ĝ_S, leads to the neural computation of the Boolean functions, so that the network yields the correct output through  ... 
doi:10.14704/nq.2015.13.1.804 fatcat:qp72k7m5mfc4ra34g4dkjz2b5q

Binary Decision Diagrams and neural networks

P. W. C. Prasad, Ali Assi, Azam Beg
2007 Journal of Supercomputing  
The formal core of the developed neural network model (NNM) is a unique matrix for the complexity estimation over a set of BDDs derived from Boolean logic expressions with a given number of variables and  ...  The proposed model is capable of predicting the maximum BDD complexity (MaxBC) and the number of product terms (NPT) in the Boolean function that corresponds to the minimum BDD complexity (MinBC).  ...  Each input of an NN corresponds to a single attribute of the system being modeled. The output of the NN is the prediction we are trying to make.  ... 
doi:10.1007/s11227-006-0010-7 fatcat:3m4jgfhw6jcx7huwsixutcam5q
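For readers unfamiliar with the complexity measure being predicted, the sketch below builds a reduced ordered BDD by Shannon expansion under a fixed variable order and returns its node count, one common notion of BDD complexity. The function name, the exhaustive construction, and the parity example are illustrative assumptions, unrelated to the authors' neural network model.

    # Builds a reduced ordered BDD for a Boolean function by Shannon expansion
    # under the fixed order x0 < x1 < ... < x{n-1} and returns its node count
    # (internal nodes plus the two terminals). Exhaustive and exponential in n,
    # so only suitable for small illustrative functions.
    def bdd_size(f, n):
        unique = {}            # (level, low, high) -> node id
        TERM0, TERM1 = 0, 1    # terminal nodes for constant 0 and 1

        def build(level, assignment):
            if level == n:
                return TERM1 if f(assignment) else TERM0
            low = build(level + 1, assignment + (False,))
            high = build(level + 1, assignment + (True,))
            if low == high:                     # redundant test: skip this node
                return low
            key = (level, low, high)
            if key not in unique:               # share structurally identical nodes
                unique[key] = len(unique) + 2
            return unique[key]

        build(0, ())
        return len(unique) + 2

    # Example: 3-variable parity (XOR), whose reduced ordered BDD has
    # 5 internal nodes plus 2 terminals.
    print(bdd_size(lambda a: a[0] ^ a[1] ^ a[2], 3))  # 7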

Quantum Cybernetics and Complex Quantum Systems Science - A Quantum Connectionist Exploration [article]

Carlos Pedro Gonçalves
2014 arXiv   pre-print
Several examples of quantum feedforward neural networks are addressed in regards to Boolean functions' computation, multilayer quantum computation dynamics, entanglement and quantum complementarity.  ...  In this way, the notion of an autonomous quantum computing system is introduced in regards to quantum artificial intelligence, and applied to quantum artificial neural networks, considered as autonomous  ...  In particular, quantum feedforward neural networks are addressed and major points such as the problem of computation of Boolean functions and neural network computational complexity, multiple layers and  ... 
arXiv:1402.1141v1 fatcat:uln7wev4urcqxkicxuscjjfmp4

Solving the linearly inseparable XOR problem with spiking neural networks

Mirela Reljan-Delaney, Julie Wall
2017 2017 Computing Conference  
Spiking Neural Networks (SNN) are third-generation neural networks and are considered to be the most biologically plausible so far.  ...  The second experiment relied on the addition of receptive fields in order to filter the input.  ...  This work is interesting from the standpoint of proving the possibility that Boolean logic is acting itself out in neural systems and would merit further investigation by the process of reverse engineering  ... 
doi:10.1109/sai.2017.8252173 fatcat:pju2t6mucvetjbzx544vdbmb7a
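For context on the "linearly inseparable" part of the title, the short check below (a conventional threshold-unit sketch, not a spiking model; the weight grid and the hand-picked two-layer weights are arbitrary choices) confirms that no single threshold unit over a coarse weight grid reproduces XOR, while an OR/NAND/AND arrangement of three such units does.

    import itertools

    XOR = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}

    def step(z):                      # hard-threshold activation
        return 1 if z >= 0 else 0

    # Brute force over a coarse weight/bias grid: no single threshold unit
    # reproduces XOR, which is what linear inseparability means here.
    grid = [v / 2 for v in range(-6, 7)]
    single_ok = any(
        all(step(w1 * a + w2 * b + bias) == out for (a, b), out in XOR.items())
        for w1, w2, bias in itertools.product(grid, repeat=3)
    )
    print("single threshold unit solves XOR:", single_ok)   # False

    # Two hidden threshold units plus an output unit (hand-chosen weights:
    # h1 = OR, h2 = NAND, output = AND) do solve it.
    def two_layer_xor(a, b):
        h1 = step(a + b - 0.5)
        h2 = step(-a - b + 1.5)
        return step(h1 + h2 - 1.5)
    print(all(two_layer_xor(a, b) == out for (a, b), out in XOR.items()))  # True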

On quantum neural networks [article]

Alexandr A. Ezhov
2021 arXiv   pre-print
The modern definition of a quantum neural network, widespread in 2020, as a model or machine learning algorithm that combines the functions of quantum computing with artificial neural networks deprives quantum  ...  The early definition of a quantum neural network as a new field that combines classical neurocomputing with quantum computing was rather vague and satisfactory in the 2000s.  ...  Khromov for his attention to the work and its constructive criticism, and also expresses heartfelt gratitude to Yuri I. Ozhigov, who encouraged the preparation of this material.  ... 
arXiv:2104.07106v1 fatcat:rl4rdkel4bdbzfq6zyktnkkee4

Fuzzy modelling through logic optimization

A.F. Gobi, W. Pedrycz
2007 International Journal of Approximate Reasoning  
The introduced structure along with the learning mechanisms helps achieve high accuracy and interpretability (transparency) of the resulting model.  ...  We introduce a two-phase design process realizing adaptive logic processing in the form of structural and parametric optimization.  ...  Pedrycz) and the Natural Sciences and Engineering Research Council (NSERC) is gratefully acknowledged.  ... 
doi:10.1016/j.ijar.2006.06.026 fatcat:2ejpeknpuffffor3wrs3k7qig4

Boolean Models Guide Intentionally Continuous Information and Computation Inside the Brain

Germano Resconi
2019 Oriental journal of computer science and technology  
The steady state of the probabilities is the activation-state continuous function whose maximum and minimum are the values of the Boolean function associated with the activation time of spikes of the neuron  ...  After a long time, back propagation and many other neural models overcame the big problem in some cases, but not in all cases, creating a lot of uncertainty.  ...  With the new revised study of natural neural networks, we try to model the activation state and the activation time in the open and closed channel process.  ... 
doi:10.13005/ojcst12.03.03 fatcat:4ngmdbutpvbnvktp3tvwr3c7cy

Neural Network Surgery with Sets [article]

Jonathan Raiman, Susan Zhang, Christy Dennison
2020 arXiv   pre-print
We achieve this by allowing the model to operate over discrete sets of features and use set-based operations to determine the exact relationship between inputs and outputs, and how they change across tweaks  ...  We propose a solution to automatically determine which components of a neural network model should be salvaged and which require retraining.  ...  Acknowledgement We wish to thank the anonymous reviewers for their valuable feedback and remarks, along with Jakub Pachocki and Szymon Sidor for their work on gradient-based surgery.  ... 
arXiv:1912.06719v2 fatcat:qyqp5ef37jejrin73telddgn7e

Building a Chaotic Proved Neural Network [article]

Jacques M. Bahi and Christophe Guyeux and Michel Salomon
2011 arXiv   pre-print
Several neural networks with different architectures are trained to exhibit chaotic behavior.  ...  In this paper we establish a precise correspondence between the so-called chaotic iterations and a particular class of artificial neural networks: global recurrent multi-layer perceptrons.  ...  As said previously, a neural network is designed to model relationships between inputs and outputs.  ... 
arXiv:1101.4351v1 fatcat:4a3iywp2bzdxlhlskc34muqynu
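As a rough sketch of the chaotic iterations mentioned in the abstract (as they are usually defined in this line of work: at each step a strategy selects one Boolean component, which is replaced by the corresponding component of the iteration function applied to the whole state), the snippet below runs such an iteration with the vectorial negation as the iteration function. The choice of function, state size, and random strategy are illustrative assumptions, not the paper's trained networks.

    import random

    # Chaotic iterations: only the component selected by the strategy is
    # updated at each step; all other components are carried over unchanged.
    def chaotic_iterations(f, x0, strategy):
        x = list(x0)
        trajectory = [tuple(x)]
        for i in strategy:               # i = index of the component updated at this step
            x[i] = f(x)[i]
            trajectory.append(tuple(x))
        return trajectory

    negation = lambda x: [1 - b for b in x]              # vectorial negation on {0,1}^n
    strategy = [random.randrange(3) for _ in range(8)]   # random strategy over 3 components
    for state in chaotic_iterations(negation, (0, 0, 0), strategy):
        print(state)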

Characterization of multiscale logic operations in the neural circuits

2021 Frontiers in Bioscience-Landmark  
Results: For this, we begin the analysis with simple neuron models to account for basic Boolean logic operations at a single neuron level and then move on to the phenomenological neuron models to explain  ...  the neural computation from the viewpoints of neural dynamics and neural coding.  ...  Conflict of interest The authors declare no conflict of interest. References  ... 
doi:10.52586/4983 pmid:34719201 fatcat:5o4a5hecxnchhmq7hdm5ficiwi
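To make "basic Boolean logic operations at a single neuron level" concrete, the sketch below uses a McCulloch-Pitts-style threshold unit with the standard textbook weight and threshold choices (these particular values are not taken from the paper): AND, OR, and NOT each fit in one unit, while XOR requires combining several.

    # A threshold unit fires (returns 1) when its weighted input sum reaches
    # the threshold; the weights/thresholds below are the usual textbook ones.
    def neuron(inputs, weights, threshold):
        return 1 if sum(w * x for w, x in zip(weights, inputs)) >= threshold else 0

    AND = lambda a, b: neuron((a, b), (1, 1), 2)
    OR  = lambda a, b: neuron((a, b), (1, 1), 1)
    NOT = lambda a:    neuron((a,),   (-1,),  0)

    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "AND:", AND(a, b), "OR:", OR(a, b))
    print("NOT 0 =", NOT(0), "NOT 1 =", NOT(1))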

On the implementation of frontier-to-root tree automata in recursive neural networks

M. Gori, A. Kuchler, A. Sperduti
1999 IEEE Transactions on Neural Networks  
Specifically, we show that an FRAO (Mealy version) with m states, l input-output labels, and maximum rank N can be implemented by a recursive neural network with O( (log l+log m)lm log l+N log m ) units  ...  and four computational layers, i.e., without counting the input layer.  ...  Bottom: example of processing of the FRAO over a tree. Fig. 2. An example tree (upper half) being processed by a recursive neural network with one input, hidden, and output node (lower left).  ... 
doi:10.1109/72.809076 pmid:18252632 fatcat:ez2kw2zfmrcsrchf6ly6oglahy

Symbolic processing in neural networks

João Pedro Neto, Hava T. Siegelmann, J.Félix Costa
2003 Journal of the Brazilian Computer Society  
We introduce data types and show how to code and keep them inside the information flow of neural nets. Data types and control structures are part of a suitable programming language called NETDEF.  ...  These nets have a strong modular structure and a synchronization mechanism allowing sequential or parallel execution of subnets, despite the massively parallel nature of neural nets.  ...  Input/Output: To handle input from the environment and output results, NETDEF uses the channel primitives with two special sets of channels, IN_k (linked directly with input channel u_k) and OUT_k.  ... 
doi:10.1590/s0104-65002003000100005 fatcat:73waok6oz5b25bewuzarszzilu

Theory of neuromorphic computing by waves: machine learning by rogue waves, dispersive shocks, and solitons [article]

Giulia Marcucci, Davide Pierangeli, Claudio Conti
2019 arXiv   pre-print
A feed-forward three-layer model, with an encoding input layer, a wave layer, and a decoding readout, behaves as a conventional neural network in approximating mathematical functions, real-world datasets  ...  We study artificial neural networks with nonlinear waves as a computing reservoir. We discuss universality and the conditions to learn a dataset in terms of output channels and nonlinearity.  ...  Whether building heterogeneous deep computational systems with standard neural networks and waves provides computing advantages is an open question.  ... 
arXiv:1912.07044v1 fatcat:m5eanv7waffxbmfqrbszeskwue
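As a generic point of reference for the encoding/wave/readout architecture described above, the sketch below uses a fixed random nonlinear layer in place of the physical wave layer and trains only a linear readout by least squares. The layer sizes, tanh nonlinearity, and toy target function are all illustrative assumptions; the paper's reservoir is a nonlinear wave system, not a random matrix.

    import numpy as np

    rng = np.random.default_rng(0)
    n_in, n_res, n_samples = 1, 200, 400

    X = rng.uniform(-1, 1, size=(n_samples, n_in))
    y = np.sin(3 * X[:, 0])                              # toy function to approximate

    # Fixed (untrained) encoding + nonlinear layer standing in for the wave layer.
    W_in = rng.normal(size=(n_in, n_res))
    b = rng.normal(size=n_res)
    H = np.tanh(X @ W_in + b)

    # Only the linear readout is trained, here by ordinary least squares.
    W_out, *_ = np.linalg.lstsq(H, y, rcond=None)
    print("training MSE:", float(np.mean((H @ W_out - y) ** 2)))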
Showing results 1 — 15 out of 15,049 results