
Backpropagating through Markov Logic Networks

Patrick Betz, Mathias Niepert, Pasquale Minervini, Heiner Stuckenschmidt
2021 International Workshop on Neural-Symbolic Learning and Reasoning  
We integrate Markov Logic Networks with deep learning architectures operating on high-dimensional and noisy feature inputs.  ...  Instead of relaxing the discrete components into smooth functions, we propose an approach that allows us to backpropagate through standard statistical relational learning components using perturbation-based  ...  Backpropagating through Markov Logic: Instead of promoting a novel neural-symbolic model, the goal of this work is to combine already existing components and investigate how far we can get with a hybrid  ... 
dblp:conf/nesy/BetzNMS21 fatcat:w4v3p42eu5g67npis2alxngpf4
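As a rough illustration of the perturbation-based idea in the excerpt, the sketch below treats a discrete solver as a black box and estimates gradients from differences of perturbed solutions. The top-k "solver", the Gumbel noise, and all hyperparameters are placeholders of mine, standing in for the paper's actual MLN inference and estimator.

```python
# Hedged sketch (illustrative, not the paper's implementation): backpropagating
# through a non-differentiable discrete solver via perturbation-based gradient
# estimation, in the spirit of implicit maximum-likelihood estimation.
import numpy as np

def map_solver(scores, k=3):
    """Black-box discrete solver: 0/1 indicator of the k highest scores.
    In the paper's setting this role would be played by MLN (MAP) inference."""
    z = np.zeros_like(scores)
    z[np.argsort(scores)[-k:]] = 1.0
    return z

def perturbation_grad(scores, downstream_grad, lam=1.0, n_samples=16, k=3):
    """Surrogate d(loss)/d(scores): difference of perturbed MAP solutions at the
    original scores and at scores nudged against the downstream gradient."""
    grad = np.zeros_like(scores)
    for _ in range(n_samples):
        noise = np.random.gumbel(size=scores.shape)
        z_plus = map_solver(scores + noise, k)
        z_minus = map_solver(scores + noise - lam * downstream_grad, k)
        grad += (z_plus - z_minus) / lam
    return grad / n_samples

scores = np.random.randn(8)              # outputs of an upstream neural network
z = map_solver(scores)                   # forward pass: discrete structure
dL_dz = z - np.ones_like(z)              # pretend downstream gradient
print(perturbation_grad(scores, dL_dz))  # surrogate gradient w.r.t. the scores
```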

DeepPSL: End-to-end perception and reasoning with applications to zero shot learning [article]

Nigel P. Duffy, Sai Akhil Puranam, Sridhar Dasaratha, Karmvir Singh Phogat, Sunil Reddy Tiyyagura
2021 arXiv   pre-print
The key to our approach is to represent predicates in first-order logic using deep neural networks and then to approximately back-propagate through the HL-MRF and thus train every aspect of the first-order  ...  PSL represents first-order logic in terms of a convex graphical model -- Hinge-Loss Markov Random Fields (HL-MRFs).  ...  Markov Logic Networks (MLNs) [9, 10] and Probabilistic Soft Logic (PSL) [11, 1] map probabilistic first-order logic to a Markov network.  ... 
arXiv:2109.13662v3 fatcat:cgz6m554izfozijciy3jpc5c6e
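For readers unfamiliar with HL-MRFs, the standard hinge-loss density used in PSL has the following form (standard notation from the PSL literature, not necessarily the paper's):

```latex
% Hinge-loss Markov random field over continuous truth values y in [0,1]^n:
P(\mathbf{y} \mid \mathbf{x}) \;=\; \frac{1}{Z(\mathbf{x})}
  \exp\!\Big( -\sum_{j=1}^{m} \lambda_j \, \phi_j(\mathbf{y}, \mathbf{x}) \Big),
\qquad
\phi_j(\mathbf{y}, \mathbf{x}) \;=\; \big[\max\{\ell_j(\mathbf{y}, \mathbf{x}),\, 0\}\big]^{p_j}
```

where each ℓ_j is linear in the truth values and p_j ∈ {1, 2}; since every potential φ_j is convex, MAP inference is a convex optimization problem, which is the convexity the excerpt refers to.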

Relational Knowledge Extraction from Neural Networks

Manoel Vitor Macedo França, Artur S. d'Avila Garcez, Gerson Zaverucha
2015 Neural Information Processing Systems  
Empirical results obtained in comparison with a probabilistic model for relational learning, Markov Logic Networks, and a state-of-the-art Inductive Logic Programming system, Aleph, indicate that the proposed  ...  methodology achieves competitive accuracy results consistently in all datasets investigated, while either Markov Logic Networks or Aleph show considerably worse results in at least one dataset.  ...  CILP++ is being tested against a well-known ILP system, Aleph [23], and Markov Logic Networks (MLNs) [15].  ... 
dblp:conf/nips/FrancaGZ15 fatcat:3gb7oaarprewnjaueqe3diwuyi

Fuzzy Logic-Based Scenario Recognition from Video Sequences

E. Elbaşi
2013 Journal of Applied Research and Technology  
are control charts, and hidden Markov models.  ...  Overlapping between events is one of the problems, hence we applied a fuzzy logic technique to solve this problem. After using this method, the total accuracy increased from 95.6% to 97.2%.  ...  The best classification achieved by a neural network was with the multi-module backpropagation neural network.  ... 
doi:10.1016/s1665-6423(13)71578-5 fatcat:fk7j3zl6vjdedbq4amfoc7pom4

Recursive Random Fields

Daniel Lowd, Pedro M. Domingos
2007 International Joint Conference on Artificial Intelligence  
We propose to overcome this by allowing the features of Markov logic networks (MLNs) to be nested MLNs. We call this representation recursive random fields (RRFs).  ...  Markov logic [Richardson and Domingos, 2006] makes this conjunction probabilistic, as well as the universal quantifiers directly under it, but the rest of the tree remains purely logical.  ...  Markov Logic Networks: A Markov logic network (MLN) consists of a set of first-order formulas and weights, {(w_i, f_i)}, that serve as a template for constructing a Markov random field.  ... 
dblp:conf/ijcai/LowdD07 fatcat:nh42qclkfnfm3bpltistwknk6a
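The weighted-formula template mentioned in the excerpt corresponds to the standard MLN distribution over possible worlds x:

```latex
% Markov logic network: log-linear model over possible worlds, where n_i(x)
% counts the true groundings of formula f_i in world x (Richardson and Domingos, 2006).
P(X = x) \;=\; \frac{1}{Z}\,\exp\!\Big( \sum_{i} w_i \, n_i(x) \Big),
\qquad
Z \;=\; \sum_{x'} \exp\!\Big( \sum_{i} w_i \, n_i(x') \Big)
```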

Design and Implementation of Voice Controlled Wheelchair using MATLAB

Karande Kaushal Balu, Somani Sakshi, Zope Jagruti Dilip, Bhusari Balu
2022 ITM Web of Conferences  
Our proposed model is based on neural networks; the backpropagation algorithm is used to train the artificial neural networks on the user's voice commands, so it uses voice  ...  CONCLUSION: This work tackled the speech recognition problem by applying the backpropagation algorithm to neural networks.  ...  This input is fed to the neural network, where an LPC extractor is used to extract the features. The network is trained using both the feed-forward and backpropagation methods.  ... 
doi:10.1051/itmconf/20224401003 doaj:15b735bf91e248128140e34a612f62c0 fatcat:7cpq7bpwwfbnlo4grnkjhhunuq
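For context, training such a network by backpropagation amounts to gradient descent on the error; the update below is the textbook form in generic notation, not taken from this paper:

```latex
% Gradient-descent weight update with learning rate eta, activation function phi,
% unit activations o, weighted inputs net, and targets t:
\Delta w_{ij} \;=\; -\,\eta\,\frac{\partial E}{\partial w_{ij}} \;=\; \eta\,\delta_j\,o_i,
\qquad
\delta_j \;=\;
\begin{cases}
(t_j - o_j)\,\varphi'(\mathrm{net}_j) & \text{for an output unit } j,\\[4pt]
\varphi'(\mathrm{net}_j)\sum_{k}\delta_k\,w_{jk} & \text{for a hidden unit } j.
\end{cases}
```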

Recurrent policy gradients

D. Wierstra, A. Forster, J. Peters, J. Schmidhuber
2009 Logic Journal of the IGPL  
The approach involves approximating a policy gradient for a recurrent neural network by backpropagating return-weighted characteristic eligibilities through time.  ...  Reinforcement learning for partially observable Markov decision problems (POMDPs) is a challenge as it requires policies with an internal state.  ...  Like conventional neural networks, they can be trained using a special variant of backpropagation, backpropagation through time (BPTT) [17, 22].  ... 
doi:10.1093/jigpal/jzp049 fatcat:6hyjaw6labdahdmkl7a37xcdx4
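Schematically, the return-weighted eligibilities mentioned in the excerpt combine a REINFORCE-style gradient with a recurrent policy whose history state is unrolled by BPTT (simplified notation, not the paper's exact derivation):

```latex
% Policy gradient with a recurrent policy: log-likelihood gradients
% ("characteristic eligibilities") are weighted by returns R_t and computed by
% backpropagation through time over the history state h_t.
\nabla_\theta J(\theta) \;\approx\; \frac{1}{N}\sum_{n=1}^{N}\sum_{t=0}^{T}
  \big(R_t^{(n)} - b_t\big)\, \nabla_\theta \log \pi_\theta\!\big(a_t^{(n)} \mid h_t^{(n)}\big),
\qquad
h_t \;=\; f_\theta\big(h_{t-1}, o_t, a_{t-1}\big)
```

Here b_t is a baseline and o_t the observation; because h_t depends on θ through all earlier steps, the ∇ log π terms require backpropagation through time.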

Neural-network connection-admission control for ATM networks

R.-G. Cheng, C.-J. Chang
1997 IEE Proceedings - Communications  
NNCAC is suitable for designers who are not familiar with fuzzy-logic control schemes or who lack the requisite knowledge of CAC.  ...  A neural-network connection-admission control (NNCAC) method which can overcome these difficulties by preprocessing the neural-network input parameters is proposed.  ...  Similarly, the process for h'_i(t) is an (M_i + 1)-state birth-death Markov process.  ... 
doi:10.1049/ip-com:19971088 fatcat:756sgke5prf7jk5jwxapf2t7ma

Automatic Speaker Recognition using MFCC and Artificial Neural Network

2019 VOLUME-8 ISSUE-10, AUGUST 2019, REGULAR ISSUE  
In this paper, a new method is proposed for identifying the speaker using an artificial neural network.  ...  Using these extracted feature values, input samples are then created and, finally, classification is performed using a Multilayer Perceptron (MLP) which is trained by backpropagation.  ...  Recognition with MLP (feedforward backpropagation): The features thus obtained from MFCC will be used as input for recognition through our approach, a Multilayer Perceptron (MLP) feedforward neural network.  ... 
doi:10.35940/ijitee.a1010.1191s19 fatcat:hsca4o5qtbe4fneh4h3e5yhyjm
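A minimal sketch of a generic MFCC-plus-MLP pipeline of the kind described in the excerpt; the file names, labels, feature summarisation, and network size are illustrative placeholders, not the paper's exact configuration.

```python
# Hedged sketch: MFCC features summarised per utterance, classified by an MLP
# trained with backpropagation (sklearn's default solver).
import numpy as np
import librosa
from sklearn.neural_network import MLPClassifier

def mfcc_features(path, n_mfcc=13):
    """Load a waveform and summarise it as the per-coefficient mean of its MFCCs."""
    signal, sr = librosa.load(path, sr=None)
    mfcc = librosa.feature.mfcc(y=signal, sr=sr, n_mfcc=n_mfcc)  # (n_mfcc, frames)
    return mfcc.mean(axis=1)                                     # fixed-length vector

# Placeholder training data: one utterance per entry, labelled by speaker id.
train_files = [("spk0_utt0.wav", 0), ("spk1_utt0.wav", 1)]
X = np.array([mfcc_features(p) for p, _ in train_files])
y = np.array([label for _, label in train_files])

clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
clf.fit(X, y)
print(clf.predict([mfcc_features("spk0_utt1.wav")]))  # predicted speaker id
```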

Detection of Cardiomyopathy using Support Vector Machine and Artificial Neural Network

Rabiya Begum, Manza Ramesh
2016 International Journal of Computer Applications  
It's less able to pump blood through the body and maintain a normal electrical rhythm.  ...  applied for noise cancellation and baseline correction; then four time-based features have been extracted and, finally, classification has been performed using Support Vector Machines and Artificial Neural Networks  ...  Digital signal analysis, fuzzy logic methods, artificial neural networks, hidden Markov models, genetic algorithms, Support Vector Machines, self-organizing maps, and Bayesian methods are various classification techniques  ... 
doi:10.5120/ijca2016908178 fatcat:7k5ewmyw3bag7fglxfl7vdz5jm

Sign Language Recognition System using Neural Network for Digital Hardware Implementation

Lorena P Vargas, Leiner Barba, C O Torres, L Mattos
2011 Journal of Physics, Conference Series  
This work presents an image pattern recognition system using a neural network for the identification of sign language for deaf people.  ...  The system has several stored images that show the specific symbols in this kind of language, which are employed to train a multilayer neural network using a backpropagation algorithm.  ...  Neural Network Model: A multilayer neural network was used in the design with a backpropagation algorithm.  ... 
doi:10.1088/1742-6596/274/1/012051 fatcat:7lb42elspndrtow3jq36qycpdq

Sequence learning: from recognition and prediction to sequential decision making

R. Sun, C.L. Giles
2001 IEEE Intelligent Systems  
Several neural network models deal with sequences; one example is recurrent backpropagation networks.  ...  Models might be in the form of Markov chains, hidden Markov models, recurrent neural networks, or a variety of other forms.  ... 
doi:10.1109/mis.2001.1463065 fatcat:safqkf2ovnanteqrgfueamn7mi

Ensemble Neural Network in Classifying Handwritten Arabic Numerals

Kathirvalavakumar Thangairulappan, Palaniappan Rathinasamy
2016 Journal of Intelligent Learning Systems and Applications  
Compressing the matrix representation by merging adjacent pairs of rows using a logical OR operation reduces its size by half.  ...  Leaders of clusters of partitions are used to recognize the patterns by a divide-and-conquer approach using the proposed ensemble neural network.  ...  Conclusion: The novelty of this work is the recognition of digits through an ensemble neural network. Each digit is converted into matrix form and then compressed using a logical OR operation.  ... 
doi:10.4236/jilsa.2016.81001 fatcat:gnozcdsnfjg4zdxrmsnr6jztf4
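A small sketch of the row-compression step described in the excerpt: adjacent pairs of rows of a binary digit matrix are merged with a logical OR, halving the row count. The matrix size and contents are illustrative assumptions.

```python
# Hedged sketch: OR-merge adjacent row pairs of a binary image (assumes an even
# number of rows), reducing the matrix to half its original height.
import numpy as np

def compress_rows(binary_image):
    """OR together rows 0|1, 2|3, ... of a binary matrix."""
    assert binary_image.shape[0] % 2 == 0, "expects an even number of rows"
    return np.logical_or(binary_image[0::2], binary_image[1::2]).astype(np.uint8)

digit = (np.random.rand(16, 12) > 0.7).astype(np.uint8)  # toy 16x12 binary digit
print(compress_rows(digit).shape)                         # -> (8, 12)
```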

Space of Reasons and Mathematical Model [article]

Florian Richter
2020 arXiv   pre-print
network.  ...  The conceptual background of representation in models is discussed, and in the end I propose how implications of propositional logic and conceptual determinations can be represented in a model of a neural  ...  [7] In neural networks, the neurons are adjusted through backpropagation in order to bring the output closer to the desired output.  ... 
arXiv:2007.02489v1 fatcat:ochdtxsfczahnlpcnrlk54ivje

Evolution of an artificial visual cortex for image recognition

Samuel Chapman, David Knoester, Arend Hintze, Christoph Adami
2013 Advances in Artificial Life, ECAL 2013  
This work demonstrates that evolving logic circuits to solve a classification task is feasible.  ...  The logic circuits are encoded in a genome that is evolved using a fitness function based on the true positive and true negative classification rates of the numerals.  ...  Finally, because Markov networks represent logic circuits, they are capable of being rendered on physical hardware such as FPGAs.  ... 
doi:10.7551/978-0-262-31709-2-ch160 dblp:conf/ecal/ChapmanKHA13 fatcat:jpuhohpa4raetejcqvhpqr2zfm
Showing results 1 — 15 out of 2,481 results