Some improvements of the spectral learning approach for probabilistic grammatical inference

Mattias Gybels, François Denis, Amaury Habrard
2014 International Conference on Grammatical Inference  
Spectral methods propose new and elegant solutions in probabilistic grammatical inference. We propose two ways to improve them.  ...  We show how a linear representation, or equivalently a weighted automaton, output by the spectral learning algorithm can be taken as an initial point for the Baum-Welch algorithm, in order to increase the  ...  The spectral learning scheme for probabilistic grammatical inference then consists of building the empirical Hankel matrix H_S from S, performing a singular value decomposition (SVD) of H_S and building  ... 
dblp:conf/icgi/GybelsDH14 fatcat:pp5qtngldndp3aw2pywus4b3n4
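
The last snippet above spells out the standard spectral recipe: build the empirical Hankel matrix H_S from the sample S, compute its singular value decomposition, and read a linear representation (a weighted automaton) off the factors. Below is a minimal sketch of that recipe, assuming a two-letter alphabet, a small hand-picked prefix/suffix basis, and a toy sample; these choices, and all variable names, are illustrative and not taken from the paper.

import numpy as np

alphabet = ["a", "b"]
prefixes = ["", "a", "b", "aa"]           # rows of the Hankel matrix (basis P)
suffixes = ["", "a", "b", "ab"]           # columns of the Hankel matrix (basis S)
sample = ["", "a", "ab", "a", "b", "aab", "ab", "a"]   # toy sample S

def emp_prob(x, sample):
    # Empirical probability of the whole string x in the sample.
    return sample.count(x) / len(sample)

def hankel(sample, middle=""):
    # H[i, j] = estimated probability of prefix_i + middle + suffix_j.
    return np.array([[emp_prob(p + middle + s, sample) for s in suffixes]
                     for p in prefixes])

H = hankel(sample)                                   # empirical Hankel matrix H_S
H_sigma = {c: hankel(sample, c) for c in alphabet}   # shifted Hankel matrices
h_P = np.array([emp_prob(p, sample) for p in prefixes])
h_S = np.array([emp_prob(s, sample) for s in suffixes])

rank = 2                                             # chosen number of states
U, D, Vt = np.linalg.svd(H)
V = Vt[:rank].T                                      # top right singular vectors

pinv = np.linalg.pinv(H @ V)
alpha0 = h_S @ V                                     # initial weight vector
alpha_inf = pinv @ h_P                               # final weight vector
A = {c: pinv @ H_sigma[c] @ V for c in alphabet}     # one operator per symbol

def wfa_value(x):
    # Value the learned linear representation assigns to string x.
    v = alpha0.copy()
    for c in x:
        v = v @ A[c]
    return float(v @ alpha_inf)

print(wfa_value("ab"))

The operators learned this way could then serve as the starting point for Baum-Welch refinement, which is the improvement the entry describes.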

Introduction to the Special Issue on Grammatical Inference

Jeffrey Heinz, C. de la Higuera, Tim Oates
2013 Machine Learning  
As a field, Grammatical Inference addresses both theoretical and empirical learning problems, and the collection of papers within this special issue attests both to the diversity of these problems as well  ...  Thus we hope this special issue is of interest to the readership of Machine Learning.  ...  While Grammatical Inference is typically concerned with learning from batches of strings, the problem of learning from data streams introduces a new set of questions.  ... 
doi:10.1007/s10994-013-5428-6 fatcat:b5ehf6qurvcp5nhex4bjw7w3vm

Spectral learning of weighted automata

Borja Balle, Xavier Carreras, Franco M. Luque, Ariadna Quattoni
2013 Machine Learning  
One such class of methods comprises the so-called spectral algorithms that measure learning complexity in terms of the smallest singular value of some Hankel matrix.  ...  One of the goals of this paper is to remedy this situation by presenting a derivation of the spectral method for learning WFA that, without sacrificing rigor and mathematical elegance, puts emphasis on providing  ...  Acknowledgements We are grateful to the anonymous reviewers for providing us with helpful comments. This work was supported by a Google Research Award, and by projects XLike (FP7-288342), BASMATI  ... 
doi:10.1007/s10994-013-5416-x fatcat:gdkrhg3qpvcvzchkbwuw62j6ja

Backdoors in Neural Models of Source Code [article]

Goutham Ramakrishnan, Aws Albarghouthi
2020 arXiv   pre-print
and improve recent algorithms from robust statistics for our setting, showing that backdoors leave a spectral signature in the learned representation of source code, thus enabling detection of poisoned  ...  We study backdoors in the context of deep-learning for source code. (1) We define a range of backdoor classes for source-code tasks and show how to poison a dataset to install such backdoors. (2) We adapt  ...  The same expression e and string s are inserted into all poisoned elements. Second, grammatical triggers add pieces of dead code drawn randomly from some probabilistic grammar.  ... 
arXiv:2006.06841v1 fatcat:itdhfdeg2fgnzgk432uz6qlhwu
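
The detection step mentioned in this entry is in the spirit of spectral-signature analysis: examples whose learned representations project unusually strongly onto the top singular direction of the centered representation matrix are flagged as likely poisoned. The sketch below shows only that generic scoring step, not the paper's adapted algorithm; the random representation matrix and the number of flagged examples are placeholders.

import numpy as np

def spectral_outlier_scores(reps):
    # Score each example by its squared projection onto the top right
    # singular vector of the centered representation matrix.
    centered = reps - reps.mean(axis=0, keepdims=True)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    top_direction = vt[0]
    return (centered @ top_direction) ** 2

rng = np.random.default_rng(0)
reps = rng.normal(size=(1000, 64))        # stand-in for learned representations
scores = spectral_outlier_scores(reps)
suspects = np.argsort(scores)[-50:]       # highest-scoring examples
print(len(suspects), "examples flagged for manual inspection")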

Statistical methods in language processing

Steven Abney
2010 Wiley Interdisciplinary Reviews: Cognitive Science  
It is characterized by the use of stochastic models, substantial data sets, machine learning, and rigorous experimental evaluation.  ...  There has, however, been little penetration of the methods into general linguistics. The methods themselves are largely borrowed from machine learning and information theory.  ...  We treat classification and semisupervised learning here, and grammatical inference, a variety of unsupervised learning, arises in the following sections.  ... 
doi:10.1002/wcs.111 pmid:26302079 fatcat:qnockuwjdzagxjgjqev36kwsee

Flexible State-Merging for Learning (P)DFAs in Python

Christian A. Hammerschmidt, Benjamin Loos, Radu State, Thomas Engel
2016 International Conference on Grammatical Inference  
We present a Python package for learning (non-)probabilistic deterministic finite state automata and provide heuristics in the red-blue framework.  ...  It provides data scientists with PDFA learning as an additional tool for sequence prediction or classification, without the need to understand the algorithm itself but rather the limitations of PDFA as a  ...  Acknowledgments This work was partially funded by the FNR AFR-PPP grant PAULINE, the STW VENI project 13136 MANTA, and NWO project 62001628 LEMMA.  ... 
dblp:conf/icgi/HammerschmidtLS16 fatcat:62dr56ytsrgaxhzerwnfpae3ru
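
For readers unfamiliar with the red-blue framework this entry refers to, the sketch below is a minimal Alergia-style version of state merging on a frequency prefix tree: each blue frontier state is either merged into a statistically compatible red state or promoted to red. The local Hoeffding test, the data structures, and the toy sample are deliberate simplifications and do not reflect this package's heuristics or API.

import math

class State:
    def __init__(self):
        self.total = 0          # strings passing through this state
        self.stop = 0           # strings ending in this state
        self.trans = {}         # symbol -> (child state, transition count)

def build_prefix_tree(sample):
    root = State()
    for x in sample:
        node = root
        node.total += 1
        for c in x:
            child, count = node.trans.get(c, (State(), 0))
            node.trans[c] = (child, count + 1)
            child.total += 1
            node = child
        node.stop += 1
    return root

def close(f1, n1, f2, n2, alpha=0.05):
    # Hoeffding-style test: are two empirical frequencies statistically close?
    if n1 == 0 or n2 == 0:
        return True
    bound = math.sqrt(0.5 * math.log(2 / alpha)) * (1 / math.sqrt(n1) + 1 / math.sqrt(n2))
    return abs(f1 / n1 - f2 / n2) < bound

def compatible(red_state, blue_state, alphabet):
    # Local compatibility: stopping frequency and per-symbol frequencies agree.
    if not close(red_state.stop, red_state.total, blue_state.stop, blue_state.total):
        return False
    for c in alphabet:
        r_count = red_state.trans.get(c, (None, 0))[1]
        b_count = blue_state.trans.get(c, (None, 0))[1]
        if not close(r_count, red_state.total, b_count, blue_state.total):
            return False
    return True

def fold(red_state, blue_state):
    # Merge the (tree-shaped) blue subtree into the red state, adding counts.
    red_state.total += blue_state.total
    red_state.stop += blue_state.stop
    for c, (b_child, b_count) in blue_state.trans.items():
        if c in red_state.trans:
            r_child, r_count = red_state.trans[c]
            red_state.trans[c] = (r_child, r_count + b_count)
            fold(r_child, b_child)
        else:
            red_state.trans[c] = (b_child, b_count)

def learn_pdfa(sample, alphabet):
    root = build_prefix_tree(sample)
    red = [root]
    while True:
        # Blue states: non-red targets of transitions leaving red states.
        blue = [(r, c) for r in red for c, (child, _) in r.trans.items()
                if child not in red]
        if not blue:
            return red
        parent, symbol = blue[0]
        blue_state, count = parent.trans[symbol]
        target = next((r for r in red if compatible(r, blue_state, alphabet)), None)
        if target is None:
            red.append(blue_state)                   # promote to red
        else:
            parent.trans[symbol] = (target, count)   # redirect the transition
            fold(target, blue_state)                 # merge the subtree

red_states = learn_pdfa(["ab", "ab", "a", "abb", "b", "ab", "a"], ["a", "b"])
print(len(red_states), "states after merging")

Real tools replace the local test with recursive or evidence-driven heuristics, which is exactly the part the red-blue framework leaves open.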

PAutomaC: a probabilistic automata and hidden Markov models learning competition

Sicco Verwer, Rémi Eyraud, Colin de la Higuera
2013 Machine Learning  
The Probabilistic Automata learning Competition (PAutomaC), run in 2012, was the first grammatical inference challenge that allowed the comparison between these methods and algorithms.  ...  Its main goal was to provide an overview of the state-of-the-art techniques for this hard learning problem.  ...  Acknowledgements We are very thankful to the members of the scientific committee for their help in designing this competition.  ... 
doi:10.1007/s10994-013-5409-9 fatcat:ey3ghvxqxzbevmpzxmv5tytvay

FlexFringe: Modeling Software Behavior by Learning Probabilistic Automata [article]

Sicco Verwer, Christian Hammerschmidt
2022 arXiv   pre-print
We present the efficient implementations of probabilistic deterministic finite automaton learning methods available in FlexFringe.  ...  We show that learning smaller, more convoluted models, although less interpretable, improves the performance of FlexFringe on anomaly detection, outperforming an existing solution based on neural nets.  ...  In the grammatical inference community, there has been much research into improving merging in the red-blue framework.  ... 
arXiv:2203.16331v1 fatcat:jauvzhv63ze5loco5couv3ylse

Neural Network Based Nonlinear Weighted Finite Automata [article]

Tianyu Li, Guillaume Rabusseau, Doina Precup
2017 arXiv   pre-print
Our learning algorithm is inspired by the spectral learning algorithm for WFA and relies on a nonlinear decomposition of the so-called Hankel matrix, by means of an auto-encoder network.  ...  The expressive power of NL-WFA and the proposed learning algorithm are assessed on both synthetic and real-world data, showing that NL-WFA can lead to smaller model sizes and infer complex grammatical  ...  In order to give some insights and further motivate our approach, we will first show how the spectral method can be interpreted as a representation learning scheme.  ... 
arXiv:1709.04380v2 fatcat:2dgllrtzurbrfd3cz62kokusri
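
The core idea in the snippet above, replacing the linear (SVD) factorization of the Hankel matrix with a nonlinear factorization learned by an auto-encoder, can be illustrated in a few lines. This shows only the decomposition step on a stand-in Hankel matrix, not the paper's full NL-WFA construction; the matrix size, latent dimension, and training schedule are arbitrary choices for the sketch.

import torch
import torch.nn as nn

hankel = torch.rand(20, 20)          # stand-in for an empirical Hankel matrix
latent_dim = 4                       # plays the role of the number of states

encoder = nn.Sequential(nn.Linear(20, latent_dim), nn.Tanh())
decoder = nn.Linear(latent_dim, 20)
params = list(encoder.parameters()) + list(decoder.parameters())
optimizer = torch.optim.Adam(params, lr=1e-2)
loss_fn = nn.MSELoss()

for step in range(2000):
    optimizer.zero_grad()
    latent = encoder(hankel)         # nonlinear analogue of the rank-r factor
    recon = decoder(latent)          # reconstruction of the Hankel rows
    loss = loss_fn(recon, hankel)
    loss.backward()
    optimizer.step()

print(f"reconstruction error with {latent_dim} latent dimensions: {loss.item():.4f}")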

Review of: Speech and language processing

Sheila Garfield
2001 Cognitive Systems Research  
apart from some typographical errors. In summary, the book is highly recommended for all involved in  ...  Kaplan, R. M., & Kay, M. (1994). Regular models of phonological rule systems.  ...  notion of spectral features  ...  State Transducers  ...  The ideas outlined cover discussion of machine learning approaches, namely meaning representations using formal structures; feature vectors, supervised learning, bootstrapping meaning representation languages  ... 
doi:10.1016/s1389-0417(01)00022-5 fatcat:cxemu3sqbjfmlnk5wlhrpspguq

Editors' Introduction [chapter]

Sanjay Jain, Rémi Munos, Frank Stephan, Thomas Zeugmann
2013 Lecture Notes in Computer Science  
In his invited talk Towards General Algorithms for Grammatical Inference, Alexander Clark deals with the learning of context-free languages and multiple context-free languages.  ...  approaches.  ...  In their paper A Spectral Approach for Probabilistic Grammatical Inference of Trees, Raphaël Bailly, François Denis and Amaury Habrard consider distributions over the set of trees which are computed by  ... 
doi:10.1007/978-3-642-40935-6_1 fatcat:pchrsvhjezfbvh6dfplqhxhgcy

Learning Probability Distributions Generated by Finite-State Machines [chapter]

Jorge Castro, Ricard Gavaldà
2016 Topics in Grammatical Inference  
We review methods for inference of probability distributions generated by probabilistic automata and related models for sequence generation.  ...  The methods we review are state-merging and state-splitting methods for probabilistic deterministic automata and the recently developed spectral method for nondeterministic probabilistic automata.  ...  We thank the chairs of ICGI 2012 for the invitation to present a preliminary version of this work as a tutorial. We particularly thank the reviewer of this version for their thorough and useful work.  ... 
doi:10.1007/978-3-662-48395-4_5 fatcat:u4cepbpghjcv7ct6zoqrgir2cy

Evaluation of Machine Learning Methods on SPiCe

Ichinari Sato, Kaizaburo Chubachi, Diptarama
2016 International Conference on Grammatical Inference  
The experimental results show that the XGBoost and neural network approaches have good performance overall.  ...  In this paper, we introduce the methods that we used to solve problems from the sequence prediction competition called SPiCe.  ...  p is a probability distribution used to predict the next symbol for the prefix.  ... 
dblp:conf/icgi/SatoCD16 fatcat:jmap6ztzdzakppghklt3oqmegm
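
The task in the last snippet, producing a distribution p over the next symbol given a prefix, can be illustrated with a far simpler baseline than the XGBoost and neural network models evaluated in the paper: add-one-smoothed bigram counts. The alphabet, training strings, and start/end markers below are toy placeholders.

from collections import Counter, defaultdict

alphabet = ["a", "b", "$"]              # "$" marks the end of a sequence
train = ["ab", "aab", "b", "abab"]      # toy training strings

bigrams = defaultdict(Counter)
for x in train:
    padded = "^" + x + "$"              # "^" marks the start of a sequence
    for prev, nxt in zip(padded, padded[1:]):
        bigrams[prev][nxt] += 1

def next_symbol_distribution(prefix):
    # p(next symbol | prefix) from add-one-smoothed bigram counts.
    context = prefix[-1] if prefix else "^"
    counts = bigrams[context]
    total = sum(counts.values()) + len(alphabet)
    return {c: (counts[c] + 1) / total for c in alphabet}

p = next_symbol_distribution("ab")
print(sorted(p.items(), key=lambda kv: -kv[1]))   # candidate next symbols, ranked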

Automatic recognition and understanding of spoken language - a first step toward natural human-machine communication

Biing-Hwang Juang, S. Furui
2000 Proceedings of the IEEE  
Statistical methods are designed to allow the machine to learn, directly from data, structural regularities in the speech signal for the purpose of automatic speech recognition and understanding.  ...  In this paper, we summarize the development of spoken language technology from both a vertical (the chronology) and a horizontal (the spectrum of technical approaches) perspective.  ...  The speech waveform has various kinds of features; for example, some pertain to the gender of the speaker, some relate to the quality of the sound, and some carry the necessary information for the intended  ... 
doi:10.1109/5.880077 fatcat:6ca4ebtwcbg4tl6bgcvgtr2gry

Inductive Logic and Empirical Psychology [chapter]

Nick Chater, Mike Oaksford, Ulrike Hahn, Evan Heit
2011 Handbook of the History of Logic  
Acknowledgements: Nick Chater is supported by a Senior Research Fellowship from the Leverhulme Trust, and the ESRC Centre for Economic Learning and Social Evolution (ELSE).  ...  We briefly consider here some of the many concerns that may be raised against probabilistic approaches.  ...  Consequently, the inference to the conclusion Some P are R is probabilistically valid (p-valid).  ... 
doi:10.1016/b978-0-444-52936-7.50014-8 fatcat:ex776zoztrd63nva3rjuadzfje
Showing results 1–15 of 607