
Learning to predict non-deterministically generated strings

Moshe Koppel
1991 Machine Learning  
In this article we present an algorithm that learns to predict non-deterministically generated strings.  ...  The problem of learning to predict non-deterministically generated strings was raised by Dietterich and Michalski (1986).  ...  Introduction In order to illustrate what we mean by learning non-deterministically generated strings, we consider the case of language acquisition.  ... 
doi:10.1007/bf00058927 fatcat:p2piauq2lraqhfxxetda3adjgm

PAutomaC: a probabilistic automata and hidden Markov models learning competition

Sicco Verwer, Rémi Eyraud, Colin de la Higuera
2013 Machine Learning  
Approximating distributions over strings is a hard learning problem.  ...  Both artificial data and real data were presented and contestants were to try to estimate the probabilities of strings.  ...  Acknowledgements We are very thankful to the members of the scientific committee for their help in designing this competition.  ... 
doi:10.1007/s10994-013-5409-9 fatcat:ey3ghvxqxzbevmpzxmv5tytvay

Rapid on-line temporal sequence prediction by an adaptive agent

Steven Jensen, Daniel Boley, Maria Gini, Paul Schrater
2005 Proceedings of the fourth international joint conference on Autonomous agents and multiagent systems - AAMAS '05  
We consider the case of near-future event prediction by an online learning agent operating in a non-stationary environment.  ...  The method compares well against Markov-chain predictions, and adapts faster than learned Markov-chain models to changes in the underlying distribution of events.  ...  This method yielded non-stationary strings in which highly deterministic sections from one process were followed by highly deterministic sections from a different process.  ... 
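The paper's own adaptive method is not reproduced in the snippet above; as an illustrative sketch only, a first-order Markov-chain predictor of the kind the method is compared against could look like this (the class name `MarkovPredictor` is a hypothetical choice, not from the paper):

```python
from collections import defaultdict

class MarkovPredictor:
    """First-order Markov baseline: count symbol-to-symbol transitions
    online and predict the most frequent successor of the last symbol."""

    def __init__(self):
        # counts[prev][next] = number of times `next` followed `prev`
        self.counts = defaultdict(lambda: defaultdict(int))
        self.last = None

    def observe(self, symbol):
        if self.last is not None:
            self.counts[self.last][symbol] += 1
        self.last = symbol

    def predict(self):
        successors = self.counts.get(self.last)
        if not successors:
            return None  # no transitions observed from this symbol yet
        return max(successors, key=successors.get)
```

Because the counts are updated online, such a baseline tracks a non-stationary source only as fast as new counts outweigh old ones, which is the adaptation limitation the paper's method targets.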
doi:10.1145/1082473.1082484 dblp:conf/atal/JensenBGS05 fatcat:inib7hsksbeepp2rt2jhpxnb4u

Results of the PAutomaC Probabilistic Automaton Learning Competition

Sicco Verwer, Rémi Eyraud, Colin de la Higuera
2012 Journal of machine learning research  
Approximating distributions over strings is a hard learning problem.  ...  Both artificial data and real data were proposed and contestants were to try to estimate the probabilities of test strings.  ...  Acknowledgments We are very thankful to the members of the scientific committee for their help in designing this competition.  ... 
dblp:journals/jmlr/VerwerEH12 fatcat:orcojpoilveyliyoaeteagyo6e

Predictability of imitative learning trajectories

Paulo R A Campos, José F Fontanari
2019 Journal of Statistical Mechanics: Theory and Experiment  
The learning trajectories become more deterministic, in the sense that there are fewer distinct trajectories and those trajectories are more similar to each other, with increasing population size and imitation  ...  We assess the degree to which the starting and ending points determine the learning trajectories using two measures, namely, the predictability that yields the probability that two randomly chosen trajectories  ...  In the evolutionary algorithms, however, there is no such natural choice: the fittest string at a given generation is more likely to contribute offspring to the next generation but does not have  ... 
doi:10.1088/1742-5468/aaf634 fatcat:6py7fcwbzja7dni4o26bor4lhe

Ten Open Problems in Grammatical Inference [chapter]

Colin de la Higuera
2006 Lecture Notes in Computer Science  
They cover the areas of polynomial learning models, learning from ordered alphabets, learning deterministic POMDPs, learning negotiation processes, and learning from context-free background knowledge.  ...  In all cases, problems are theoretically oriented but correspond to practical questions.  ...  Acknowledgements Thanks to Jose Oncina for different discussions that led to several definitions and problems from sections 4 and 6.  ... 
doi:10.1007/11872436_4 fatcat:5snm4lpumbhw5e7ffh5vwxzhum

Universal Learning Theory [article]

Marcus Hutter
2011 arXiv   pre-print
It explains the spirit of universal learning, but necessarily glosses over technical subtleties.  ...  This encyclopedic article gives a mini-introduction to the theory of universal learning, founded by Ray Solomonoff in the 1960s and significantly developed and extended in the last decade.  ...  One solution is to take into account our (whole) scientific prior knowledge z [Hut06], and predicting the now-long string zx leads to good predictions (less sensitive to a "reasonable" U).  ... 
arXiv:1102.2467v1 fatcat:m6voura42jcmvknemk7cbb7qf4

Strongly Unambiguous Büchi Automata Are Polynomially Predictable With Membership Queries

Dana Angluin, Timos Antonopoulos, Dana Fisman, Michael Wagner
2020 Annual Conference for Computer Science Logic  
In contrast, under plausible cryptographic assumptions, non-deterministic Büchi automata are not polynomially predictable with membership queries.  ...  using a non-deterministic Büchi automaton (Theorem 1), it is polynomially predictable with membership queries when the target language is represented using a strongly unambiguous Büchi automaton (Corollary  ...  Finally, when A requests the test word to predict, we request the test word to predict, and receive a string x ∈ Σ*, chosen according to D.  ... 
doi:10.4230/lipics.csl.2020.8 dblp:conf/csl/AngluinAF20 fatcat:gqrbomby4jcytg57umnctwqdzi

Pseudo-Derandomizing Learning and Approximation

Igor Carboni Oliveira, Rahul Santhanam, Michael Wagner
2018 International Workshop on Approximation Algorithms for Combinatorial Optimization  
Our goal is to simulate known randomized algorithms in these settings by pseudo-deterministic algorithms in a generic fashion, a goal we succinctly term pseudo-derandomization. Learning.  ...  In particular, this suggests a new approach to constructing hitting set generators against AC⁰[p] circuits by giving a deterministic learning algorithm for AC⁰[p]. Approximation.  ...  Acknowledgements We thank Chris Brzuska for bringing [3] to our attention, Roei Tell for helpful discussions, and the reviewers for comments that improved the presentation.  ... 
doi:10.4230/lipics.approx-random.2018.55 dblp:conf/approx/OliveiraS18 fatcat:gjphuxvvubakxevkekyacqxryq

Interpolated Spectral NGram Language Models

Ariadna Quattoni, Xavier Carreras
2019 Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics  
In this work we employ a technique for scaling up spectral learning, and use interpolated predictions that are optimized to maximize perplexity.  ...  The second is that the loss function behind spectral learning, based on moment matching, differs from the probabilistic metrics used to evaluate language models.  ...  Acknowledgments We are grateful to Matthias Gallé for the discussions around this work, as well as to the anonymous reviewers for their useful feedback.  ... 
doi:10.18653/v1/p19-1594 dblp:conf/acl/QuattoniC19 fatcat:umjd54ixobcalfga6he45bypmu

Efficiency in the Identification in the Limit Learning Paradigm [chapter]

Rémi Eyraud, Jeffrey Heinz, Ryo Yoshinaka
2016 Topics in Grammatical Inference  
Such models provide a framework to study the behavior of learning algorithms and to formally establish their soundness.  ...  On the other hand, a theoretical approach is possible by using a learning paradigm, which is an attempt to formalize what learning means.  ...  We extend this order to non-empty finite sets of strings: S1 ◁ S2 iff S1 < S2 or S1 = S2 and ∃w ∈ S1 − S2 such that ∀w′ ∈ S2, either w′ ∈ S1 or w ◁ w′.  ... 
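The chapter's set-order definition builds on an underlying order on individual strings; a minimal sketch of the standard length-lexicographic (shortlex) order typically used for such enumerations, assuming shortlex is indeed the base order (the function names are illustrative, not from the chapter):

```python
def shortlex_key(w):
    """Shortlex key: shorter strings come first, ties broken
    alphabetically -- the usual enumeration order of strings
    in grammatical inference."""
    return (len(w), w)

def shortlex_less(u, v):
    """True iff u strictly precedes v in shortlex order."""
    return shortlex_key(u) < shortlex_key(v)
```

Sorting by this key enumerates Σ* in the order ε, a, b, ..., aa, ab, ..., which is what makes "first string not yet accounted for" arguments well-defined in identification-in-the-limit proofs.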
doi:10.1007/978-3-662-48395-4_2 fatcat:6wfzms6wy5dhjaglpxccvczqg4

Results of the Sequence PredIction ChallengE (SPiCe): a Competition on Learning the Next Symbol in a Sequence

Borja Balle, Rémi Eyraud, Franco M. Luque, Ariadna Quattoni, Sicco Verwer
2016 International Conference on Grammatical Inference  
The aim was to submit a ranking of the 5 symbols most likely to be the next symbol of each prefix.  ...  The Sequence PredIction ChallengE (SPiCe) is an on-line competition that took place between March and July 2016.  ...  Evaluation Metrics The SPiCe competition focuses on the ability of the learned models to predict the next symbol in a string.  ... 
dblp:conf/icgi/BalleELQV16 fatcat:oevhzjg63jeslizr5os4bfikkm
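SPiCe's actual scoring metric is not shown in the snippet; purely as an illustrative baseline of the kind a contestant might start from, a back-off n-gram ranker of the top-5 next symbols could be sketched as follows (`train_ngram` and `rank_next` are hypothetical names, not from the competition code):

```python
from collections import Counter, defaultdict

def train_ngram(sequences, order=2):
    """Count next-symbol frequencies after every context of length <= order."""
    counts = defaultdict(Counter)
    for seq in sequences:
        for i, sym in enumerate(seq):
            for k in range(order + 1):
                if i - k >= 0:
                    counts[tuple(seq[i - k:i])][sym] += 1
    return counts

def rank_next(counts, prefix, order=2, top=5):
    """Rank the `top` most frequent next symbols, backing off to
    shorter contexts when the full context was never seen."""
    for k in range(min(order, len(prefix)), -1, -1):
        ctx = tuple(prefix[len(prefix) - k:])
        if counts.get(ctx):
            return [s for s, _ in counts[ctx].most_common(top)]
    return []
```

Submitting the five symbols returned by `rank_next` for each test prefix mirrors the competition's required output format, though real entries used far stronger models (spectral methods, RNNs).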

Position Models and Language Modeling [chapter]

Arnaud Zdziobeck, Franck Thollard
2008 Lecture Notes in Computer Science  
We propose here to improve the use of this model by restricting the dependency to a more reasonable value.  ...  This model is not able, however, to capture long-term dependencies, i.e., dependencies longer than n. An alternative to this model is the probabilistic automaton.  ...  Contrary to the non-probabilistic case, non-deterministic probabilistic automata have greater expressive power than deterministic ones.  ... 
doi:10.1007/978-3-540-89689-0_12 fatcat:perwz65gsrbmflpk6hsnyi7o6i

Spectral learning of weighted automata

Borja Balle, Xavier Carreras, Franco M. Luque, Ariadna Quattoni
2013 Machine Learning  
In addition, our algorithm overcomes some of the shortcomings of previous work and is able to learn from statistics of substrings.  ...  Most of these algorithms avoid the known hardness results by defining parameters beyond the number of states that can be used to quantify the complexity of learning automata under a particular distribution  ...  Acknowledgements We are grateful to the anonymous reviewers for providing us with helpful comments. This work was supported by a Google Research Award, and by projects XLike (FP7-288342), BASMATI  ... 
doi:10.1007/s10994-013-5416-x fatcat:gdkrhg3qpvcvzchkbwuw62j6ja

A Revision of Coding Theory for Learning from Language

2004 Electronic Notes in Theoretical Computer Science  
A derivation [1] has shown that Zipf's law is met at least by strings of independently tossed letters and spaces. [19] reports a change in the law's exponent from −1 to −3 for ranks ≈ 10⁴, which  ...  Generating the full non-finitary process is a different task from short-term prediction of it for discrete classification only [33].  ...  How to classify the future and generate the full process? Short-term prediction capabilities are necessary for improving discrete linguistic classification of non-discrete acoustic percepts [27].  ... 
doi:10.1016/s1571-0661(05)82574-5 fatcat:t4r6hquy6zc43ew5v7hx5eucey
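The claim that Zipf's law already holds for strings of independently tossed letters and spaces (the "random typing" model) is easy to check empirically; a minimal sketch, with alphabet size, space probability, and seed chosen arbitrarily for illustration:

```python
import random
from collections import Counter

def random_typing_words(n_chars, alphabet="abc", p_space=0.2, seed=0):
    """Generate text by independently tossing letters and spaces,
    then split on spaces into 'words' -- the random-typing model
    under which Zipf-like rank/frequency curves arise without any
    linguistic structure."""
    rng = random.Random(seed)
    chars = [" " if rng.random() < p_space else rng.choice(alphabet)
             for _ in range(n_chars)]
    return "".join(chars).split()

# Rank/frequency profile: counts sorted in decreasing order of rank.
freqs = Counter(random_typing_words(200_000))
ranked = [n for _, n in freqs.most_common()]
```

Plotting `ranked` against rank on log-log axes shows the roughly power-law decay the snippet refers to; the exact exponent depends on the alphabet size and space probability.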