
A bottom-up efficient algorithm learning substitutable languages from positive examples

François Coste, Gaëlle Garet, Jacques Nicolas
2014 International Conference on Grammatical Inference  
We present here an efficient learning algorithm, motivated by intelligibility and parsing efficiency of the result, which directly reduces the positive sample into a small non-redundant canonical grammar  ...  by a generalization based on local substitutability.  ...  Acknowledgments This work benefited from the support of the French Government, managed by the National Research Agency, under the investment expenditure programme IDEALG ANR-10-BTBR-04.  ...
dblp:conf/icgi/CosteGN14 fatcat:l4hzr7rh5nc2dl6skjfzfkm5vq

On Learning Constraint Problems

Arnaud Lallouet, Matthieu Lopez, Lionel Martin, Christel Vrain
2010 2010 22nd IEEE International Conference on Tools with Artificial Intelligence  
On this particular relational learning problem, traditional top-down search methods fall into blind search and bottom-up search methods incur overly expensive coverage tests.  ...  The model is expressed in a middle-level modeling language.  ...  The algorithm iterates a single rule learning algorithm until positive examples are correctly discriminated from all negative ones.  ...
doi:10.1109/ictai.2010.16 dblp:conf/ictai/LallouetLMV10 fatcat:vrvdkjnplbc4znicihlctqex7e

Learning local substitutable context-free languages from positive examples in polynomial time and data by reduction

François Coste, Jacques Nicolas
2018 International Conference on Grammatical Inference  
This enables us to show that local substitutable languages represented by RNF context-free grammars are identifiable in polynomial time and thick data (IPTtD) from positive examples.  ...  To study more formally the approach by reduction initiated by ReGLiS, we propose a formal characterization of the grammars in reduced normal form (RNF) which can be learned by this approach.  ...  Efficient reduction We detail here how k, l-local substitutable languages are learnt efficiently by reduction from a training sample S in Algorithm 1.  ... 
dblp:conf/icgi/CosteN18 fatcat:c67wcycl6vg4rcfi2nevx52744

Stochastic Propositionalization for Efficient Multi-relational Learning [chapter]

N. Di Mauro, T. M. A. Basile, S. Ferilli, F. Esposito
2008 Lecture Notes in Computer Science  
We propose a population-based algorithm that, using a stochastic propositional method, efficiently learns complete FOL definitions.  ...  and on the coverage test assessing the validity of the learned theory against the training examples.  ...  The proposed method is a population-based algorithm that stochastically propositionalizes the training examples, in which the learning phase may be viewed as a bottom-up search in the hypotheses space.  ...
doi:10.1007/978-3-540-68123-6_8 fatcat:kzy3x5jsmrc4vfkzgglhad3zeq

ProGolem: A System Based on Relative Minimal Generalisation [chapter]

Stephen Muggleton, José Santos, Alireza Tamaddoni-Nezhad
2010 Lecture Notes in Computer Science  
In this paper we re-examine the use of bottom-up approaches to the construction of logic programs.  ...  An algorithm is described for constructing ARMGs and this has been implemented in an ILP system called ProGolem which combines bottom-clause construction in Progol with a Golem control strategy which uses  ...  The second author was supported by a Wellcome Trust Ph.D. scholarship. The third author was supported by the BBSRC grant BB/C519670/1.  ... 
doi:10.1007/978-3-642-13840-9_13 fatcat:4bv7lufphzfghepie46xgqqtmy
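ProGolem's ARMGs build on Plotkin's classic least general generalisation (lgg). As a toy illustration of that underlying operation only, not of ProGolem's ARMG algorithm itself (the tuple term representation and variable naming here are my own assumptions):

```python
# Toy sketch of Plotkin's least general generalisation (lgg) of two
# first-order terms, the operation underlying Golem/ProGolem-style
# bottom-up generalisation. Terms are tuples (functor, *args); plain
# strings are constants. This handles only terms, not clauses with
# background knowledge.

def lgg(t1, t2, subst=None):
    if subst is None:
        subst = {}
    if t1 == t2:
        return t1
    if (isinstance(t1, tuple) and isinstance(t2, tuple)
            and t1[0] == t2[0] and len(t1) == len(t2)):
        # Same functor and arity: generalise argument-wise.
        return (t1[0],) + tuple(lgg(a, b, subst) for a, b in zip(t1[1:], t2[1:]))
    # Mismatch: replace the pair by a variable, reusing the same
    # variable for repeated occurrences of the same pair of subterms.
    if (t1, t2) not in subst:
        subst[(t1, t2)] = f"V{len(subst)}"
    return subst[(t1, t2)]

# Repeated mismatching pairs map to the same variable:
print(lgg(("f", "a", "a"), ("f", "b", "b")))  # ('f', 'V0', 'V0')
```

Reusing one variable per distinct pair of subterms is what makes the result *least* general: `f(V0, V0)` is strictly more specific than `f(V0, V1)` while still covering both inputs.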

Phase transition and heuristic search in relational learning

Erick Alphonse, Aomar Osmani
2007 Sixth International Conference on Machine Learning and Applications (ICMLA 2007)  
It is argued that this phase transition dooms every learning algorithm to fail to identify a target concept lying close to it.  ...  However, in this paper we exhibit a counter-example which shows that this conclusion must be qualified in the general case.  ...  For efficiency reasons, and without loss of generality, we do not choose a positive example as seed.  ... 
doi:10.1109/icmla.2007.102 dblp:conf/icmla/AlphonseO07 fatcat:kqt2aveudjb7nkqvzoftt2r6ve

Usable Scalable Learning Over Relational Data With Automatic Language Bias [article]

Jose Picado, Arash Termehchy, Sudhanshu Pathak, Alan Fern, Praveen Ilango, Yunqiao Cai
2020 arXiv   pre-print
We show that AutoBias delivers the same accuracy as using manually-written language bias while imposing only a slight overhead on the running time of the learning algorithm.  ...  In order to constrain the search through the large space of candidate definitions, users must tune the algorithm by specifying a language bias.  ...  In this section, we explain this algorithm and how it uses language bias to learn efficiently.  ...
arXiv:1710.01420v2 fatcat:bz3ipajcu5b5ncywor5o3xqfym

Meta-interpretive learning: application to grammatical inference

Stephen H. Muggleton, Dianhuan Lin, Niels Pahlavi, Alireza Tamaddoni-Nezhad
2013 Machine Learning  
We show that the approach is sufficiently flexible to support learning of Context-Free grammars from positive and negative examples, a problem shown to be theoretically possible by E.M.  ...  Once again, Metagol RCF runs up to 100 times faster than Metagol CF on grammars chosen randomly from Regular and non-Regular Context-Free grammars.  ...  By contrast, since the 1950s automaton-based learning algorithms have existed [11] which inductively infer Regular languages, such as Parity, from positive and negative examples.  ...
doi:10.1007/s10994-013-5358-3 fatcat:xokqzageqbfx3g2evyw33lys4m

Empirical Study of Relational Learning Algorithms in the Phase Transition Framework [chapter]

Erick Alphonse, Aomar Osmani
2009 Lecture Notes in Computer Science  
Relational Learning (RL) has aroused interest as a way to fill the gap between efficient attribute-value learners and growing applications stored in multi-relational databases.  ...  able to solve real-world applications up to millions of variables.  ...  Background: Relational Learning. In machine learning, we are given a learning set E = E⁺ ∪ E⁻, with positive and negative examples of the unknown target concept, drawn from an example language Lₑ,  ...
doi:10.1007/978-3-642-04180-8_21 fatcat:mnylhzpnpnf3xlqidvpi27ca6a

Learning in Clausal Logic: A Perspective on Inductive Logic Programming [chapter]

Peter Flach, Nada Lavrač
2002 Lecture Notes in Computer Science  
Inductive logic programming is a form of machine learning from examples which employs the representation formalism of clausal logic.  ...  On the other hand, significant advances have been made regarding dealing with noisy data, efficient heuristic and stochastic search methods, the use of logical representations going beyond definite clauses  ...  Part of the material in Sections 2-4 is based on a tutorial given by the first author at the First International Conference on Computational Logic (CL-2000).  ... 
doi:10.1007/3-540-45628-7_17 fatcat:i24bt6wv65crjjfybhcm7uwda4

Learning Over Dirty Data Without Cleaning [article]

Jose Picado, John Davis, Arash Termehchy, Ga Young Lee
2020 arXiv   pre-print
We propose DLearn, a novel relational learning system that learns directly over dirty databases effectively and efficiently without any preprocessing.  ...  Our empirical study indicates that DLearn learns accurate models over large real-world databases efficiently.  ...  It follows the approach used in the bottom-up relational learning algorithms [42, 44-46]. In this approach, the LearnClause function in Algorithm 1 has two steps.  ...
arXiv:2004.02308v1 fatcat:j6qvo573dngt7by66cjaoss2su

Schema Independent Relational Learning [article]

Jose Picado, Arash Termehchy, Alan Fern, Parisa Ataei
2017 arXiv   pre-print
We study both sample-based learning algorithms, which learn from sets of labeled examples, and query-based algorithms, which learn by asking queries to an oracle.  ...  Relational learning algorithms learn the definition of a new relation in terms of existing relations in the database.  ...  Furthermore, a relational learning algorithm that learns only safe clauses can learn a definition from positive examples only.  ... 
arXiv:1508.03846v2 fatcat:kmvaoup7lzbhvhlsbug2ophh4q

Bottom-Up Learning of Logic Programs for Information Extraction from Hypertext Documents [chapter]

Bernd Thomas
2003 Lecture Notes in Computer Science  
BFOIL learns from positive examples only and uses a logical representation for hypertext documents based on the document object model (DOM).  ...  We present an inductive logic programming bottom-up learning algorithm (BFOIL) for synthesizing logic programs for multi-slot information extraction from hypertext documents.  ...  Furthermore let us assume that a logic program P L H is given that imple-  ...  BFOIL Algorithm: The central idea of BFOIL is to learn in a bottom-up fashion from positive examples only a set of rules by means  ...
doi:10.1007/978-3-540-39804-2_39 fatcat:s7caqfpxovd7fnlcqmx7zg2bv4

Separate-and-Conquer Rule Learning
Johannes Fürnkranz
2012 Artificial Intelligence Review  
This paper is a survey of inductive rule learning algorithms that use a separate-and-conquer strategy.  ...  We will put this wide variety of algorithms into a single framework and analyze them along three different dimensions, namely their search, language and overfitting avoidance biases.  ...  There are only a few propositional bottom-up separate-and-conquer learning algorithms.  ...
doi:10.1023/a:1006524209794 fatcat:v4ax4yqgvvbd7fxzj25fayflfq
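The separate-and-conquer (covering) strategy the survey above organizes is simple to state: learn one rule, remove the positives it covers, repeat. As a minimal sketch under my own assumptions, with a deliberately trivial placeholder `learn_rule` (real learners search a rule space guided by coverage heuristics), not any specific algorithm from the survey:

```python
# Minimal sketch of the generic separate-and-conquer (covering) loop.
# `learn_rule` is a placeholder that "learns" a rule covering exactly
# one remaining positive example; real systems search for rules that
# cover many positives and no (or few) negatives.

def learn_rule(positives, negatives):
    target = next(iter(positives))
    return lambda x: x == target

def separate_and_conquer(positives, negatives):
    rules = []
    remaining = set(positives)
    while remaining:                      # "conquer": keep learning rules
        rule = learn_rule(remaining, negatives)
        covered = {x for x in remaining if rule(x)}
        if not covered:                   # guard against an infinite loop
            break
        rules.append(rule)
        remaining -= covered              # "separate": drop covered positives
    return rules

rules = separate_and_conquer({1, 2, 3}, {4, 5})
print(len(rules))  # one rule per positive with this placeholder learner
```

With a stronger `learn_rule`, each iteration would cover many positives at once, shrinking the rule set; the outer loop is identical across the algorithms the survey compares.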

Complexity in Language Acquisition

Alexander Clark, Shalom Lappin
2013 Topics in Cognitive Science  
Second, we claim that the real problem for language learning is the computational complexity of constructing a hypothesis from input data.  ...  We argue first that these arguments, and analyses based on them, suffer from a major flaw: they systematically conflate the hypothesis class and the learnable concept class.  ...  For a less artificial example, consider the simple learner described in Clark and Eyraud (2007). This learns the class of substitutable context-free grammars (CFGs) from positive data alone.  ...
doi:10.1111/tops.12001 pmid:23335575 fatcat:dzq6bckz6rfofek6bio5aifbxe
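The substitutability idea behind the Clark and Eyraud learner cited above is that two substrings occurring in the same context in the positive sample are conjectured to be interchangeable. A toy sketch of that core step only, computing substitutability classes rather than the full grammar (the representation and helper names are my own assumptions):

```python
# Toy sketch of weak substitutability: substrings that share at least
# one context (left, right) in the positive sample are merged into one
# class, via union-find. Producing the grammar from these classes is a
# further step not shown here.

from collections import defaultdict

def contexts(sample):
    ctx = defaultdict(set)  # substring -> set of (left, right) contexts
    for w in sample:
        for i in range(len(w)):
            for j in range(i + 1, len(w) + 1):
                ctx[w[i:j]].add((w[:i], w[j:]))
    return ctx

def substitutability_classes(sample):
    ctx = contexts(sample)
    parent = {s: s for s in ctx}          # union-find over substrings

    def find(s):
        while parent[s] != s:
            parent[s] = parent[parent[s]]
            s = parent[s]
        return s

    by_context = defaultdict(list)
    for s, cs in ctx.items():
        for c in cs:
            by_context[c].append(s)
    for group in by_context.values():     # merge substrings sharing a context
        for a, b in zip(group, group[1:]):
            parent[find(a)] = find(b)
    classes = defaultdict(set)
    for s in ctx:
        classes[find(s)].add(s)
    return list(classes.values())

# "ab" and "aabb" both occur in the empty context ("", ""), so they land
# in the same class, suggesting a rule shape like S -> a S b | a b.
cls = substitutability_classes(["ab", "aabb"])
print(any({"ab", "aabb"} <= c for c in cls))  # True
```

The polynomial-time guarantees in the papers above come from restricting attention to language classes where this local merging is provably safe; on arbitrary samples it overgeneralizes.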
Showing results 1 — 15 out of 22,593 results