Abductive Knowledge Induction From Raw Data

Wang-Zhou Dai, Stephen H. Muggleton
2021, arXiv pre-print
For many reasoning-heavy tasks involving raw inputs, it is challenging to design an appropriate end-to-end learning pipeline. Neuro-symbolic learning divides the process into sub-symbolic perception and symbolic reasoning, aiming to exploit data-driven machine learning and knowledge-driven reasoning simultaneously. However, such methods suffer from the exponential computational complexity of the interface between the two components, where the sub-symbolic learning model lacks direct supervision and the symbolic model lacks accurate input facts. Hence, most of them assume the existence of a strong symbolic knowledge base and only learn the perception model, avoiding a crucial problem: where does the knowledge come from? In this paper, we present Abductive Meta-Interpretive Learning (Meta_Abd), which unites abduction and induction to jointly learn neural networks and induce logic theories from raw data. Experimental results demonstrate that Meta_Abd not only outperforms the compared systems in predictive accuracy and data efficiency but also induces logic programs that can be re-used as background knowledge in subsequent learning tasks. To the best of our knowledge, Meta_Abd is the first system that can jointly learn neural networks from scratch and induce recursive first-order logic theories with predicate invention.
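To make the abstract's interface problem concrete, the following is a minimal toy sketch (not the authors' implementation, which combines a neural network with Prolog-based meta-interpretive learning) of the abduction step: a perception model outputs label scores for raw inputs, and the symbolic side abduces the most probable label assignment consistent with background knowledge. All names here (`abduce_labels`, the sum constraint, the score tables standing in for a neural net) are illustrative assumptions.

```python
from itertools import product

# Toy "perception model": a score table per raw input, standing in
# for a neural network's softmax output (hypothetical stand-in).
SYMBOLS = [0, 1, 2]

def abduce_labels(scores, observed_sum):
    """Abduce the most probable label assignment that is consistent
    with the background knowledge: sum(labels) == observed_sum."""
    best, best_p = None, -1.0
    for labels in product(SYMBOLS, repeat=len(scores)):
        if sum(labels) != observed_sum:
            continue  # reject assignments the knowledge base refutes
        p = 1.0
        for score, label in zip(scores, labels):
            p *= score[label]
        if p > best_p:
            best, best_p = labels, p
    return best

# Two raw inputs whose (unobserved) labels must sum to 3.
uniform = {s: 1.0 / len(SYMBOLS) for s in SYMBOLS}
scores = [dict(uniform), dict(uniform)]
scores[0][1] = 0.9  # perception is fairly confident the first input is a 1
print(abduce_labels(scores, 3))  # → (1, 2)
```

The abduced labels can then serve as pseudo-supervision to train the perception model, closing the loop between the sub-symbolic and symbolic components described above.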
arXiv:2010.03514v2