A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2020; you can also visit the original URL.
The file type is application/pdf.
Deep Adaptive Semantic Logic (DASL): Compiling Declarative Knowledge into Deep Neural Networks
[article]
2020
arXiv
pre-print
We introduce Deep Adaptive Semantic Logic (DASL), a novel framework for automating the generation of deep neural networks that incorporates user-provided formal knowledge to improve learning from data. ...
We then evaluate DASL on a visual relationship detection task and demonstrate that the addition of commonsense knowledge improves performance by 10.7% in a data scarce setting. ...
We demonstrated a 1000-fold reduction in data requirements on the MNIST digit classification task by using declarative knowledge in the form of arithmetic relations satisfied by unlabeled image triplets ...
arXiv:2003.07344v1
fatcat:h3qpmxsodvg53l3zq2you3wvym
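The MNIST result in this record hinges on turning an arithmetic relation over unlabeled image triplets into a training signal. As a rough sketch (not DASL's actual logic compiler), one can penalise classifier outputs whose expected digit values violate a + b = c; the differentiable surrogate loss and the near-one-hot `confident` helper below are illustrative assumptions.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def arithmetic_constraint_loss(logits_a, logits_b, logits_c):
    """Penalise digit predictions (classes 0-9) that violate a + b = c,
    using expected digit values as a differentiable surrogate."""
    digits = np.arange(10)
    ea = softmax(logits_a) @ digits
    eb = softmax(logits_b) @ digits
    ec = softmax(logits_c) @ digits
    return float((ea + eb - ec) ** 2)

# Near-one-hot logits for a given digit (illustrative helper).
confident = lambda d: np.eye(10)[d] * 50.0

# A triplet whose predictions already satisfy 2 + 3 = 5 incurs ~zero loss,
# so only constraint-violating predictions are pushed to change.
loss_ok = arithmetic_constraint_loss(confident(2), confident(3), confident(5))
loss_bad = arithmetic_constraint_loss(confident(2), confident(3), confident(9))
```

In a real semi-supervised setup this loss would be added to the labeled cross-entropy term and backpropagated through the shared classifier.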
A Probabilistic Model for Discriminative and Neuro-Symbolic Semi-Supervised Learning
[article]
2021
arXiv
pre-print
We extend the discriminative model to neuro-symbolic SSL, where label features satisfy logical rules, by showing such rules relate directly to the above prior, thus justifying a family of methods that ...
link statistical learning and logical reasoning, and unifying them with regular SSL. ...
Carl Allen and Ivana Balažević were supported by the Centre for Doctoral Training in Data Science, funded by EPSRC (grant EP/L016427/1) and the University of Edinburgh. ...
arXiv:2006.05896v4
fatcat:uw67pbn6qbc33efiswif52wzdy
Learning from Explanations with Neural Execution Tree
[article]
2020
arXiv
pre-print
While deep neural networks have achieved impressive performance on a range of NLP tasks, these data-hungry models heavily rely on labeled data, which restricts their applications in scenarios where data ...
After transforming NL explanations into executable logical forms by semantic parsing, NExT generalizes different types of actions specified by the logical forms for labeling data instances, which substantially ...
However, the method is very limited in that it is not able to use unlabeled data. ...
arXiv:1911.01352v3
fatcat:fqbntowaj5addj4h3o6fasbykm
Spam Filtering based on Knowledge Transfer Learning
2015
International Journal of Security and Its Applications
The method uses unlabeled spam data from other users or domains to enhance the adaptivity and robustness of the anti-spam system. ...
A transfer learning model can use untagged data, migrate knowledge between different filter models, and improve the active collaboration of the filters. ...
doi:10.14257/ijsia.2015.9.10.31
fatcat:woxtawexdfddnpo5necjptsbdq
Semi-Supervised Online Structure Learning for Composite Event Recognition
2019
Zenodo
In order to adapt graph-cut minimisation to first order logic, we employ a suitable structural distance for measuring the distance between sets of logical atoms. ...
Online structure learning approaches, such as those stemming from Statistical Relational Learning, enable the discovery of complex relations in noisy data streams. ...
We would also like to thank Nikos Katzouris for providing assistance on the distance functions for first-order logic and helping us run OLED. ...
doi:10.5281/zenodo.2541609
fatcat:c66mizfs7vhyrallglux2isguq
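This record (and its two sibling versions below) relies on a structural distance between sets of logical atoms so that graph-cut minimisation can operate over first-order representations. The paper's actual distance is not reproduced here; a minimal Jaccard-style stand-in over ground atoms, represented as plain strings for simplicity, illustrates the idea:

```python
def atom_set_distance(atoms_a, atoms_b):
    """Jaccard-style distance between two sets of ground logical atoms:
    1 - |intersection| / |union|, so 0 = identical and 1 = disjoint."""
    a, b = set(atoms_a), set(atoms_b)
    if not a and not b:
        return 0.0
    return 1.0 - len(a & b) / len(a | b)

# Two event-recognition states sharing one of three distinct atoms.
d = atom_set_distance(
    {"happensAt(walking(p1), 5)", "holdsAt(close(p1, p2), 5)"},
    {"happensAt(walking(p1), 5)", "holdsAt(moving(p1, p2), 5)"},
)
```

A distance like this lets graph-based semi-supervised methods place edges between "nearby" labeled and unlabeled examples even when the examples are logical interpretations rather than feature vectors.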
Semi-supervised online structure learning for composite event recognition
2019
Machine Learning
In order to adapt graph-cut minimisation to first order logic, we employ a suitable structural distance for measuring the distance between sets of logical atoms. ...
Online structure learning approaches, such as those stemming from statistical relational learning, enable the discovery of complex relations in noisy data streams. ...
We would also like to thank Nikos Katzouris for providing assistance on the distance functions for first-order logic and helping us run OLED. ...
doi:10.1007/s10994-019-05794-2
fatcat:3r4wpagf45babnlemh67ob772y
Semi-Supervised Online Structure Learning for Composite Event Recognition
[article]
2018
arXiv
pre-print
In order to adapt graph-cut minimisation to first order logic, we employ a suitable structural distance for measuring the distance between sets of logical atoms. ...
Online structure learning approaches, such as those stemming from Statistical Relational Learning, enable the discovery of complex relations in noisy data streams. ...
We would also like to thank Nikos Katzouris for providing assistance on the distance functions for first-order logic and helping us run OLED. ...
arXiv:1803.00546v1
fatcat:iay7pahrmnet7exyvbwbiulzwy
Semantic Parsing with Dual Learning
2019
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
In this work, we develop a semantic parsing framework with the dual learning algorithm, which enables a semantic parser to make full use of data (labeled and even unlabeled) through a dual-learning game ...
This game between a primal model (semantic parsing) and a dual model (logical form to query) forces them to regularize each other, and can achieve feedback signals from some prior-knowledge. ...
Training and decoding: We individually pre-train the Q2LF/LF2Q models using only labeled data and the language model LM_q using both labeled and unlabeled queries. ...
doi:10.18653/v1/p19-1007
dblp:conf/acl/CaoZLLY19
fatcat:tbsrr24ij5exphjctkmwwwf4l4
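The dual-learning game above rewards the primal parser (query to logical form) and the dual generator (logical form to query) when they are mutually consistent on unlabeled queries. The actual models are neural; the lookup tables below are a toy stand-in purely to show the round-trip reward signal, with the Q2LF/LF2Q names taken from the snippet:

```python
# Toy dual-learning round trip: reward 1.0 when the dual model reconstructs
# the original query from the primal model's logical form, else 0.0.
# (Lookup tables stand in for the neural Q2LF/LF2Q models.)
Q2LF = {"flights from boston": "lambda $0 e (and (flight $0) (from $0 boston))"}
LF2Q = {v: k for k, v in Q2LF.items()}

def round_trip_reward(query):
    lf = Q2LF.get(query)
    if lf is None:
        return 0.0  # primal model failed to parse
    return 1.0 if LF2Q.get(lf) == query else 0.0
```

In the paper's setting this reconstruction signal (plus validity checks from prior knowledge) is what lets unlabeled queries contribute gradient feedback to both models.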
DAPPER: Performance Estimation of Domain Adaptation in Mobile Sensing
[article]
2021
arXiv
pre-print
We present DAPPER (Domain AdaPtation Performance EstimatoR) that estimates the adaptation performance in a target domain with only unlabeled target data. ...
Our intuition is that the outputs of a model on the target data provide clues for the model's actual performance in the target domain. ...
We used this proxy risk to calculate an estimated accuracy. For the check model, we used the same model as the adaptation model and trained it on the unlabeled target validation data. ...
arXiv:2111.11053v1
fatcat:yhgdozpuqbd45bfzqwbuqk6ume
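DAPPER's proxy-risk idea estimates target-domain performance without labels by comparing the adapted model against an independently trained "check" model on unlabeled target data. A minimal sketch of that disagreement-based estimate (the exact computation in the paper may differ):

```python
import numpy as np

def estimated_accuracy(model_preds, check_preds):
    """Proxy-risk style estimate: treat the disagreement rate between the
    adapted model and a 'check' model on unlabeled target data as an
    estimated error, so estimated accuracy ~= their agreement rate."""
    model_preds = np.asarray(model_preds)
    check_preds = np.asarray(check_preds)
    return float((model_preds == check_preds).mean())

# One disagreement out of four unlabeled target samples.
acc_hat = estimated_accuracy([1, 0, 1, 1], [1, 0, 0, 1])
```

No target labels appear anywhere; the estimate uses only the two models' predictions on unlabeled target inputs.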
Improving context-aware query classification via adaptive self-training
2011
Proceedings of the 20th ACM international conference on Information and knowledge management - CIKM '11
We then adapt self-training with our model to exploit the information in unlabeled queries. ...
We first incorporate search contexts into our framework using a Conditional Random Field (CRF) model. ...
It uses labeled data to learn logical relationships between query terms and query categories, and unlabeled data to fully exploit the "syntactic" dependencies between query terms, so that more queries ...
doi:10.1145/2063576.2063598
dblp:conf/cikm/ChenSNC11
fatcat:6yvgsjbxcbf57jbkml4eq43ary
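The record above adapts self-training to exploit unlabeled queries. The paper's CRF-based, context-aware model is not reproduced here; a generic self-training loop with a nearest-centroid base classifier (an illustrative substitute) shows the core mechanism of pseudo-labeling confident unlabeled points and refitting:

```python
import numpy as np

def fit_centroids(X, y):
    """Nearest-centroid model: one mean vector per class."""
    classes = sorted(set(y))
    cents = {c: np.mean([x for x, lbl in zip(X, y) if lbl == c], axis=0)
             for c in classes}
    return classes, cents

def self_train(X_lab, y_lab, X_unlab, threshold=0.8, rounds=3):
    """Self-training sketch: fit on labeled data, pseudo-label unlabeled
    points whose confidence (softmax over negative centroid distances)
    clears a threshold, add them to the labeled set, and refit."""
    X_lab, y_lab, pool = list(X_lab), list(y_lab), list(X_unlab)
    for _ in range(rounds):
        classes, cents = fit_centroids(X_lab, y_lab)
        remaining = []
        for x in pool:
            dists = np.array([np.linalg.norm(np.asarray(x) - cents[c])
                              for c in classes])
            probs = np.exp(-dists) / np.exp(-dists).sum()
            if probs.max() >= threshold:
                X_lab.append(x)
                y_lab.append(classes[int(probs.argmax())])
            else:
                remaining.append(x)  # still too uncertain this round
        pool = remaining
    return fit_centroids(X_lab, y_lab)

def predict(classes, cents, x):
    return min(classes, key=lambda c: np.linalg.norm(np.asarray(x) - cents[c]))
```

The adaptive element in the paper (re-weighting pseudo-labels using search context) would replace the fixed `threshold` here.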
Transfer learning using computational intelligence: A survey
2015
Knowledge-Based Systems
In contrast to classical machine learning methods, transfer learning methods exploit the knowledge accumulated from data in auxiliary domains to facilitate predictive modeling consisting of different data ...
This paper systematically examines computational intelligence-based transfer learning techniques and clusters related technique developments into four main categories: a) neural network-based transfer ...
adaptation model was developed based on the use of a hierarchical Bayes prior [86]. ...
doi:10.1016/j.knosys.2015.01.010
fatcat:vu2ttscic5fq3nm2tkdid4fs64
ProbAnch: a Modular Probabilistic Anchoring Framework
2020
Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence
A practical approach that aims to model such representations is perceptual anchoring, which handles the problem of mapping sub-symbolic sensor data to symbols and maintains these mappings over time. ...
Modeling object representations derived from perceptual observations, in a way that is also semantically meaningful for humans as well as autonomous agents, is a prerequisite for joint human-agent understanding ...
Occluded objects are tracked via their relationship with observed objects using logical rules, i.e., the position of an occluded object is logically inferred through the position of the occluding object ...
doi:10.24963/ijcai.2020/771
dblp:conf/ijcai/PerssonMRL20
fatcat:cyfidvsyr5dmliakr5v4penkkq
StructVAE: Tree-structured Latent Variable Models for Semi-supervised Semantic Parsing
2018
Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
STRUCTVAE models latent MRs not observed in the unlabeled data as tree-structured latent variables. ...
Experiments on semantic parsing on the ATIS domain and Python code generation show that with extra unlabeled data, STRUCTVAE outperforms strong supervised models. ...
lacking enough unlabeled data. ...
doi:10.18653/v1/p18-1070
dblp:conf/acl/NeubigZYH18
fatcat:tfvfebqokve3dhszzhvreexkdu
A Survey on Computational Intelligence-based Transfer Learning
[article]
2022
arXiv
pre-print
Compared to traditional machine learning approaches, transfer learning approaches are capable of better modeling data patterns from the current domain. ...
However, vanilla TL can be further improved by using computational intelligence-based TL. ...
Their framework learns target tasks with a fuzzy inference system using limited unlabeled target data and labeled source data. In this framework, little or no unlabeled data is available. ...
arXiv:2206.10593v1
fatcat:n4bofmrgs5eidciu6b3p3gcxey
Refining Language Models with Compositional Explanations
[article]
2021
arXiv
pre-print
By parsing these explanations into executable logic rules, the human-specified refinement advice from a small set of explanations can be generalized to more training examples. ...
Further, the model is regularized to align the importance scores with human knowledge, so that the unintended model behaviors are eliminated. ...
In another relevant thread, unsupervised domain adaptation (UDA) looks to adapt a trained model to a new domain using unlabeled data from the target domain, by updating feature representations to minimize ...
arXiv:2103.10415v3
fatcat:g7qy7gpevzc2zbees74zt4folm
Showing results 1 — 15 out of 11,723 results