Inference in the FO(C) Modelling Language
[article]
2014
arXiv
pre-print
Up to this point, no systems exist that perform inference on FO(C), and very little is known about properties of inference in FO(C). In this paper, we study both of the above problems. ...
We implemented a prototype of this transformation, and thus present the first system to perform inference in FO(C). We also provide results about the complexity of reasoning in FO(C). ...
We call this integration FO(C). FO(C) fits in the FO(·) research project (Denecker 2012), which aims at integrating expressive language constructs with a Tarskian model semantics in a unified language ...
arXiv:1404.6368v1
fatcat:6v3vqex3rjhqboob75sayojot4
The L+C Plant-Modelling Language
[chapter]
2007
Functional-Structural Plant Modelling in Crop Production
We implemented this transformation and hence, created the first system that performs inference in FO(C). We also provide results about the complexity of reasoning in FO(C). ...
Up to this point, no systems exist that perform inference on FO(C), and very little is known about properties of inference in FO(C). In this paper, we study both of the above problems. ...
Definition 3.12. Let Δ be a causal theory in NestNF and let C be one of the C_i in Definition 3.5; then we call ϕ_i (again, from ...
doi:10.1007/1-4020-6034-3_3
fatcat:wjjuqnvl45bsrbbluee6io35qe
Predicate Logic as a Modelling Language: The IDP System
[article]
2018
arXiv
pre-print
In this paper, we present the language and system. ...
... of inference. ...
FO(ID,AGG,PF,T), the Formal Base Language: In this section, we introduce the logic that is the basis of the IDP language. ...
arXiv:1401.6312v3
fatcat:fldt2evvpvbufhgz4bp47pp2em
OAG-BERT: Pre-train Heterogeneous Entity-augmented Academic Language Models
[article]
2021
arXiv
pre-print
Enriching language models with domain knowledge is crucial but difficult. ...
Based on the world's largest public academic graph Open Academic Graph (OAG), we pre-train an academic language model, namely OAG-BERT, which integrates massive heterogeneous entities including paper, ...
... (w_1, w_2, ..., w_{|C|}), where |C| is the entity length and w_i is the i-th token in the entity. ...
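A minimal sketch of the entity-as-token-sequence view in the snippet above: an entity C is a sequence (w_1, ..., w_{|C|}), and one common pretraining move is to mask the entity as a whole unit. The helper names and the whole-entity masking scheme are illustrative assumptions, not necessarily OAG-BERT's exact procedure.

def entity_tokens(entity):
    # Tokenize an entity C into (w_1, ..., w_|C|); whitespace split is an
    # illustrative stand-in for the model's real subword tokenizer.
    return entity.split()

def whole_entity_mask(tokens, mask_token="[MASK]"):
    # Mask every token of the entity so the model must recover it from
    # context (an assumed entity-masking scheme, hypothetical here).
    return [mask_token] * len(tokens)

C = entity_tokens("natural language processing")
print(len(C), C[1])          # |C| = 3, w_2 = 'language'
print(whole_entity_mask(C))  # ['[MASK]', '[MASK]', '[MASK]']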
arXiv:2103.02410v2
fatcat:ba6za5nfnjawvovsl6bdvcdwoi
Boolean models and infinitary first order languages
1973
Annals of Mathematical Logic
The paper develops a systematic use of boolean models in the model theory of infinitary languages. This yields a notion of (boolean) saturated model for denumerable sublanguages of L_{ω₁ω}. ...
The methods of saturated models are then applied; in particular, results of "upward Löwenheim-Skolem" type and results relating syntactic and semantic properties are obtained, ...
We say that we relativize the above notions to a predicate G if we replace in their definitions the language ℒ by the language ℒ_G, which contains in addition the predicate G. ...
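For readers unfamiliar with relativization, the standard textbook definition (not necessarily the paper's exact formulation) restricts quantifiers to the predicate G:

\begin{align*}
\varphi^G &= \varphi \quad\text{for atomic } \varphi,\\
(\neg\varphi)^G &= \neg\,\varphi^G,\qquad (\varphi \land \psi)^G = \varphi^G \land \psi^G,\\
(\exists x\,\varphi)^G &= \exists x\,\bigl(G(x) \land \varphi^G\bigr),\\
(\forall x\,\varphi)^G &= \forall x\,\bigl(G(x) \rightarrow \varphi^G\bigr).
\end{align*}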
doi:10.1016/0003-4843(73)90003-x
fatcat:ktgilv3bhff63f7ubo2jfwa43q
Predicate logic as a modeling language: modeling and solving some machine learning and data mining problems with IDP3
2014
Theory and Practice of Logic Programming
It offers its users a modeling language that is a slight extension of predicate logic and allows them to solve a wide range of search problems. ...
These research areas have recently shown a strong interest in declarative modeling and constraint-solving as opposed to algorithmic approaches. ...
Acknowledgements Caroline Macé and Tara Andrews introduced some of the authors to stemmatology and provided the data sets; Tara also explained the working of the procedural code. ...
doi:10.1017/s147106841400009x
fatcat:txwnmq27w5abpjdbamvi6qmkm4
Fertility models for statistical natural language understanding
1997
Proceedings of the 35th Annual Meeting of the Association for Computational Linguistics
The basic underlying intuition is that a single concept may be expressed in English as many disjoint clumps of words. We present two fertility models which attempt to capture this phenomenon. ...
Several recent efforts in statistical natural language understanding (NLU) have focused on generating clumps of English words from semantic meaning concepts (Miller et al., 1995; Levin and Pieraccini, ...
The views and conclusions contained in this document should not be interpreted as representing the official policies of the U.S. Government. ...
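To make the "disjoint clumps" intuition from this abstract concrete, here is a small illustrative alignment (hypothetical data and helper names, not drawn from the paper): a single concept generates two word clumps separated by material from another concept, and its fertility is the number of clumps it generates.

sentence = "show me the morning flights to Denver".split()

# Concept -> list of (start, end) token spans; LIST_FLIGHTS surfaces as two
# disjoint clumps ("show me" and "flights") interrupted by the TIME clump.
alignment = {
    "LIST_FLIGHTS": [(0, 2), (4, 5)],   # "show me" ... "flights"
    "TIME":         [(2, 4)],           # "the morning"
    "DEST":         [(5, 7)],           # "to Denver"
}

clumps = {c: [" ".join(sentence[s:e]) for s, e in spans]
          for c, spans in alignment.items()}
print(clumps["LIST_FLIGHTS"])  # ['show me', 'flights']: two disjoint clumps

# Fertility of a concept = how many clumps it generated in this sentence.
fertility = {c: len(spans) for c, spans in alignment.items()}
print(fertility)  # {'LIST_FLIGHTS': 2, 'TIME': 1, 'DEST': 1}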
doi:10.3115/976909.979639
dblp:conf/acl/PietraERW97
fatcat:2yiubeuyxfcmfidpyyfbvqwxmy
Generating Training Data with Language Models: Towards Zero-Shot Language Understanding
[article]
2022
arXiv
pre-print
Pretrained language models (PLMs) have demonstrated remarkable performance in various natural language processing tasks: Unidirectional PLMs (e.g., GPT) are well known for their superior text generation ...
capabilities; bidirectional PLMs (e.g., BERT) have been the prominent choice for natural language understanding (NLU) tasks. ...
... 2020), especially on challenging tasks like natural language inference (NLI). ...
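A hedged sketch of the general recipe this abstract describes: prompt a unidirectional PLM with a label-conditioned prefix and keep the generations as synthetic (text, label) pairs. The model choice, prompts, and sampling settings below are illustrative assumptions, not the paper's configuration.

from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")  # illustrative model

label_prompts = {
    "positive": "Write a positive movie review: ",
    "negative": "Write a negative movie review: ",
}

synthetic = []
for label, prompt in label_prompts.items():
    for out in generator(prompt, max_new_tokens=40, do_sample=True,
                         num_return_sequences=2):
        # Strip the prompt; keep the generation as a labeled training example.
        synthetic.append((out["generated_text"][len(prompt):].strip(), label))

print(len(synthetic))  # (text, label) pairs for training a small NLU model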
arXiv:2202.04538v1
fatcat:eqlgcsotwre67pitwxqxjmru5e
Neural Random Projections for Language Modelling
[article]
2018
arXiv
pre-print
In this paper, we exploit the sparsity in natural language even further by encoding each unique input word using a fixed sparse random representation. ...
Neural network-based language models deal with data sparsity problems by mapping the large discrete space of words into a smaller continuous space of real-valued vectors. ...
The idea of using embeddings in language modelling is explored in the early work of Bengio et al. ...
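A minimal numpy sketch of a fixed sparse random word representation in the spirit this abstract describes (the dimensionality, sparsity level, and hashing trick below are assumptions, not the paper's exact construction): each word gets a frozen D-dimensional vector with k nonzero ±1 entries, so distinct words are nearly orthogonal.

import zlib
import numpy as np

D, k = 1000, 8   # dimensionality and nonzeros per word (illustrative values)

def sparse_random_vector(word):
    # Seed from a stable hash so a word always maps to the same fixed vector.
    rng = np.random.default_rng(zlib.crc32(word.encode()))
    v = np.zeros(D)
    idx = rng.choice(D, size=k, replace=False)   # k random coordinates
    v[idx] = rng.choice([-1.0, 1.0], size=k)     # random signs
    return v

v1, v2 = sparse_random_vector("language"), sparse_random_vector("model")
print(np.count_nonzero(v1), float(v1 @ v2))  # exactly k nonzeros; dot near 0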
arXiv:1807.00930v4
fatcat:jzmciwx73jdctiwkfxzok3p3ty
On the Sentence Embeddings from Pre-trained Language Models
[article]
2020
arXiv
pre-print
However, the sentence embeddings from the pre-trained language models without fine-tuning have been found to poorly capture semantic meaning of sentences. ...
We first reveal the theoretical connection between the masked language model pre-training objective and the semantic similarity task, and then analyze the BERT sentence embeddings empirically ...
Acknowledgments The authors would like to thank Jiangtao Feng, Wenxian Shi, Yuxuan Song, and anonymous reviewers for their helpful comments and suggestions on this paper. ...
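The baseline this abstract criticizes can be reproduced in a few lines: mean-pool BERT token states into a sentence vector. This is standard Hugging Face usage; the model name and pooling choice are common defaults rather than the paper's exact setup.

import torch
from transformers import AutoModel, AutoTokenizer

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def sentence_embedding(text):
    inputs = tok(text, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state   # (1, seq_len, 768)
    mask = inputs["attention_mask"].unsqueeze(-1)    # ignore padding tokens
    return (hidden * mask).sum(1) / mask.sum(1)      # masked mean pooling

e1 = sentence_embedding("A man is playing a guitar.")
e2 = sentence_embedding("The weather is cold today.")
print(torch.cosine_similarity(e1, e2))  # often high even for unrelated pairs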
arXiv:2011.05864v1
fatcat:q7appb75a5elfnofhz6cgjmmva
QA-GNN: Reasoning with Language Models and Knowledge Graphs for Question Answering
[article]
2021
arXiv
pre-print
We evaluate our model on QA benchmarks in the commonsense (CommonsenseQA, OpenBookQA) and biomedical (MedQA-USMLE) domains. ...
The problem of answering questions using knowledge from pre-trained language models (LMs) and knowledge graphs (KGs) presents two challenges: given a QA context (question and answer choice), methods need ...
Acknowledgment We thank Rok Sosic, Weihua Hu, Jing Huang, Michele Catasta, members of the Stanford SNAP, P-Lambda and NLP groups and Project MOWGLI team, as well as our anonymous reviewers for valuable ...
arXiv:2104.06378v4
fatcat:qytacnflg5d2lj7blaw2ueo5x4
textTOvec: Deep Contextualized Neural Autoregressive Topic Models of Language with Distributed Compositional Prior
[article]
2019
arXiv
pre-print
We address two challenges of probabilistic topic modelling in order to better estimate the probability of a word in a given context, i.e., P(word|context): (1) No Language Structure in Context: Probabilistic ...
The LSTM-LM learns a vector-space representation of each word by accounting for word order in local collocation patterns and models complex characteristics of language (e.g., syntax and semantics), while ...
The LSTM offers history for the i-th word via modeling temporal dependencies in the input sequence, c_i. ...
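A minimal PyTorch sketch of the LSTM-LM component described here: the hidden state at position i summarizes the history c_i, and a softmax over the vocabulary yields P(word | context). Sizes are illustrative, not the paper's.

import torch
import torch.nn as nn

V, E, H = 5000, 64, 128   # vocab, embedding, hidden sizes (assumed)

class LSTMLM(nn.Module):
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(V, E)
        self.lstm = nn.LSTM(E, H, batch_first=True)
        self.out = nn.Linear(H, V)

    def forward(self, tokens):                 # tokens: (batch, seq_len)
        h, _ = self.lstm(self.emb(tokens))     # h[:, i] encodes the prefix c_i
        return torch.log_softmax(self.out(h), dim=-1)  # log P(w_{i+1} | c_i)

lm = LSTMLM()
print(lm(torch.randint(0, V, (1, 10))).shape)  # (1, 10, V): next-word dists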
arXiv:1810.03947v4
fatcat:4mobcineg5bu5c266bonq4n5xa
The Genesis of Spanish /θ/: A Revised Model
2022
Languages
This article proposes a revised model of the genesis of Castilian Spanish /θ/, based on (i) precise tracking across the Late Middle Ages of the orthographical d → z change in preconsonantal coda position ...
This effectively inverts the normally assumed chronology, according to which devoicing preceded and indeed was implicated in the genesis of /θ/. ...
For example, prior ... 'kiss' would have been distinguishe... articulation, the ç of beço being voic... conventional chronology is poorly ... associated teleological explanation fo... basis. ...
doi:10.3390/languages7030191
fatcat:66quzzuw5jdsth3ve4gv3up3wy
The Status of Information Processing Models of Language
1981
Philosophical Transactions of the Royal Society of London. Biological Sciences
An introduction is given to the nature of information processing models in psychology. ...
of language and for phenomena of memory for language materials over short time intervals. ...
Note that the format itself cannot be falsified. ...
doi:10.1098/rstb.1981.0147
fatcat:e4i46usqznfvhoj23ddrn7hsqm
On Tree-Based Neural Sentence Modeling
2018
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
To study the effectiveness of different tree structures, we replace the parsing trees with trivial trees (i.e., binary balanced tree, left-branching tree and right-branching tree) in the encoders. ...
Further analysis shows that tree modeling gives better results when crucial words are closer to the final representation. ...
Acknowledgements We thank Hang Li, Yue Zhang, Lili Mou and Jiayuan Mao for their helpful comments on this work, and the anonymous reviewers for their valuable feedback. ...
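The three "trivial" trees named in this entry's snippet are easy to construct; these helper functions are our own illustration, not the paper's code.

def left_branching(tokens):
    tree = tokens[0]
    for tok in tokens[1:]:
        tree = (tree, tok)        # ((('a','b'),'c'),'d')
    return tree

def right_branching(tokens):
    tree = tokens[-1]
    for tok in reversed(tokens[:-1]):
        tree = (tok, tree)        # ('a',('b',('c','d')))
    return tree

def balanced(tokens):
    # Binary balanced tree: split the span in half recursively.
    if len(tokens) == 1:
        return tokens[0]
    mid = len(tokens) // 2
    return (balanced(tokens[:mid]), balanced(tokens[mid:]))

print(balanced(["a", "b", "c", "d"]))  # (('a','b'),('c','d'))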
doi:10.18653/v1/d18-1492
dblp:conf/emnlp/ShiZCL18
fatcat:a2hiqrn7szce7d4azhwu3egkyy