Regularized Least-Squares for Parse Ranking
[chapter]
2005
Lecture Notes in Computer Science
We present an adaptation of the Regularized Least-Squares algorithm for the rank learning problem and an application of the method to reranking of the parses produced by the Link Grammar (LG) dependency ...
Using a parse goodness function based on the F-score, we demonstrate that our method produces a statistically significant increase in rank correlation from 0.18 to 0.42 compared to the built-in ranking ...
Acknowledgments This work has been supported by Tekes, the Finnish National Technology Agency and we also thank CSC, the Finnish IT center for science for providing us extensive computing resources. ...
doi:10.1007/11552253_42
fatcat:o3xogl443neuln7uridcz6jeii
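Since the snippet above only names the approach, here is a minimal, hypothetical NumPy sketch of regularized least-squares regression onto a parse goodness target (such as an F-score-based value); the feature matrix and goodness values are synthetic placeholders, not the Link Grammar parse features used in the paper.

```python
import numpy as np

def rls_fit(X, y, lam=1.0):
    """Closed-form regularized least squares: w = (X^T X + lam*I)^{-1} X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Synthetic stand-ins: rows are candidate parses of a sentence,
# columns are parse features, y holds F-score-based goodness values.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 10))          # 50 candidate parses, 10 features
y = X @ rng.normal(size=10) + 0.1 * rng.normal(size=50)

w = rls_fit(X, y, lam=1.0)
scores = X @ w                          # rank parses by predicted goodness
ranking = np.argsort(-scores)           # best-scoring parse first
print(ranking[:5])
```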
Locality-Convolution Kernel and Its Application to Dependency Parse Ranking
[chapter]
2006
Lecture Notes in Computer Science
We applied the introduced kernel together with Regularized Least-Squares (RLS) algorithm to a dataset containing dependency parses obtained from a manually annotated biomedical corpus of 1100 sentences ...
We propose a Locality-Convolution (LC) kernel in application to dependency parse ranking. ...
Acknowledgments We would like to thank CSC, the Finnish IT center for science, for providing us extensive computing resources. ...
doi:10.1007/11779568_66
fatcat:n7e7ijbsxfdj7iq3xp76456bpa
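For context, a minimal dual-form kernel RLS sketch; the Gaussian kernel below is a placeholder assumption and does not reproduce the Locality-Convolution kernel proposed in the paper.

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    """Placeholder Gaussian kernel; the paper's Locality-Convolution kernel
    over dependency parses is not reproduced here."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def kernel_rls_fit(K, y, lam=1.0):
    """Dual RLS solution: a = (K + lam*I)^{-1} y."""
    return np.linalg.solve(K + lam * np.eye(K.shape[0]), y)

rng = np.random.default_rng(1)
X_train = rng.normal(size=(100, 8))      # stand-in parse representations
y_train = rng.normal(size=100)           # stand-in goodness scores
X_test = rng.normal(size=(5, 8))

a = kernel_rls_fit(rbf_kernel(X_train, X_train), y_train, lam=1.0)
test_scores = rbf_kernel(X_test, X_train) @ a   # score unseen parses
print(test_scores)
```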
Online Co-regularized Algorithms
[chapter]
2012
Lecture Notes in Computer Science
We propose an online co-regularized learning algorithm for classification and regression tasks. ...
The presented algorithm is particularly applicable to learning tasks where large amounts of (unlabeled) data are available for training. ...
We obtain support vector machines [11] by choosing a hinge loss function and we obtain regularized least-squares (RLS) [12] by choosing a squared loss function. ...
doi:10.1007/978-3-642-33492-4_16
fatcat:flirzx2pejfpdgd65pcs5bsqme
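The last snippet spells out the key point: swapping the loss function turns the same regularized linear learner into an SVM-style or an RLS-style model. Below is a minimal single-view, online sketch of that swap; the paper's co-regularized multi-view setting is not reproduced, and the data are random stand-ins.

```python
import numpy as np

def hinge_grad(w, x, y):
    """Subgradient of the hinge loss max(0, 1 - y*w.x) -> SVM-style update."""
    return -y * x if y * (w @ x) < 1 else np.zeros_like(w)

def squared_grad(w, x, y):
    """Gradient of the squared loss (w.x - y)^2 / 2 -> RLS-style update."""
    return (w @ x - y) * x

def online_fit(X, y, loss_grad, lam=0.01, eta=0.1):
    w = np.zeros(X.shape[1])
    for x_i, y_i in zip(X, y):                 # one pass over the stream
        w -= eta * (loss_grad(w, x_i, y_i) + lam * w)
    return w

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 5))
y = np.sign(X @ np.array([1.0, -2.0, 0.5, 0.0, 1.5]))  # +/-1 labels

w_svm = online_fit(X, y, hinge_grad)    # hinge loss: SVM-like
w_rls = online_fit(X, y, squared_grad)  # squared loss: RLS-like
print(np.mean(np.sign(X @ w_svm) == y), np.mean(np.sign(X @ w_rls) == y))
```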
Selecting Feature Sets for Comparative and Time-Oriented Quality Estimation of Machine Translation Output
2013
Conference on Machine Translation
Sentence-level ranking of alternative MT outputs is done with pairwise classifiers using Logistic Regression with black-box features originating from PCFG parsing, language models, and various counts. ...
Hans Uszkoreit for the supervision, Dr. Aljoscha Burchardt, and Dr. David Vilar for their useful feedback and to Lukas Poustka for his technical help on feature acquisition. ...
Acknowledgments This work has been developed within the TaraXÜ project, financed by TSB Technologiestiftung Berlin - Zukunftsfonds Berlin, co-financed by the European Union - European fund for regional ...
dblp:conf/wmt/AvramidisP13
fatcat:pfkor7dy4feq7ic4jzy5lquxbu
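As a rough illustration of pairwise ranking with logistic regression, here is a sketch that classifies feature differences of output pairs; the features are random stand-ins, not the PCFG, language-model, or count features used in the paper.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_pairwise_lr(pair_diffs, labels, lam=0.1, eta=0.1, epochs=200):
    """Logistic regression on feature differences of output pairs:
    label 1 means the first output of the pair is preferred."""
    w = np.zeros(pair_diffs.shape[1])
    for _ in range(epochs):
        p = sigmoid(pair_diffs @ w)
        w -= eta * (pair_diffs.T @ (p - labels) / len(labels) + lam * w)
    return w

rng = np.random.default_rng(3)
feats_a = rng.normal(size=(300, 6))   # stand-in features of MT output A
feats_b = rng.normal(size=(300, 6))   # stand-in features of MT output B
true_w = rng.normal(size=6)
labels = ((feats_a - feats_b) @ true_w > 0).astype(float)

w = fit_pairwise_lr(feats_a - feats_b, labels)
# Rank two new outputs: the one with the higher w.x is preferred.
print("prefer A" if (feats_a[0] - feats_b[0]) @ w > 0 else "prefer B")
```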
An efficient algorithm for learning to rank from preference graphs
2009
Machine Learning
In this paper, we introduce a framework for regularized least-squares (RLS) type of ranking cost functions and we propose three such cost functions. ...
Further, we propose a kernel-based preference learning algorithm, which we call RankRLS, for minimizing these functions. ...
We would like to thank CSC, the Finnish IT center for science, for providing us extensive computing resources. ...
doi:10.1007/s10994-008-5097-z
fatcat:ubmy3tzivne63lyvezeypg2grq
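A hedged, linear-primal sketch of an RLS-type pairwise ranking cost of the kind the entry describes: squared error on score differences over preference pairs plus an L2 penalty. The paper itself works with kernels and general preference graphs; the data below are synthetic.

```python
import numpy as np

def rankrls_linear(X, y, pairs, lam=1.0):
    """Minimize sum over preference pairs (i, j) of
    ((y_i - y_j) - w.(x_i - x_j))^2 + lam * ||w||^2 in closed form."""
    D = X[pairs[:, 0]] - X[pairs[:, 1]]          # pairwise feature differences
    t = y[pairs[:, 0]] - y[pairs[:, 1]]          # pairwise score differences
    d = X.shape[1]
    return np.linalg.solve(D.T @ D + lam * np.eye(d), D.T @ t)

rng = np.random.default_rng(4)
X = rng.normal(size=(60, 7))                     # objects to be ranked
y = X @ rng.normal(size=7)                       # stand-in utility scores
# Preference graph edges: here, all ordered pairs with y_i > y_j.
pairs = np.array([(i, j) for i in range(60) for j in range(60) if y[i] > y[j]])

w = rankrls_linear(X, y, pairs, lam=1.0)
print(np.argsort(-(X @ w))[:5])                  # top-ranked objects
```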
A Comparative Study of Parameter Estimation Methods for Statistical Natural Language Processing
2007
Annual Meeting of the Association for Computational Linguistics
We first investigate all of our estimators on two re-ranking tasks: a parse selection task and a language model (LM) adaptation task. ...
Our experiments show that across tasks, three of the estimators (ME estimation with L1 or L2 regularization, and AP) are in a near statistical tie for first place. ...
Riezler and Vasserman (2004) showed that an L1-regularized ME estimator outperformed an L2-regularized estimator for ranking the parses of a stochastic unification-based grammar. ...
dblp:conf/acl/GaoAJT07
fatcat:wkd6dgqsovbohmjlxac4tyuyjy
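To illustrate the L1-versus-L2 regularization contrast discussed above, here is a small binary logistic-regression sketch (a stand-in for the ME estimators in the paper, on synthetic data) showing the sparsity effect of the L1 penalty.

```python
import numpy as np

def fit_logreg(X, y, lam=0.1, penalty="l2", eta=0.1, epochs=300):
    """Binary logistic regression with either L2 gradient shrinkage
    or an L1 proximal (soft-threshold) step."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))
        grad = X.T @ (p - y) / len(y)
        if penalty == "l2":
            w -= eta * (grad + lam * w)
        else:                                   # l1: proximal gradient step
            w -= eta * grad
            w = np.sign(w) * np.maximum(np.abs(w) - eta * lam, 0.0)
    return w

rng = np.random.default_rng(5)
X = rng.normal(size=(400, 20))
y = (X[:, :3] @ np.array([2.0, -1.5, 1.0]) > 0).astype(float)  # 3 relevant features

w_l2 = fit_logreg(X, y, penalty="l2")
w_l1 = fit_logreg(X, y, penalty="l1")
print("nonzero weights  L2:", np.sum(np.abs(w_l2) > 1e-3),
      " L1:", np.sum(np.abs(w_l1) > 1e-3))
```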
Geometry-Aware Face Completion and Editing
2019
Proceedings of the AAAI Conference on Artificial Intelligence
Besides, since a low-rank property exists in manually labeled masks, a low-rank regularization term is imposed on the disentangled masks, encouraging our completion network to handle occlusion areas with various ...
Firstly, a facial geometry estimator is learned to estimate facial landmark heatmaps and parsing maps from the unmasked face image. ...
Acknowledgement This work is funded by the National Natural Science Foundation of China (Grant No. 61622310), the Youth Innovation Promotion Association CAS (2015190), and Innovation Training Programs for Undergraduates ...
doi:10.1609/aaai.v33i01.33012506
fatcat:ffsliuwhprbypemyomldoifma4
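The low-rank regularization term mentioned in the entry is commonly realized as a nuclear-norm penalty (the sum of singular values) on the predicted mask; the toy NumPy illustration below assumes that formulation and is independent of the paper's network.

```python
import numpy as np

def nuclear_norm(mask):
    """Nuclear norm (sum of singular values), a convex surrogate for rank."""
    return np.linalg.svd(mask, compute_uv=False).sum()

rng = np.random.default_rng(6)
h, w, r = 64, 64, 3
low_rank_mask = rng.random((h, r)) @ rng.random((r, w))    # rank <= 3
noisy_mask = low_rank_mask + 0.5 * rng.random((h, w))      # full-rank clutter

# A penalty lam * nuclear_norm(mask) added to the completion loss pushes
# predicted occlusion masks toward low-rank (block-like) structure.
lam = 0.01
print(lam * nuclear_norm(low_rank_mask), lam * nuclear_norm(noisy_mask))
```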
Geometry-Aware Face Completion and Editing
[article]
2019
arXiv
pre-print
Besides, since a low-rank property exists in manually labeled masks, a low-rank regularization term is imposed on the disentangled masks, encouraging our completion network to handle occlusion areas with various ...
Firstly, a facial geometry estimator is learned to estimate facial landmark heatmaps and parsing maps from the unmasked face image. ...
Acknowledgement This work is funded by the National Natural Science Foundation of China (Grant No. 61622310), the Youth Innovation Promotion Association CAS (2015190), and Innovation Training Programs for Undergraduates ...
arXiv:1809.02967v2
fatcat:hckil2zspnftvlsemzauieecvy
LS-Tree: Model Interpretation When the Data Are Linguistic
[article]
2019
arXiv
pre-print
Leveraging a parse tree, we propose to assign least-squares based importance scores to each word of an instance by exploiting syntactic constituency structure. ...
We demonstrate that the proposed method can aid in interpretability and diagnostics for several widely-used language models. ...
least squares. ...
arXiv:1902.04187v1
fatcat:i2ftqk4mgvhdhlq2ghuokhb2wu
LS-Tree: Model Interpretation When the Data Are Linguistic
2020
Proceedings of the AAAI Conference on Artificial Intelligence
Leveraging a parse tree, we propose to assign least-squares-based importance scores to each word of an instance by exploiting syntactic constituency structure. ...
We demonstrate that the proposed method can aid in interpretability and diagnostics for several widely-used language models. ...
Least squares on parse trees. For simplicity, we restrict ourselves to classification. Assume a model maps a sentence to a vector of class probabilities. ...
doi:10.1609/aaai.v34i04.5749
fatcat:ej22yuedindehiim44yd7ohqhy
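A rough, toy sketch of the least-squares idea stated in the snippet: fit per-word importance scores so that their sums over parse-tree constituents approximate a model's outputs on those constituents. The sentence, constituents, and "model" below are invented placeholders, not the paper's setup.

```python
import numpy as np

# Toy sentence and hand-written constituents (word index sets) standing in
# for the nodes of a syntactic parse tree.
words = ["the", "movie", "was", "not", "good"]
constituents = [
    {0, 1}, {2, 3, 4}, {3, 4}, {0, 1, 2, 3, 4},
    {0}, {1}, {2}, {3}, {4},
]

def toy_model(index_set):
    """Placeholder 'sentiment' model evaluated on a subset of words."""
    score = 0.0
    if 4 in index_set:                 # "good" is positive ...
        score += 1.0
        if 3 in index_set:             # ... unless negated by "not"
            score -= 1.5
    return score

# Least squares: find per-word scores s with A s ~= v, where A[k, i] = 1
# iff word i belongs to constituent k and v[k] is the model output on it.
A = np.array([[1.0 if i in c else 0.0 for i in range(len(words))]
              for c in constituents])
v = np.array([toy_model(c) for c in constituents])
s, *_ = np.linalg.lstsq(A, v, rcond=None)
print(dict(zip(words, np.round(s, 2))))
```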
Email Surveillance Using Non-negative Matrix Factorization
2005
Computational and mathematical organization theory
For the publicly released Enron electronic mail collection, we encode sparse term-by-message matrices and use a low rank non-negative matrix factorization algorithm to preserve natural data non-negativity ...
In this study, we apply a non-negative matrix factorization approach for the extraction and detection of concepts or topics from electronic mail messages. ...
The numerical approach for solving the constrained least squares problem in Step 2(c) for the columns H_j of H makes use of an algorithm similar to one described in [20] for regularized least squares ...
doi:10.1007/s10588-005-5380-5
fatcat:cnqh7ijwujhzldfxmw3ytgxmda
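Below is a simplified sketch of alternating NMF updates in which each factor is obtained from a regularized least-squares solve, with non-negativity enforced by clipping; it stands in for, and does not reproduce, the constrained solver cited as [20]. The term-by-message matrix is a random stand-in.

```python
import numpy as np

def regularized_als_nmf(V, rank, lam=0.1, iters=200, seed=0):
    """Alternating updates for V ~= W H; each step solves a regularized
    least-squares problem for W or H, then clips to keep non-negativity."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, rank))
    H = rng.random((rank, n))
    I = lam * np.eye(rank)
    for _ in range(iters):
        H = np.maximum(np.linalg.solve(W.T @ W + I, W.T @ V), 0.0)
        W = np.maximum(np.linalg.solve(H @ H.T + I, H @ V.T), 0.0).T
    return W, H

# Stand-in sparse term-by-message matrix (rows: terms, columns: messages).
rng = np.random.default_rng(7)
V = (rng.random((200, 50)) < 0.05) * rng.random((200, 50))
W, H = regularized_als_nmf(V, rank=5)
# Columns of W act as extracted "topics"; top-weighted terms describe them.
print(np.argsort(-W[:, 0])[:10])
```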
Computing Optimal Descriptions for Optimality Theory Grammars with Context-Free Position Structures
[article]
1996
arXiv
pre-print
This paper describes an algorithm for computing optimal structural descriptions for Optimality Theory grammars with context-free position structures. ...
This algorithm extends Tesar's dynamic programming approach [Tesar 1994][Tesar 1995] to computing optimal structural descriptions from regular to context-free structures. ...
are cell categories without repeating at least one cell category, thereby creating a cycle. ...
arXiv:cmp-lg/9606020v1
fatcat:bnefxx52obcylj7zwavn6aj6ku
Computing optimal descriptions for Optimality Theory grammars with context-free position structures
1996
Proceedings of the 34th annual meeting on Association for Computational Linguistics -
This paper describes an algorithm for computing optimal structural descriptions for Optimality Theory grammars with context-free position structures. ...
This algorithm extends Tesar's dynamic programming approach (Tesar, 1994; Tesar, 1995) to computing optimal structural descriptions from regular to context-free structures. ...
are cell categories without repeating at least one cell category, thereby creating a cycle. ...
doi:10.3115/981863.981877
dblp:conf/acl/Tesar96
fatcat:n43hdktnzfbrvnelzrglkng7da
OT SIMPLE - a construction-kit approach to Optimality Theory implementation
[article]
1996
arXiv
pre-print
This approach gave rise to OT SIMPLE, the first freely available software tool for the OT framework to provide generic facilities for both GEN and CONstraint definition. ...
This motivates additional constraints Fill and Parse-Feature ranked above Son]pl. ...
If underparsed coronal segments are adjacent to a parsed coronal segment, this corresponds to a scenario where at least one surviving association line would still extend from the parsed segment on the ...
arXiv:cmp-lg/9611001v1
fatcat:ocubi43lkreddb26waqc5htagm
Fast LR parsing using rich (Tree Adjoining) Grammars
2002
Proceedings of the ACL-02 conference on Empirical methods in natural language processing - EMNLP '02
We evaluate the parser using the Penn Treebank, showing that the method yields very fast parsers with at least reasonable accuracy, confirming the intuition that LR parsing benefits from the use of rich ...
We describe an LR parser of parts-of-speech (and punctuation labels) for Tree Adjoining Grammars (TAGs) that solves table conflicts in a greedy way, with a limited amount of backtracking. ...
to achieve reasonable parsing accuracy. (2) LR parsing allows for very fast parsing with at least reasonable accuracy. ...
doi:10.3115/1118693.1118707
dblp:conf/emnlp/Prolo02
fatcat:nbr35666enaylkoyx624a6x2m4
Showing results 1 — 15 out of 10,014 results